CN116363082A - Collision detection method, device, equipment and program product for map elements - Google Patents

Collision detection method, device, equipment and program product for map elements

Info

Publication number
CN116363082A
Authority
CN
China
Prior art keywords: map, displayed, electronic map, electronic, depth data
Prior art date
Legal status
Pending
Application number
CN202310253343.8A
Other languages
Chinese (zh)
Inventor
刘飞
Current Assignee
Alibaba China Co Ltd
Original Assignee
Alibaba China Co Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba China Co Ltd
Priority to CN202310253343.8A
Publication of CN116363082A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/0014 Biomedical image inspection using an image reference approach
    • G06T 7/50 Depth or shape recovery
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image


Abstract

The application provides a collision detection method, apparatus, device, and program product for map elements. The method comprises: detecting a state change of the electronic map currently displayed on a screen; and, when the state change indicates that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map and the map elements in the electronic map to be displayed are stable, performing collision detection at least when the electronic map to be displayed is displayed on the screen. The collision detection includes: reading depth data of map elements of preset types from the rendering data of the electronic map to be displayed, where the preset types do not include points of interest; determining depth data of the points of interest in the electronic map to be displayed; and comparing the depth data of the preset-type map elements with the depth data of the points of interest to obtain the points of interest to be displayed in the electronic map. The method reduces the resource consumption of the user equipment and thereby avoids problems such as stuttering during map rendering or heating of the user equipment.

Description

Collision detection method, device, equipment and program product for map elements
Technical Field
The present disclosure relates to the field of map rendering technologies, and in particular, to a method, an apparatus, a device, and a program product for detecting a collision of map elements.
Background
With the development of location-based services, the electronic map has become an expression of the real world in the digital world. Through an electronic map displayed on the screen of a user device, a user can browse and query real-world places such as shops, hotels, bus stops, and houses. Such places are commonly called points of interest (Point of Interest, POI) in the electronic map. Points of interest are one kind of map element; an electronic map also contains map elements such as roads, green spaces, and water systems.
To avoid occlusion between POIs, or between POIs and other map elements (such as buildings, trees, and roads), the prior art performs collision detection on the map elements before the electronic map is displayed on the screen of the user device: map elements with occlusion problems are detected, and certain processing is applied to them, so as to resolve the poor user experience that occlusion causes.
As the display effect of electronic maps develops from two dimensions to three, map rendering consumes ever more user-device resources. If collision detection were performed every time the electronic map is rendered, the resource consumption would be very large, causing problems such as stuttering during map rendering or heating of the user device.
Disclosure of Invention
The main objective of the embodiments of the present application is to provide a collision detection method, apparatus, device, and program product for map elements that can reduce the resource consumption of the user equipment, thereby avoiding problems such as stuttering or heating of the user equipment during map rendering and improving the user's service experience.
In a first aspect, an embodiment of the present application provides a method for detecting a collision of map elements, including:
detecting a state change of an electronic map currently displayed on a screen; and, when the state change indicates that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map and the map elements in the electronic map to be displayed are stable, performing collision detection at least when the electronic map to be displayed is displayed on the screen, the collision detection including:
reading depth data of map elements of a preset type from rendering data of an electronic map to be displayed, wherein the preset type does not comprise interest points;
Determining depth data of interest points in an electronic map to be displayed;
and comparing the depth data of the map elements of the preset type with the depth data of the interest points to obtain the interest points to be displayed in the electronic map to be displayed.
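The three steps above can be sketched as follows. This is an illustrative Python sketch under assumptions, not the patent's implementation; the names (`depth_buffer`, `POI`, `visible_pois`) and the single-pixel test per POI are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class POI:
    name: str
    px: int        # screen pixel x of the POI's anchor point
    py: int        # screen pixel y of the POI's anchor point
    depth: float   # distance from the POI to the camera

def visible_pois(depth_buffer, pois):
    """Return the POIs not occluded by preset-type map elements.

    depth_buffer: dict mapping (px, py) to the depth of the nearest
    preset-type element (building, tree, ...) at that pixel.
    A POI is shown only when the element at its pixel is strictly
    farther from the camera than the POI itself.
    """
    shown = []
    for poi in pois:
        element_depth = depth_buffer.get((poi.px, poi.py), float("inf"))
        if element_depth > poi.depth:   # the element is behind the POI
            shown.append(poi)
    return shown
```

A POI with no preset-type element at its pixel is trivially visible, which the `float("inf")` default encodes.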
Optionally, when detecting that the map element in the electronic map to be displayed is different from the map element in the electronic map currently displayed, the method further includes:
detecting the change duration; performing collision detection once whenever the change duration reaches a preset detection duration while the map elements in the electronic map to be displayed are still unstable; returning to the step of detecting the change duration if the map elements in the electronic map to be displayed remain unstable; and, once the map elements in the electronic map to be displayed are stable, performing collision detection when the electronic map to be displayed is displayed on the screen.
Optionally, the detecting the state change of the electronic map currently displayed on the screen specifically includes:
detecting a state change of the scale of the electronic map currently displayed on the screen, or a state change of the camera.
Optionally, a scale state change from a target scale to a scale different from the target scale indicates that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map, where the target scale is the scale of the currently displayed electronic map.
Optionally, the camera state change includes a direction state change and a position state change;
if the difference between the orientation angle of the camera of the electronic map to be displayed and the orientation angle of the camera of the currently displayed electronic map is greater than a preset angle threshold, the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map;
or, if the distance from the position of the camera of the electronic map to be displayed to the position of the camera of the currently displayed electronic map is greater than a preset distance threshold, the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map.
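A minimal sketch of the two camera-state tests above, assuming a heading angle in degrees and a flat 2-D camera position; the threshold values and function name are invented for illustration.

```python
import math

ANGLE_THRESHOLD_DEG = 5.0    # assumed preset angle threshold
DISTANCE_THRESHOLD = 50.0    # assumed preset distance threshold

def map_elements_changed(cur_angle, new_angle, cur_pos, new_pos):
    """True when the camera turned or moved enough that the map
    elements to be displayed differ from those currently displayed."""
    angle_diff = abs(new_angle - cur_angle) % 360.0
    angle_diff = min(angle_diff, 360.0 - angle_diff)  # shortest rotation
    dist = math.hypot(new_pos[0] - cur_pos[0], new_pos[1] - cur_pos[1])
    return angle_diff > ANGLE_THRESHOLD_DEG or dist > DISTANCE_THRESHOLD
```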
Optionally, the method further comprises:
generating, based on the map data of the electronic map to be displayed, rendering data of the map elements included in the map data, the rendering data including the depth data corresponding to each pixel.
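As an illustration of per-pixel depth in the rendering data, the toy sketch below rasterizes preset-type elements as axis-aligned screen rectangles and keeps the smallest (nearest) depth per pixel; the rectangle footprint is a simplification invented for the example.

```python
def build_depth_buffer(elements):
    """elements: list of (x0, y0, x1, y1, depth) screen rectangles.

    Returns {(x, y): nearest depth}, keeping the smallest depth per
    pixel, which mimics a depth-tested rasterization pass.
    """
    buffer = {}
    for x0, y0, x1, y1, depth in elements:
        for x in range(x0, x1):
            for y in range(y0, y1):
                if depth < buffer.get((x, y), float("inf")):
                    buffer[(x, y)] = depth
    return buffer
```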
Optionally, the graphics processor GPU generates rendering data of map elements included in map data based on the map data of the electronic map to be displayed;
the reading of the depth data of the map elements of the preset type from the rendering data of the electronic map to be displayed specifically includes:
the central processing unit CPU reads the depth data of map elements of the preset type from the rendering data of the electronic map to be displayed that is generated by the GPU.
In a second aspect, an embodiment of the present application provides a collision detection apparatus for a map element, including:
the first detection module is used for detecting the state change of the electronic map currently displayed on the screen;
the second detection module is used for triggering the collision detection module at least when the state change indicates that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map and the map elements in the electronic map to be displayed are stable;
the collision detection module includes:
a depth data reading unit, configured to read depth data of map elements of a preset type from rendering data of an electronic map to be displayed, where the preset type does not include interest points;
a depth data calculation unit for determining depth data of interest points in the electronic map to be displayed;
and the depth data comparison unit is used for comparing the depth data of the map elements of the preset type with the depth data of the interest points to obtain the interest points to be displayed in the electronic map to be displayed.
Optionally, the apparatus further comprises: a third detection module;
the third detection module is configured to detect the change duration when it detects that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map; to execute the collision detection once whenever the change duration reaches a preset detection duration while the map elements in the electronic map to be displayed are still unstable; to return to the step of detecting the change duration if the map elements remain unstable; and, once the map elements in the electronic map to be displayed are stable, to trigger the collision detection module when the electronic map to be displayed is displayed on the screen.
In a third aspect, an embodiment of the present application provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to cause the electronic device to perform the method of any of the above aspects.
In a fourth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when executed by a processor, implements the method of any of the above aspects.
According to the collision detection method, apparatus, device, and program product for map elements provided above, a state change of the electronic map currently displayed on the screen can be detected, and when the state change indicates that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map and the map elements in the electronic map to be displayed are stable, collision detection is performed at least when the electronic map to be displayed is displayed on the screen. The collision detection includes: reading depth data of map elements of preset types, which do not include points of interest, from the rendering data of the electronic map to be displayed; determining depth data of the points of interest in the electronic map to be displayed; and comparing the two sets of depth data to obtain the points of interest to be displayed in the electronic map.
By determining whether collision detection is needed from the detected state change of the currently displayed electronic map, rather than performing collision detection during the rendering of every map frame, the embodiments achieve POI updating with reduced performance consumption. When collision detection is to be executed, the depth data of the preset-type map elements (which do not include POIs) is first read from the rendering data of the electronic map to be displayed, the depth data of the POIs in that map is determined, and the two sets of depth data are then compared to decide whether each POI is rendered on the map. This makes collision detection more intuitive and effective, guarantees the accuracy of the map rendering result, and improves the user's service experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic view of an application scenario according to an embodiment of the present application;
fig. 2 is a flow chart of a method for detecting collision of map elements according to an embodiment of the present application;
fig. 3 is a schematic view of another application scenario provided in an embodiment of the present application;
fig. 4 is a flowchart of another method for detecting a collision of map elements according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a collision detection device of a map element according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Specific embodiments thereof have been shown by way of example in the drawings and will herein be described in more detail. These drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but to illustrate the concepts of the present application to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application.
The terms referred to in this application are explained first:
Depth data: represents the distance from a map element to the camera in the electronic map. Because a map element covers a certain number of pixels when displayed on the screen, each pixel covered by the map element has one depth datum (depth value). When the hardware performance of the terminal device performing the map rendering is ordinary, the depth value is converted into an RGB value.
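A minimal sketch of such a depth-to-RGB conversion, assuming the depth is normalized to [0, 1) and packed big-endian into 24 bits; the patent does not specify the encoding, so this packing is an assumption.

```python
def depth_to_rgb(depth):
    """Pack a normalized depth in [0, 1) into three 8-bit channels."""
    value = int(depth * (1 << 24))          # 24-bit fixed-point depth
    r = (value >> 16) & 0xFF
    g = (value >> 8) & 0xFF
    b = value & 0xFF
    return r, g, b

def rgb_to_depth(r, g, b):
    """Recover the depth value the CPU reads back from an RGB pixel."""
    return ((r << 16) | (g << 8) | b) / (1 << 24)
```

The round trip loses at most one part in 2^24, which is why rendering code on modest hardware can use the color buffer as a depth store.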
In application software that uses an electronic map, if the logic that displays POIs does not run collision detection against preset-type map elements (such as buildings and trees), then in the electronic map, and especially in a three-dimensional electronic map, a POI located behind a building is drawn on top of that building. In the real world a user cannot see through a building to the POI behind it, so a POI drawn on top of a building in a three-dimensional map confuses the user: it is unclear whether the POI is inside or behind the building. Therefore, to guarantee the user experience, collision detection must be performed on the map elements during the rendering of the electronic map; that is, map elements with occlusion problems are detected and processed, resolving the poor user experience that occlusion causes.
However, as the display effect of electronic maps develops from two dimensions to three, map rendering consumes ever more user-device resources, and performing collision detection on every rendered frame would make that consumption very large, causing problems such as stuttering during map rendering or heating of the user device. The prior art therefore suffers from very high resource consumption and, in consequence, stuttering or device heating and a poor user experience.
In view of this, to improve the user experience, reduce user-equipment resource consumption, and thereby avoid problems such as stuttering or heating during map rendering, the embodiments of the present application determine whether collision detection is required by detecting a state change of the electronic map currently displayed on the screen, instead of performing collision detection during the rendering of every frame, which reduces performance consumption. When collision detection is to be executed, the depth data of preset-type map elements is read from the rendering data of the electronic map to be displayed and compared with the depth data of the POIs in that map to determine which POIs can be rendered on the electronic map. This makes the power consumption of collision detection lower, guarantees the accuracy of the map rendering result, and improves the user experience.
The following describes in detail the process of collision detection between preset-type map elements (for example buildings and trees, hereinafter referred to as objects, with buildings taken as the example) and POIs (places in the electronic map such as shops, hotels, bus stops, and houses).
Fig. 1 is a schematic diagram of an application scenario according to an embodiment of the present application. As shown in fig. 1, the collision detection method for map elements provided in this embodiment may be applied to a scenario that includes a user device on which a client 101 (an application program/software) with electronic-map rendering capability is installed. The user views the electronic map through the client 101 to obtain needed information (such as food near the user's location, or road details around a certain place; this is not specifically limited here). When the electronic map is rendered on the screen of the user device, the logic processing unit 102 (such as a CPU) generally performs the logic processing on the rendering data of the electronic map, and the display chip 103 (such as a GPU) renders the electronic map onto the screen for the user to view. Since an electronic map is a representation of the real world in the digital world, the user's experience of the electronic map needs to be consistent with the real world; for example, people cannot see geographic elements that are occluded by buildings.
To guarantee the experience of using the electronic map, when the user performs an interactive operation, for example a map zoom, on the electronic map currently displayed on the user-device screen, the CPU detects the operation and judges whether the electronic map to be displayed has changed compared with the currently displayed one. If it has changed and then no longer changes, that is, the map elements in the electronic map to be displayed are stable (the zoom has finished and the user no longer operates on the map), collision detection is performed at least when the electronic map to be displayed is displayed on the screen. The CPU thus decides whether to perform collision detection according to the state change of the electronic map displayed on the screen and does not need to perform collision detection for every rendered map frame, reducing the resource consumption of the user equipment. The related embodiments of the present application are described in detail below with reference to the drawings.
Taking a three-dimensional rendering scene of the electronic map as an example: when the user selects the three-dimensional rendering scene through the client, the user sees a three-dimensional electronic map on the screen and can drag, rotate, and zoom it, so the geographic elements rendered and displayed in the map change with the user's operations. To reduce the resource consumption of the user equipment, the present application determines whether to execute collision detection, that is, determines the timing of collision detection, by detecting the state change of the electronic map currently displayed on the screen, instead of executing collision detection on every rendered frame.
Specifically, the CPU may detect a state change of the electronic map currently displayed on the screen and judge whether the map elements in the electronic map to be displayed are the same as those in the currently displayed map. If the detected state change indicates that they differ, whether to perform collision detection is confirmed according to the following two conditions:
Condition 1: if the map elements in the electronic map to be displayed are detected to be stable, that is, the electronic map to be displayed no longer changes, the condition for executing collision detection is met and collision detection is executed.
Condition 2: whether to trigger collision detection is determined by detecting the change duration. Collision detection is performed once whenever the change duration reaches a preset detection duration while the map elements in the electronic map to be displayed are still unstable (for example, the time from the current change state to the next reaches a preset time-interval threshold). If the map elements remain unstable, the step of detecting the change duration is repeated; once the map elements are stable, the change has ended, and collision detection is executed when the electronic map to be displayed is displayed on the screen.
As mentioned above, the map elements change because the user performs an operation on the electronic map currently displayed on the screen, and such an operation usually does not last long. To ensure that the user never sees an occlusion conflict in the map on the screen, condition 1 has a higher priority than condition 2: as soon as the user stops operating on the screen, collision detection must be performed on the electronic map about to be rendered. If the user's operation lasts longer, for example beyond the preset time-interval threshold, the collision detection step is executed once per threshold interval during the operation, so that the intermediate maps rendered along the way also show no conflicts.
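The interplay of the two conditions can be sketched as a single decision function polled each frame; the one-second detection interval and the function name are assumptions for illustration.

```python
DETECTION_INTERVAL = 1.0   # assumed preset detection duration (seconds)

def should_run_collision_detection(elements_changed, elements_stable,
                                   change_duration):
    """Decide whether to run collision detection for the next frame.

    Condition 1: the elements changed and are now stable: always detect.
    Condition 2: the elements are still changing, but the change has
    lasted one full detection interval: detect once mid-change.
    """
    if not elements_changed:
        return False
    if elements_stable:                            # condition 1 (priority)
        return True
    return change_duration >= DETECTION_INTERVAL   # condition 2
```

The caller would reset `change_duration` after each mid-change detection, matching the "return to the step of detecting the change duration" loop in the text.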
Performing collision detection may include: based on a preset configuration table, reading the depth data of the preset-type map elements (which do not include POIs) from the rendering data of the electronic map to be displayed; acquiring the depth data of the POIs in that map; and comparing the two sets of depth data to determine which POIs need to be displayed in the electronic map and which do not.
Fig. 2 is a schematic flow chart of a collision detection method for map elements according to an embodiment of the present application. Fig. 2 shows a three-dimensional rendering scene of the electronic map, covering the updating of map elements and POIs, the triggering of collision detection, the execution of collision detection, and the rendering and display of the three-dimensional map. Specifically, when a user interacts with a client (e.g., an electronic-map client), collision detection of map elements during map rendering is realized through the synchronization and interaction of multiple threads (at least a rendering thread, which runs on the GPU, and a logic thread, which runs on the CPU). On the logic thread, according to the detected current scene (here, the three-dimensional rendering scene), the buildings that need collision handling in the current scene (buildings to be rendered that have an occlusion collision with the POI nearest to the camera) are first determined based on the types of map elements recorded in a preset configuration table; these buildings are marked and put into the rendering pipeline so that their rendering information is updated.
On the rendering thread, the buildings are rendered according to the updated rendering information (rendering data), and their depth is converted into RGB colors. Whether the depth data needs to be handed over is decided by whether a depth-data read request sent by the CPU has been received: if no read request has been received, nothing is sent; if a read request has been received, the RGB color data in the GPU is written to memory. That is, the CPU sends a depth-data read request to the GPU, and after the GPU receives it and writes the data to memory, the CPU can read the depth data of the preset-type map elements, for example the depth data of the buildings, from the rendering data; during the POI update, the CPU reads this RGB color data and thereby obtains the depth data of the preset-type map elements.
In addition, when updating a POI, the position of the pixel where the POI lies is obtained by calling a predefined interface; based on this position, the RGB color data of that pixel is found among the color data that was read back and is converted into the corresponding depth data. The depth data of the POI itself in the electronic map to be displayed is computed from the POI's coordinates and the camera parameters. The two depth values are then compared (that is, it is determined whether the POI or the building is closer to the camera) to judge whether the POI is occluded. If the depth of the building is greater than the depth of the POI, the POI is in front of the building, is not occluded, and proceeds to subsequent collision processing. If the depth of the building is less than the depth of the POI, the POI is behind the building; if the two depths are equal, the POI is inside the building. In both of the latter cases the POI is considered occluded and is not displayed on the map.
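The depth computation and the three-way occlusion rule described above can be sketched as follows; modelling the POI's depth as its Euclidean distance to the camera is a simplification (a real renderer would derive it from the view-projection transform), and the function names are invented for the example.

```python
import math

def poi_depth(poi_xyz, camera_xyz):
    """Depth of a POI, taken here as its distance to the camera."""
    return math.dist(poi_xyz, camera_xyz)

def poi_is_visible(building_depth, poi_depth_value):
    """POI is shown only when the building at its pixel is strictly
    farther from the camera; equal or smaller building depth means the
    POI is inside or behind the building and is occluded."""
    return building_depth > poi_depth_value
```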
For example, refer to fig. 3, a schematic view of another application scenario provided in this embodiment. A user obtains map information for a region through trigger operations (such as input, click, slide, drag, rotate, and zoom operations) on a map operation interface (such as the home interface 31 of the client). The user performs a first trigger operation on the client. When the user opens the home interface 31, it displays the electronic map (map) corresponding to the user's current position, together with a first function-key area 311 for interface switching that contains keys such as "home", "nearby", and "my"; clicking any key switches to the corresponding interface. For example, the user enters a destination name (such as destination 1) in the predetermined input box 312 and clicks the query button to query the position and its surroundings (see (a) in fig. 3). The client performs map rendering based on this first trigger operation and displays the corresponding electronic map 1 (map 1) on the screen (see (b) in fig. 3); the map also shows description information of the destination and a second function-key area 321 containing keys such as "surroundings", "navigation", and "route" that provide the corresponding services. If the user wants to view a certain area or position in the map, a second trigger operation, such as a slide or zoom, is performed on the electronic map (see (c) in fig. 3); the map server then sends the corresponding map data to the client, and the client renders and displays, in the map display area 32 on the device screen, the electronic map corresponding to the second operation (including at least the building nearest to the scene's camera): electronic map 2 (map 2) (see (d) in fig. 3).
While the client runs, it uses a multithreading technique to process business logic and rendering based on the triggering operations: the rendering thread runs on a display chip (such as a GPU) to perform map rendering, and the logic thread runs on a logic processing unit (such as a CPU) to handle business processes such as whether to execute collision detection, whether to update POIs, whether to notify the rendering thread to send a depth data read instruction, whether to read depth data, and whether to render map elements and POIs.
Specifically, when the second triggering operation is a drag, rotation, or zoom of the electronic map, the electronic map may be a three-dimensionally rendered scene; that is, the user sees an electronic map of a three-dimensional scene on the screen, and still sees a three-dimensional scene after operating on it, only with different content. While dragging or rotating, the electronic map is in a changing state (the camera in the electronic map is moving); while zooming, it is likewise in a changing state (the scale of the electronic map is changing). Therefore, to reduce the performance consumption of the CPU and the GPU, it is not necessary, for every frame rendered in the rendering thread, to execute the collision detection, POI update, and depth data logic in the logic thread to decide whether to render the map elements and POIs. Instead, the decision can be based on the state change of the electronic map currently displayed on the screen: for example, when the state change indicates that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map and the map elements in the electronic map to be displayed are stable, collision detection is determined to be executed, so that the POIs to be displayed in the electronic map to be displayed are obtained and the rendered map elements and POIs are displayed in the electronic map.
If it is determined that collision detection is to be performed, an update of the POIs may be triggered immediately: the logic thread sends the rendering thread a notification indicating that the updated depth data of the building may be sent; the rendering thread sends a depth data read instruction back to the logic thread; the logic thread reads the associated data of the depth data from the GPU (here, the RGB data converted from the depth data), that is, the GPU transfers the associated data into the CPU's memory; and the CPU then performs the POI occlusion judgment with the acquired information of the updated POIs (for example, the three-dimensional coordinates of each updated POI and the position of the pixel where it is located) to determine whether to render and display each POI.
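The cross-thread handoff (the rendering thread handing RGB-encoded depth data over to the logic thread's memory) might be sketched with two queues; the message strings and the texture layout here are hypothetical stand-ins for the GPU-to-CPU transfer:

```python
import queue
import threading

def render_thread(requests, responses, depth_texture):
    """Rendering-thread side: waits for a read instruction and copies
    the RGB-encoded depth texture out to the logic thread (standing in
    for the GPU-to-CPU-memory transfer)."""
    while True:
        msg = requests.get()
        if msg == "stop":
            break
        if msg == "read_depth":
            responses.put(dict(depth_texture))  # hand over a copy

def read_depth_from_renderer(requests, responses):
    """Logic-thread side: request the depth data and block until it
    arrives in CPU memory."""
    requests.put("read_depth")
    return responses.get()
```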
When the user stops the drag, rotation, or zoom operation and the map elements in the electronic map become stable for the first time, the POIs are forcibly updated; during the POI update, the logic thread reads the depth data of the rendered building, performs the POI occlusion judgment based on the updated POI information, and decides whether to render and display the POIs. When the map elements in the electronic map are stable but not for the first time, whether to perform collision detection is decided by detecting whether the change duration, the change of the scale state of the scene, or the change of the camera state meets a preset condition, which in turn determines whether the POIs are rendered and displayed.
When the map elements of the electronic map are stable, in order to keep the scene and the depth data consistent and so keep the collision effect stable, the POIs can be updated when the collision detection operation is determined to be triggered. After the rendering thread has rendered the building, while the logic thread executes the POI update, it can read the building's depth data from the rendering thread in time and judge whether each updated POI is occluded by the building: if not occluded, collision processing is performed and the POI is rendered in the map, that is, displayed; if occluded, the POI need not be rendered in the map, that is, it is not displayed.
In this way, by determining the map elements that need to collide with the POIs, rendering those map elements on the map, and deciding whether to update the POIs according to the current scene, rather than updating the POIs during the rendering of every frame, performance consumption is reduced. If it is determined to update the POIs, then once the map elements of the currently displayed electronic map are stable, the moment at which the POI update is triggered and the moment at which the depth data of the map elements is read are controlled so that the depth data is processed together with the current scene to decide whether each POI is rendered on the electronic map. This keeps the depth data consistent with the current scene when the POIs are updated, stabilizes the collision effect, guarantees the accuracy of the map rendering result, and improves the user's service experience.
It should be noted that the foregoing execution steps (determining whether collision detection is required by detecting the state change of the electronic map currently displayed on the screen and, if collision detection is performed, comparing the depth data of the preset-type map elements, which do not include POIs, read from the rendering data of the electronic map to be displayed with the depth data of the POIs in that map to obtain the POIs to be displayed) are merely illustrative, and the processing may also be split across more platforms and more modules. The information acquired about the user (such as positioning information), target objects, rendering information, positions, coordinates, and the like is all authorized by the user.
The technical scheme of the present application is described in detail below with specific examples. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 4 is a schematic flow chart of another method for detecting a collision of map elements according to an embodiment of the present application. The present embodiment can be applied to any device capable of detecting collisions of map elements, for example a user device (or a terminal). The collision detection method for map elements is described in detail below, taking a building as an example. As shown in fig. 4, the collision detection method of the map element may include:
S401, detecting a state change of the electronic map currently displayed on the screen, and, when the state change indicates that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map and the map elements in the electronic map to be displayed are stable, executing collision detection at least when the electronic map to be displayed is displayed on the screen.
Here, "stable" means that the electronic map to be displayed no longer changes: dragging, zooming, and rotating are each a process, and the electronic map renders tens of frames per second, so when the map to be displayed finally differs little from the maps displayed just before it, its elements are considered stable. The stability judgment is made from the camera parameters, namely when both the angle change and the distance change are small. Also because the electronic map renders tens of frames per second, the "currently displayed" electronic map in the present application refers to the electronic map displayed on the screen before the collision detection is executed, not to any particular frame.
Specifically, this process is performed by the CPU: whether collision detection is executed is determined by detecting the state change of the electronic map currently displayed on the screen, so collision detection is not needed during the rendering of every frame, which reduces the resource consumption of the user equipment. If the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map and the map elements in the electronic map to be displayed are stable, the condition for performing collision detection is satisfied, and collision detection is therefore executed at least when the electronic map to be displayed is displayed on the screen.
S402, reading depth data of map elements of a preset type from rendering data of an electronic map to be displayed, wherein the preset type does not comprise interest points.
Specifically, the embodiment of the present application applies to a 3D scene. A configuration table prepared in advance records the types of map elements that need to undergo collision detection, for example buildings and trees, and the preset-type map elements (here, the map elements whose types are listed in the configuration table) and the POIs can be determined from among the map elements of the electronic map.
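A minimal sketch of such a configuration table and the resulting split of scene elements; the dictionary-based element representation is an assumption made for illustration:

```python
# Hypothetical configuration table listing the map-element types that
# participate in collision detection (POIs are deliberately excluded).
COLLIDABLE_TYPES = {"building", "tree"}

def select_collidable_elements(elements):
    """Split a scene's map elements into preset-type (collidable)
    elements and POIs."""
    collidable = [e for e in elements if e["type"] in COLLIDABLE_TYPES]
    pois = [e for e in elements if e["type"] == "poi"]
    return collidable, pois
```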
There may be one or more POIs, and each POI needs an occlusion judgment against the preset-type map elements. Here the collision takes a building as the example of a preset-type map element, for example a POI inside the building or a POI outside the building. Because frequently reading depth data to update the scene affects GPU performance, the scene need not be updated by reading the depth data every frame; instead, whether to execute collision detection, and thus update the scene, can be determined by analyzing the scene of the electronic map to be displayed based on the state change of the electronic map currently displayed on the screen.
Optionally, first, the map element to be collision-detected is determined:
marking an object (namely, a map element) that needs to collide according to the user's triggering operation on the map operation interface of the user equipment, and taking the marked object as the map element to be collision-detected, the map element to be collision-detected being the object to be rendered that is nearest to the camera in the map;
and updating the rendering information of the map element to be collision-detected according to the mark.
Specifically, as shown in figs. 1 to 3, a user performs a triggering operation, such as a drag, rotation, or zoom of the current electronic map, on the map operation interface provided by a client through user equipment on which the client is installed. Based on this operation, the logic thread can determine, from the parameters of the camera in the current scene, the object nearest to the camera that needs to collide, such as a building (the building to be rendered), mark that building, and submit it to the rendering pipeline. Because the map has many layers but many of them do not need to generate depth information (or depth data), unmarked objects need neither depth data updates nor rendering, which reduces the performance consumption of the user equipment. Building data also comes in several types, and all types of buildings need to participate in the collision, so only the marked objects need updating, and the rendering information of the building is updated accordingly. After the rendering information of the building is updated, it provides the rendering thread with the data describing how to render the building.
Optionally, rendering the map element to be collision-detected in the map includes:
acquiring the updated rendering information corresponding to the map element to be collision-detected, converting the depth data in the updated rendering information into RGB data, and storing the RGB data in the display chip.
In the embodiment of the present application, first, in the logic thread, the building nearest to the camera that needs to collide is determined as the target object based on the current electronic map (or current scene), and the building is then placed into a rendering pipeline (also called a pixel pipeline: mutually independent parallel processing units in the display chip for processing graphics signals, used to improve the working capacity and efficiency of the graphics card) and the building's rendering information (or rendering data) is updated. The building is rendered in the rendering thread, and the depth data in the updated rendering information (here, the building's depth data updated during rendering) is converted into RGB color data. The depth data (here, a depth value) is a single-precision floating-point number while RGB consists of three integers in 0-255, and after conversion the data is stored in a texture.
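One common way to store a single-precision depth value in three 8-bit channels is base-256 fixed-point packing; the sketch below assumes depths normalized to [0, 1) and is not necessarily the exact conversion used by the application:

```python
def depth_to_rgb(depth: float):
    """Encode a normalized depth in [0, 1) into three 8-bit channels
    (base-256 fixed point) so it can be stored in an RGB texture."""
    v = int(depth * 256 ** 3)
    return (v >> 16) & 0xFF, (v >> 8) & 0xFF, v & 0xFF

def rgb_to_depth(r: int, g: int, b: int) -> float:
    """Recover the depth value from its RGB encoding (the step the
    logic thread performs after reading the texture back)."""
    return ((r << 16) | (g << 8) | b) / 256 ** 3
```

The round trip loses at most 1/256³ (about 6e-8) of precision, which is sufficient for an occlusion comparison.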
Since the rendering operation is executed in the rendering thread (here, processed by the GPU) while whether to display the POIs is determined in the logic thread (here, processed by the CPU), the depth data of the rendered building must be placed in memory for the CPU to process, and reasonable management of the depth data is achieved through cross-thread data synchronization.
Optionally, the graphics processor GPU generates rendering data of map elements included in map data based on the map data of the electronic map to be displayed;
the reading of the depth data of the map elements of the preset type from the rendering data of the electronic map to be displayed specifically includes:
and the CPU reads the depth data of map elements of a preset type from the rendering data of the electronic map to be displayed, which is generated by the GPU.
The rendering data is generated by a GPU, and the CPU reads the depth data from the GPU.
Optionally, based on map data of an electronic map to be displayed, rendering data of map elements included in the map data is generated, the rendering data including: depth data corresponding to a pixel point.
Specifically, this process is performed by the GPU: the depth data is generated from the preset-type map elements during GPU rendering. The depth data here is the depth data updated while the rendering information of the map elements is updated. If it is determined to update the POIs, the timing of the POI update and of reading the map elements' depth data is controlled so that the depth data stays consistent with the scene and the POIs and map elements can collide accurately. The collision detection method for map elements thus guarantees the accuracy of the collision effect while preserving performance, which in turn guarantees the accuracy of the map rendering result.
Optionally, the depth data is pixel-by-pixel.
S403, determining depth data of interest points in the electronic map to be displayed.
In the embodiment of the present application, after it is determined that collision detection is executed, the POIs to be displayed in the electronic map to be displayed are further determined. Specifically, from the map data newly downloaded from the server (here, the coordinates of the POIs in the world coordinate system), the depth data of each POI, that is, its depth data in the electronic map to be displayed, is calculated using the parameters of the camera of the currently displayed electronic map.
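As a simplified illustration, a POI's depth can be computed from its world coordinates and the camera parameters as the distance along the camera's viewing direction, normalized between the near and far planes; a real pipeline would use the full (non-linear) projection matrix, so treat this linear version as an assumption:

```python
def poi_depth(poi_pos, cam_pos, cam_forward, near, far):
    """Depth of a POI from its world coordinates and camera parameters:
    the distance along the camera's (unit) forward axis, mapped to
    [0, 1] between the near and far planes."""
    d = sum((p - c) * f for p, c, f in zip(poi_pos, cam_pos, cam_forward))
    return (d - near) / (far - near)
```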
S404, comparing the depth data of the map elements of the preset type with the depth data of the interest points to obtain the interest points to be displayed in the electronic map to be displayed.
In the embodiment of the present application, based on the position and depth data of the pixel where each POI is located, combined with the updated depth data of the preset-type map elements, it is judged whether the POI is occluded by the preset-type map elements: if occluded, the POI is not displayed; if not occluded, collision processing is performed, and finally the object and the POI are rendered and displayed. Thus, by reading the depth data of the preset-type map elements, the depth comparison between a POI and those map elements, that is, a comparison of their positions relative to the camera, is used to judge whether the POI is occluded, which makes the collision processing more intuitive and accurate and the rendering result accurate.
In summary, the collision detection method for map elements provided in the embodiment of the present application detects the state change of the electronic map currently displayed on the screen and, when the state change indicates that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map and the map elements in the electronic map to be displayed are stable, executes collision detection at least when the electronic map to be displayed is displayed on the screen. The collision detection includes: reading the depth data of preset-type map elements, where the preset type does not include POIs, from the rendering data of the electronic map to be displayed; determining the depth data of the POIs in the electronic map to be displayed; and comparing the two to obtain the POIs to be displayed in the electronic map to be displayed.
In this way, whether collision detection is needed is determined by detecting the state change of the electronic map currently displayed on the screen, instead of performing collision detection during the rendering of every frame, which realizes the POI update while reducing performance consumption. If it is determined that collision detection is executed, the depth data of the preset-type map elements, which do not include POIs, is first read from the rendering data of the electronic map to be displayed; the depth data of the POIs in the electronic map to be displayed is determined at the same time; and the two are then compared to decide whether each POI is rendered on the map. This makes collision detection more intuitive and effective, guarantees the accuracy of the map rendering result, and improves the user's service experience.
Optionally, when detecting that the map element in the electronic map to be displayed is different from the map element in the electronic map currently displayed, the method further includes:
detecting the change duration; when the change duration reaches a preset detection duration but the map elements in the electronic map to be displayed are not yet stable, executing collision detection once and returning to the step of detecting the change duration; and, once the map elements in the electronic map to be displayed are stable, executing collision detection when the electronic map to be displayed is displayed on the screen.
The change duration is counted from the moment it is detected that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map; when it reaches, for example, 200 ms and the map elements are still not stable, collision detection is executed and the 200 ms count restarts.
Specifically, if it is detected that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map and the detected interval of difference (that is, the change duration) reaches the preset detection duration, for example is greater than or equal to a preset interval threshold, while the map elements in the electronic map to be displayed are not yet stable, collision detection is executed once; because the map elements are not yet stable, the procedure returns to the step of detecting the change duration and keeps executing collision detection until the map elements in the electronic map to be displayed are stable, that is, collision detection is executed when the electronic map to be displayed is displayed on the screen.
For example, when it is detected that the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map, if the map elements in the electronic map to be displayed become stable within 200 ms, collision detection is executed; otherwise collision detection is executed once every 200 ms until the map elements in the electronic map to be displayed are stable, and a final collision detection is executed to ensure that the electronic map stably displayed on the screen has no occlusion problems.
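The 200 ms cadence above can be modelled by computing the moments at which detection fires; the function and its simulation-style interface are illustrative only:

```python
def schedule_detections(stable_after_ms, interval_ms=200):
    """Return the times (ms) at which collision detection runs: once per
    full interval while the map is still changing, plus a final run at
    the moment the map elements stabilize."""
    times = []
    t = 0
    while t + interval_ms <= stable_after_ms:
        t += interval_ms
        times.append(t)  # interval elapsed, elements still unstable
    times.append(stable_after_ms)  # final detection once stable
    return times
```

If stabilization coincides exactly with an interval boundary, the final entry duplicates the periodic one; a production version would deduplicate that case.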
Optionally, detecting a state change of the electronic map currently displayed on the screen specifically includes:
and detecting the state change of the scale of the electronic map currently displayed on the screen or the state change of the camera.
In the embodiment of the present application, a zoom operation changes the scale state of the electronic map, and a drag or rotation operation changes the state of the camera in the electronic map. Therefore, whether collision detection is executed can be determined based on the state change of the scale of the electronic map currently displayed on the screen or the state change of the camera, without performing collision detection for every rendered frame of the map, which reduces the resource consumption of the user equipment.
Optionally, the camera state change includes a direction state change and a position state change;
If the difference between the orientation angle of the camera of the electronic map to be displayed and that of the camera of the currently displayed electronic map is greater than a preset angle threshold, the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map;
or, if the distance from the position of the camera of the electronic map to be displayed to the position of the camera of the currently displayed electronic map is greater than a preset distance threshold, the map elements in the electronic map to be displayed differ from those in the currently displayed electronic map.
Here, the distance refers to a pixel distance on the screen, and a change in direction state is, for example, a change in the orientation angle.
Specifically, if the difference between the orientation angle of the camera of the electronic map to be displayed and that of the camera of the currently displayed electronic map is greater than the preset angle threshold, the change in the camera's orientation has reached a certain value, that is, the change data of the camera parameters between the two scenes meets the preset change condition, and collision detection is executed. Likewise, if the distance from the position of the camera of the electronic map to be displayed to that of the currently displayed electronic map is greater than the preset distance threshold, the camera movement has reached a certain value, the change data of the camera parameters between the two scenes meets the preset change condition, and collision detection is executed.
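The two threshold checks can be sketched as follows; the threshold values follow the examples given later in this application (1 degree of orientation, 2 pixels of movement), and the dictionary camera representation is an assumption:

```python
ANGLE_THRESHOLD_DEG = 1.0  # preset angle threshold (example value)
PIXEL_THRESHOLD = 2.0      # preset distance threshold in screen pixels

def map_elements_differ(cam_to_display, cam_current):
    """Map elements are considered different when the camera's
    orientation angle or its on-screen position moves past a threshold."""
    angle_delta = abs(cam_to_display["yaw_deg"] - cam_current["yaw_deg"])
    dx = cam_to_display["px"] - cam_current["px"]
    dy = cam_to_display["py"] - cam_current["py"]
    pixel_delta = (dx * dx + dy * dy) ** 0.5
    return angle_delta > ANGLE_THRESHOLD_DEG or pixel_delta > PIXEL_THRESHOLD
```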
Optionally, the method for detecting collision of map elements may further determine whether to update the point of interest, and the method further includes:
acquiring the state of the electronic map when the interest point is triggered and updated in the scene of the electronic map rendered last time, the moment when the interest point is triggered and updated in the scene of the electronic map rendered last time and the current moment;
and determining whether to update the interest point according to the state of the current electronic map, the state of the electronic map when the interest point is updated by the last trigger, and the time interval between the time of the last trigger for updating the interest point and the current time.
In the embodiment of the present application, whether to update the POIs is determined by analyzing the scene of the last-rendered electronic map together with the state of the current electronic map. The scene of the last-rendered electronic map may include the state of the electronic map when the POI update was last triggered, the moment when the POI update was last triggered, and so on; whether to update the POIs is judged based on the state of the current electronic map combined with the state of the electronic map at the last triggered update and/or the time interval between the moment of the last triggered update and the current moment, for example through freely combined strategies such as whether the time interval reaches a preset interval threshold and whether the change data of the camera parameters between the two scenes meets a preset change condition. If it is determined to update the POIs, an instruction for reading the depth data is sent to the logic thread by the rendering thread, and the logic thread reads the depth data and decides whether to render the POIs in combination with the updated POI information.
If it is determined to update the POIs, the depth data of the building is read and compared with the depth values of the POIs to decide whether to render and display the POIs; since the depth data in the rendering data is not used every frame, the GPU performance consumption caused by frequently fetching depth data to update the scene is reduced.
Optionally, determining whether to update the interest point according to the state of the current electronic map, the state of the electronic map when the interest point is updated by the last trigger, and the time interval between the time of the last trigger to update the interest point and the current time may be implemented in at least two ways:
Mode 1: whether to update the POIs is determined based on the state of the current electronic map and the state of the electronic map when the POI update was last triggered (for example, through a strategy combining the time interval between the moment of the last triggered update and the current moment with the change data of the camera parameters between the two scenes). The state combinations are: the map elements of the current electronic map are unstable and were unstable at the last triggered update; the map elements of the current electronic map are stable and were stable at the last triggered update; the map elements of the current electronic map are unstable and were stable at the last triggered update; and the map elements of the current electronic map are stable and were unstable at the last triggered update.
If the map elements of the current electronic map are unstable and the map elements of the electronic map were unstable at the last triggered update, the time interval is compared with the preset interval threshold, and if the time interval is greater than or equal to the preset interval threshold, it is determined to update the POIs;
if the map elements of the current electronic map are unstable and were stable at the last triggered update, or if the map elements of the current electronic map are stable and were stable at the last triggered update, then when the change data of the camera parameters between the two scenes meets the preset change condition, or when it does not meet the preset change condition but the time interval is greater than or equal to the preset interval threshold, it is determined to execute collision detection (that is, to update the POIs);
if the map elements of the current electronic map are stable and were unstable at the last triggered update, it is determined to update the POIs.
Specifically, whether to update the POIs, that is, whether a read depth data command needs to be sent, is judged on the following bases:
a: the time interval. If the time interval between sending two read commands (or between two triggered POI updates) is smaller than a preset interval threshold, such as 400 milliseconds, the POIs are not updated, no read depth data command is sent, and the scene is not updated;
b: whether the current scene is in a stable state. If so, the POIs are updated and a read depth data command is sent regardless of whether the interval has reached 400 milliseconds, and the scene is then updated; if not, whether to update is judged according to item a.
If the current scene has just entered a stable state for the first time, that is, the state of the previous scene was a changing state (here, the map elements of the electronic map were unstable) while the state of the current scene is a static state, then in order to keep the scene and the depth data consistent, the timing of updating the POIs and of reading the depth data is determined so as to update the scene.
c: whether the camera of the scene is moving. The camera parameters are recorded when the two scenes (here, the current scene and the scene of the last-rendered electronic map) enter the static state and send the read command; if the camera has moved but the change in direction has not reached a certain value, that is, the change data of the camera parameters between the two scenes does not meet the preset change condition, no update is performed, and otherwise the update is performed.
Illustratively, first, the time interval is used to avoid sending a read depth data command every frame; second, when a read depth data command is sent, the current camera state is recorded, and the next time a command is about to be sent, the camera state at that moment is compared with the recorded state. If the camera's state (position and orientation) has not changed much (the moving distance does not exceed 2 pixels, the orientation does not change by more than 1 degree, and the zoom level does not change by more than 0.1), no read depth data command is sent and the POIs are not updated.
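Criteria a to c above can be combined into one decision function; this is a condensed sketch (the 400 ms threshold comes from the text, while the function signature and boolean inputs are assumptions):

```python
MIN_INTERVAL_MS = 400  # preset interval threshold from criterion a

def should_update_poi(now_ms, last_update_ms, scene_stable, camera_changed):
    """Decide whether to update the POIs and send a read-depth command.

    b: a stable scene always updates, regardless of the interval;
    a: otherwise, updates within the minimum interval are suppressed;
    c: beyond the interval, the camera change must additionally meet
       the preset change condition.
    """
    if scene_stable:
        return True
    if now_ms - last_update_ms < MIN_INTERVAL_MS:
        return False
    return camera_changed
```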
Thus, mode 1 assigns different processing logic or processing strategies to specific scenes in different states, based on the state of the current electronic map and the state of the electronic map at the last triggered point-of-interest update.
Mode 2: a strategy for deciding whether to update the POI is determined based on whether the state of the current electronic map is the same as the state of the electronic map at the last triggered point-of-interest update, after which it is determined whether to update the POI and read the depth data.
If the state of the current electronic map differs from the state of the electronic map at the last triggered point-of-interest update, it is determined that the interest point is to be updated;
if the two states are the same, and the time interval is greater than or equal to a preset interval threshold and/or the change in the camera parameters between the two scenes meets a preset change condition, it is determined that the interest point is to be updated.
Specifically, mode 2 is described in detail through the following three implementations:
mode 21: if the two states differ, the POI update is determined directly; if the two states are the same, it is judged whether the time interval has reached the preset threshold, and the POI is updated if so and not updated otherwise;
mode 22: if the two states differ, the POI update is determined directly; if the two states are the same, it is judged whether the camera parameter change between the two scenes meets the preset change condition, and the POI is updated if so and not updated otherwise;
mode 23: if the two states differ, the POI update is determined directly; if the two states are the same, it is judged both whether the camera parameter change between the two scenes meets the preset change condition and whether the time interval has reached the preset threshold; the POI is updated only if both conditions are met, and not updated if either condition is not.
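The three implementations above reduce to a small decision table; a minimal sketch follows, where the function name and boolean-flag interface are hypothetical:

```python
def should_update_poi(mode, same_state, interval_ok, camera_changed):
    """Decide whether to update the POI under modes 21-23.

    same_state:     the current map state equals the state at the last update
    interval_ok:    the time interval has reached the preset threshold
    camera_changed: camera-parameter changes met the preset change condition
    """
    if not same_state:
        return True                            # states differ: always update
    if mode == 21:
        return interval_ok                     # same state: check the interval only
    if mode == 22:
        return camera_changed                  # same state: check the camera only
    if mode == 23:
        return interval_ok and camera_changed  # same state: both must hold
    raise ValueError(f"unknown mode: {mode}")
```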
Thus, the three implementations of mode 2 determine the corresponding processing logic based on whether the two states are the same; the logic is simple to implement and, compared with prior-art processing that performs no refined scene analysis when the map elements are stable, improves rendering accuracy.
Optionally, before reading the depth data of the map elements of the preset type from the rendering data of the electronic map to be displayed, the method further comprises: determining the time for triggering the interest-point update and the time for reading the depth data.
The determining the time for triggering the interest point update and the time for reading the depth data comprises the following steps:
triggering the interest-point update when the number of static frames counted after the current electronic map enters the stable state of its map elements reaches N frames, and recording the moment at which the interest-point update is currently triggered;
in the (N+1)th frame, sending the RGB data to a logic processing unit through a display chip, reading the RGB data through the logic processing unit, and converting the RGB data into depth data to determine whether the interest point is rendered in the map.
The moment at which the interest point is currently triggered is recorded so that the time interval can be calculated at the next trigger and used as a basis for deciding whether to update the POI.
Specifically, the depth data is forcibly updated (updating the depth data means transferring it from the GPU to the memory) when the current electronic map is in a changing state and the electronic map was in a stable state at the last triggered point-of-interest update (here, when the picture first enters the stable state), or when both the current electronic map and the electronic map at the last triggered update are in a stable state and a scene update is determined. After the stable state is entered, the depth-data update is performed in the rendering thread, which means the update takes effect one frame later; the POI update time (that is, the moment of triggering the point-of-interest update) is therefore placed one frame after entering the stable state. This process is implemented by counting the number of frames after stabilization:
after the stable state is entered, the stable frames are counted; when N frames have been stable (for example, 3 frames, since the POI data-collection interval is 4 frames), the POI update is triggered. For example, a logic thread triggers the POI update in the third frame and applies to the rendering thread for depth data, then reads the depth data in the fourth frame; after the depth data is read, logic processing is performed based on the updated POI information and the read depth data. This ensures that the depth data is consistent with the scene at that moment and that the POI collision is correct. For example, when the service detects that the electronic map has shown the same picture for three frames, it sends a command to update the depth data and then reads the data in the fourth frame; this third-frame/fourth-frame split solves the problem of cross-thread data synchronization.
Since the rendering operation is performed in a rendering thread (here, processed by a GPU) while whether to display the POI is determined in a logic thread (here, processed by a CPU), the depth data of the rendered buildings must be handed to the CPU for processing. When the service detects that the electronic map has shown the same picture for N frames, it applies to read the depth data in the Nth frame, that is, the logic thread tells the rendering thread that a read-depth-data command can be sent, and then queries and processes the data in the (N+1)th frame. This solves the data-synchronization problem, so reasonable management of the depth data is achieved through cross-thread data synchronization; and by controlling the POI update and the depth-data reading time, the POI can collide accurately with objects.
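The frame-counting hand-off described above can be sketched as a small state machine. The class name and the `scene_changed` flag are hypothetical, and N = 3 follows the example in the description:

```python
N = 3  # frames of identical picture before triggering the update (example value)

class PoiUpdateScheduler:
    """Counts stable frames; triggers the POI update at frame N and
    schedules the depth read for frame N+1 (the cross-thread hand-off)."""

    def __init__(self, n=N):
        self.n = n
        self.stable_frames = 0
        self.read_pending = False

    def on_frame(self, scene_changed):
        """Call once per rendered frame; returns the action for this frame."""
        if scene_changed:
            self.stable_frames = 0       # any change resets the count
            self.read_pending = False
            return "none"
        self.stable_frames += 1
        if self.read_pending:
            self.read_pending = False
            return "read_depth"          # frame N+1: the logic thread reads
        if self.stable_frames == self.n:
            self.read_pending = True
            return "trigger_update"      # frame N: apply to the rendering thread
        return "none"
```

In this sketch, `trigger_update` corresponds to the logic thread telling the rendering thread that a read command can be sent, and `read_depth` to the query one frame later.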
In addition, in some cases, for example when switching between landscape and portrait screens or when no building collides with a POI, the data associated with the depth data of some buildings needs to be released. To ensure that other threads can still operate correctly during this release, the rendering thread converts the depth data into RGB color data during rendering and stores it in the video memory. When the associated data is released, the logic processing thread reads the RGB color data of the corresponding scene, converts it back into depth data, and performs logic processing on that depth data, so the operations of the two threads do not affect each other.
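A minimal sketch of packing a depth value into RGB channels and back, as the logic thread might do when re-reading the stored color data. The 24-bit packing scheme is an assumption for illustration; the description does not specify the encoding.

```python
def depth_to_rgb(depth):
    """Pack a depth value in [0, 1) into three 8-bit color channels."""
    v = int(depth * (1 << 24))            # scale to a 24-bit integer
    return (v >> 16) & 0xFF, (v >> 8) & 0xFF, v & 0xFF

def rgb_to_depth(r, g, b):
    """Recover the depth value from the three packed channels."""
    return ((r << 16) | (g << 8) | b) / (1 << 24)
```

The round trip loses at most 1/2^24 of precision, which is why reading RGB data back and converting it again yields depth values usable for collision logic.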
Optionally, determining whether to render the POI includes:
acquiring depth data of the updated interest points and positions of pixels of the updated interest points in the map;
and determining whether to render the interest point in the map according to the updated depth data of the interest point, the position of the pixel of the updated interest point in the map and the depth data.
Obtaining the depth data of the updated interest point may consist of calculating it from the obtained coordinates of the updated interest point in the world coordinate system, using the parameters (such as position and orientation) of the camera in the current electronic map.
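For illustration, the depth of a POI can be computed from its world coordinates as its distance along the camera's view direction. The function and parameter names are hypothetical, and a simple linear view-space depth is assumed rather than any particular projection:

```python
def poi_depth(poi_world, cam_pos, cam_forward):
    """Depth of a POI: its distance along the camera's view direction.

    poi_world and cam_pos are (x, y, z) points in the world coordinate
    system; cam_forward is the camera's unit view-direction vector.
    """
    dx = poi_world[0] - cam_pos[0]
    dy = poi_world[1] - cam_pos[1]
    dz = poi_world[2] - cam_pos[2]
    # Project the camera-to-POI vector onto the view direction.
    return dx * cam_forward[0] + dy * cam_forward[1] + dz * cam_forward[2]
```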
Specifically, based on the position of the pixel where the updated POI is located and the updated depth data of the POI, combined with the updated depth data of the buildings, it is judged whether the POI is occluded by a building. If it is, the POI is not displayed; if it is not, collision processing is performed, and finally the building and the POI are rendered and displayed.
By reading the depth data of the buildings and updating the POI, the POI's depth value is compared with that of the buildings, that is, their positions relative to the camera are compared, so whether the POI is occluded by a building is judged intuitively and accurately, the collision processing is accurate, and the rendering result is accurate.
Optionally, determining whether to render the interest point in the map according to the updated depth data of the interest point, the position of the pixel of the updated interest point in the map and the depth data includes:
acquiring depth data of a pixel where the interest point is located from the depth data according to the position of the pixel in the map of the updated interest point; the depth data of the pixel where the interest point is located is used for representing the depth data of a collision part with the interest point in the target object;
comparing the depth data of the pixel where the interest point is located with the depth data of the interest point to determine whether the target object and the interest point are shielded;
if no shielding exists, rendering the interest point in the map by performing collision processing on the target object and the interest point; if there is occlusion, the point of interest is not rendered in the map.
Specifically, the GPU first places the RGB color data in the memory; the CPU then reads the RGB color data from the memory and converts it into depth values, a process that can be understood as updating the depth data. The position of the pixel where the POI is located is used to look up the corresponding depth data in the read depth data, namely the depth value of the building that collides with the POI; comparing this value with the depth value of the POI makes it possible to judge intuitively and accurately the positions of the POI and the building relative to the camera in the current scene.
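The comparison step can be sketched as follows. The dictionary-based depth buffer and the depth convention (smaller values are closer to the camera) are assumptions for illustration:

```python
def is_occluded(poi_pixel, poi_depth_value, depth_buffer, eps=1e-4):
    """Judge whether a building occludes the POI at its pixel.

    depth_buffer maps (x, y) pixel coordinates to the depth read back
    from the rendered scene; smaller depth = closer to the camera.
    """
    scene_depth = depth_buffer.get(poi_pixel)
    if scene_depth is None:
        return False                            # nothing rendered at that pixel
    return poi_depth_value > scene_depth + eps  # a building lies in front of the POI
```

If this returns true the POI is not displayed; otherwise collision processing proceeds and the POI is rendered.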
Thus, the present application obtains depth data by converting depth values into RGB color values and achieves the effect of POI-building collision under multithreading by reading the depth-map data. By exploiting the movement of the camera, the frequency of reading depth-map data is reduced and performance is improved; by controlling the POI update and the depth-data reading time, the POI collides accurately with the scene, the rendering result is accurate, and the user's service experience is improved.
Corresponding to the above collision detection method for map elements, an embodiment of the present application provides a collision detection device for map elements. Fig. 5 is a schematic structural diagram of the collision detection device for map elements, and the device includes:
a first detection module 501, configured to detect a state change of an electronic map currently displayed on a screen;
a second detection module 502, configured to trigger a collision detection module at least when the state change indicates that map elements in an electronic map to be displayed differ from map elements in the currently displayed electronic map and the map elements in the electronic map to be displayed are stable, and the electronic map to be displayed is displayed on the screen;
The collision detection module 503 includes:
a depth data reading unit, configured to read depth data of map elements of a preset type from rendering data of an electronic map to be displayed, where the preset type does not include interest points;
a depth data calculation unit for determining depth data of interest points in the electronic map to be displayed;
and the depth data comparison unit is used for comparing the depth data of the map elements of the preset type with the depth data of the interest points to obtain the interest points to be displayed in the electronic map to be displayed.
The collision detection device for map elements provided in the embodiment of the present application may be used to implement the technical solutions of the embodiments shown in fig. 1 to 4, and the implementation principle and the technical effects are similar, which are not repeated here.
Optionally, the apparatus further comprises: a third detection module;
the third detection module is configured to: upon detecting that the map elements in the electronic map to be displayed differ from the map elements in the currently displayed electronic map, detect the change duration; execute the collision detection once when the change duration reaches a preset detection duration but the map elements in the electronic map to be displayed are not yet stable; return to the step of detecting the change duration if the map elements remain unstable, until the map elements in the electronic map to be displayed are stable; and trigger the collision detection module when the electronic map to be displayed is displayed on the screen.
Optionally, the first detection module 501 is specifically configured to:
and detecting the state change of the scale of the electronic map currently displayed on the screen or the state change of the camera.
Optionally, the scale state change is a change from a target scale to a scale different from the target scale, where the target scale is the scale of the currently displayed electronic map; this indicates that the map elements in the electronic map to be displayed differ from the map elements in the currently displayed electronic map.
Optionally, the camera state change includes a direction state change and a position state change;
if the difference between the orientation angle of the camera of the electronic map to be displayed and the orientation angle of the camera of the currently displayed electronic map is greater than a preset angle threshold, the map elements in the electronic map to be displayed differ from the map elements in the currently displayed electronic map;
or, if the distance from the position of the camera of the electronic map to be displayed to the position of the camera of the currently displayed electronic map is greater than a preset distance threshold, this indicates that the map elements in the electronic map to be displayed differ from the map elements in the currently displayed electronic map.
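As a sketch, the two checks can be combined into one predicate. The threshold values and the camera representation are hypothetical, since the description leaves both open:

```python
import math

# Hypothetical preset thresholds; the description does not fix the values.
ANGLE_THRESHOLD_DEG = 5.0
DISTANCE_THRESHOLD = 50.0

def map_elements_differ(cam_to_show, cam_current):
    """True when the camera turned or moved enough that the map elements of
    the map to be displayed differ from those of the currently displayed map.

    Each argument is a dict with a "heading" angle in degrees and a 2-D
    "pos" tuple; this representation is an assumption for illustration.
    """
    angle_delta = abs(cam_to_show["heading"] - cam_current["heading"])
    dx = cam_to_show["pos"][0] - cam_current["pos"][0]
    dy = cam_to_show["pos"][1] - cam_current["pos"][1]
    return angle_delta > ANGLE_THRESHOLD_DEG or math.hypot(dx, dy) > DISTANCE_THRESHOLD
```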
Optionally, the collision detection device of the map element further includes: a generation module for:
generating, based on map data of an electronic map to be displayed, rendering data of map elements included in the map data, the rendering data including: depth data corresponding to a pixel point.
Optionally, the graphics processor GPU is configured to generate rendering data of map elements included in map data based on map data of an electronic map to be displayed; and the CPU is used for reading the depth data of the map elements of the preset type from the rendering data of the electronic map to be displayed, which is generated by the GPU.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 6, the electronic device of the present embodiment may include:
at least one processor 601; and
a memory 602 communicatively coupled to the at least one processor;
wherein the memory 602 stores instructions executable by the at least one processor 601, the instructions being executable by the at least one processor 601 to cause the electronic device to perform the method according to any one of the embodiments described above.
The memory 602 may be separate from or integrated with the processor 601. Optionally, the memory 602 may be coupled to the processor 601 through the bus 603.
The implementation principle and technical effects of the electronic device provided in this embodiment may be referred to the foregoing embodiments, and will not be described herein again.
The embodiment of the application further provides a computer readable storage medium, in which computer executable instructions are stored, which when executed by a processor, implement the method according to any of the previous embodiments.
Embodiments of the present application also provide a computer program product comprising a computer program which, when executed by a processor, implements the method according to any of the preceding embodiments.
In the technical solution of the present application, the collection, storage, use, processing, transmission, provision, disclosure, and other handling of user and merchant information comply with the relevant laws and regulations and do not violate public order and good customs.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules may be combined or integrated into another system, or some features may be omitted or not performed.
The integrated modules, which are implemented in the form of software functional modules, may be stored in a computer readable storage medium. The software functional modules described above are stored in a storage medium and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or processor to perform some of the steps of the methods described in various embodiments of the present application.
It should be appreciated that the processor may be a central processing unit (Central Processing Unit, CPU for short), other general purpose processors, digital signal processor (Digital Signal Processor, DSP for short), application specific integrated circuit (Application Specific Integrated Circuit, ASIC for short), etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in a processor for execution. The memory may comprise a high-speed RAM memory, and may further comprise a non-volatile memory NVM, such as at least one magnetic disk memory, and may also be a U-disk, a removable hard disk, a read-only memory, a magnetic disk or optical disk, etc.
The storage medium may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Alternatively, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application specific integrated circuit (Application Specific Integrated Circuit, ASIC for short). The processor and the storage medium may also reside as discrete components in an electronic device or a master device.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing embodiment numbers of the present application are merely for describing, and do not represent advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or by means of hardware, although in many cases the former is the preferred embodiment. Based on this understanding, the technical solution of the present application, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk, or optical disk), comprising several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present application.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the claims, and all equivalent structures or equivalent processes using the descriptions and drawings of the present application, or direct or indirect application in other related technical fields are included in the scope of the claims of the present application.

Claims (11)

1. A collision detection method of map elements, wherein the method comprises:
detecting a state change of an electronic map currently displayed on a screen, and executing collision detection at least when the electronic map to be displayed is displayed on the screen when the state change indicates that map elements in the electronic map to be displayed are different from map elements in the electronic map currently displayed and the map elements in the electronic map to be displayed are stable;
the collision detection includes: reading depth data of map elements of a preset type from rendering data of an electronic map to be displayed, wherein the preset type does not comprise interest points;
determining depth data of interest points in an electronic map to be displayed;
and comparing the depth data of the map elements of the preset type with the depth data of the interest points to obtain the interest points to be displayed in the electronic map to be displayed.
2. The method of claim 1, wherein upon detecting that the map element in the electronic map to be displayed and the map element in the currently displayed electronic map are different, the method further comprises:
detecting the change duration, executing collision detection once when the change duration reaches a preset detection duration but the map elements in the electronic map to be displayed are unstable, returning to the step of detecting the change duration if the map elements in the electronic map to be displayed are unstable, and executing collision detection when the electronic map to be displayed is displayed on the screen until the map elements in the electronic map to be displayed are stable.
3. The method of claim 1, wherein detecting the state change of the electronic map currently displayed on the screen specifically comprises:
and detecting the state change of the scale of the electronic map currently displayed on the screen or the state change of the camera.
4. A method according to claim 3, wherein the scale state change is a change from a target scale to a scale different from the target scale, the target scale being the scale of the currently displayed electronic map, indicating that map elements in the electronic map to be displayed differ from map elements in the currently displayed electronic map.
5. A method according to claim 3, wherein the camera state changes include a direction state change and a position state change;
if the difference between the orientation angle of the camera of the electronic map to be displayed and the orientation angle of the camera of the currently displayed electronic map is greater than a preset angle threshold, the map elements in the electronic map to be displayed differ from the map elements in the currently displayed electronic map;
or, if the distance from the position of the camera of the electronic map to be displayed to the position of the camera of the currently displayed electronic map is greater than a preset distance threshold, this indicates that the map elements in the electronic map to be displayed differ from the map elements in the currently displayed electronic map.
6. The method of any one of claims 1-5, wherein the method further comprises:
generating, based on map data of an electronic map to be displayed, rendering data of map elements included in the map data, the rendering data including: depth data corresponding to a pixel point.
7. The method of claim 6, wherein the graphics processor GPU generates rendering data for map elements included in map data based on map data for an electronic map to be displayed;
the reading of the depth data of the map elements of the preset type from the rendering data of the electronic map to be displayed specifically includes:
and the CPU reads the depth data of map elements of a preset type from the rendering data of the electronic map to be displayed, which is generated by the GPU.
8. A collision detection device for map elements, comprising:
the first detection module is used for detecting the state change of the electronic map currently displayed on the screen;
the second detection module is used for triggering the collision detection module at least when the state change indicates that the map elements in the electronic map to be displayed are different from the map elements in the electronic map to be displayed currently and the map elements in the electronic map to be displayed are stable;
The collision detection module includes:
a depth data reading unit, configured to read depth data of map elements of a preset type from rendering data of an electronic map to be displayed, where the preset type does not include interest points;
a depth data calculation unit for determining depth data of interest points in the electronic map to be displayed;
and the depth data comparison unit is used for comparing the depth data of the map elements of the preset type with the depth data of the interest points to obtain the interest points to be displayed in the electronic map to be displayed.
9. The apparatus of claim 8, wherein the apparatus further comprises: a third detection module;
the third detection module is configured to: upon detecting that the map elements in the electronic map to be displayed differ from the map elements in the currently displayed electronic map, detect the change duration; execute the collision detection once when the change duration reaches a preset detection duration but the map elements in the electronic map to be displayed are not yet stable; return to the step of detecting the change duration if the map elements remain unstable, until the map elements in the electronic map to be displayed are stable; and trigger the collision detection module when the electronic map to be displayed is displayed on the screen.
10. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to cause the electronic device to perform the method of any one of claims 1-7.
11. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the method of any of claims 1-7.
CN202310253343.8A 2023-03-10 2023-03-10 Collision detection method, device, equipment and program product for map elements Pending CN116363082A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310253343.8A CN116363082A (en) 2023-03-10 2023-03-10 Collision detection method, device, equipment and program product for map elements

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310253343.8A CN116363082A (en) 2023-03-10 2023-03-10 Collision detection method, device, equipment and program product for map elements

Publications (1)

Publication Number Publication Date
CN116363082A true CN116363082A (en) 2023-06-30

Family

ID=86913254

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310253343.8A Pending CN116363082A (en) 2023-03-10 2023-03-10 Collision detection method, device, equipment and program product for map elements

Country Status (1)

Country Link
CN (1) CN116363082A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117171277A (en) * 2023-09-08 2023-12-05 北京唯得科技有限公司 Method, system, equipment and medium for loading marks of electronic map
CN117171277B (en) * 2023-09-08 2024-04-30 北京唯得科技有限公司 Method, system, equipment and medium for loading marks of electronic map
CN117909008A (en) * 2023-12-25 2024-04-19 北京宇天恒瑞科技发展有限公司 Map display method, map display device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
US11650708B2 (en) System and method of indicating the distance or the surface of an image of a geographical object
CN116363082A (en) Collision detection method, device, equipment and program product for map elements
US8803992B2 (en) Augmented reality navigation for repeat photography and difference extraction
KR101295712B1 (en) Apparatus and Method for Providing Augmented Reality User Interface
US8654151B2 (en) Apparatus and method for providing augmented reality using synthesized environment map
US20210248817A1 (en) Data processing method and apparatus
JP2018525702A (en) System, method and apparatus for data processing and display
JP2012221250A (en) Image processing system, display control method and program
CN111882583B (en) Moving object detection method, device, equipment and medium
US20160098863A1 (en) Combining a digital image with a virtual entity
CN109587031A (en) Data processing method
KR101996241B1 (en) Device and method for providing 3d map representing positon of interest in real time
CN113722043A (en) Scene display method and device for AVP, electronic equipment and storage medium
CN111275611B (en) Method, device, terminal and storage medium for determining object depth in three-dimensional scene
CN115018967B (en) Image generation method, device, equipment and storage medium
US11756267B2 (en) Method and apparatus for generating guidance among viewpoints in a scene
CN111798573B (en) Electronic fence boundary position determination method and device and VR equipment
KR101912241B1 (en) Augmented reality service providing apparatus for providing an augmented image relating to three-dimensional shape of real estate and method for the same
EP4075789A1 (en) Imaging device, imaging method, and program
US6333740B1 (en) Image processing apparatus
KR101662214B1 (en) Method of providing map service, method of controlling display, and computer program for processing thereof
CN114187509A (en) Object positioning method and device, electronic equipment and storage medium
CN110910482A (en) Method, system and readable storage medium for organizing and scheduling video data
CN112862976A (en) Image generation method and device and electronic equipment
US20230251706A1 (en) Method and apparatus for acquiring object's attention information and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination