CN113748418A - System and method for loading and tracking maps on a vehicle - Google Patents

System and method for loading and tracking maps on a vehicle

Info

Publication number
CN113748418A
CN113748418A (application number CN201880100669.4A)
Authority
CN
China
Prior art keywords
map data
boundary
vehicle
updated
geographic location
Prior art date
Legal status
Granted
Application number
CN201880100669.4A
Other languages
Chinese (zh)
Other versions
CN113748418B
Inventor
侯庭波
项国民
Current Assignee
Beijing Voyager Technology Co Ltd
Original Assignee
Beijing Voyager Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Voyager Technology Co Ltd
Publication of CN113748418A
Application granted
Publication of CN113748418B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863 Structures of map data
    • G01C21/387 Organisation of map data, e.g. version management or database structures
    • G01C21/3881 Tile-based structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3667 Display of a road map

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method and system for loading and tracking maps on a moving vehicle. A method includes obtaining a geographic location of a system on a vehicle, obtaining a boundary corresponding to a contiguous geographic boundary area around the geographic location of the system, and loading map data including a plurality of map data blocks, each map data block covering a portion of the geographic boundary area, the plurality of map data blocks including a center block having a point corresponding to the system location and surrounding map data blocks. The method also includes obtaining an updated system location and, if the updated geographic location is outside the boundary area, obtaining an updated boundary centered on the updated geographic location and loading map data based on the updated boundary, such that the resulting loaded map data includes a center block having a point corresponding to the updated location and surrounding map data blocks that intersect the geographic boundary area.

Description

System and method for loading and tracking maps on a vehicle
Technical Field
The present disclosure relates generally to processing map data in a vehicle, and in particular to dynamically loading map data into a computer system memory on a vehicle for controlling the vehicle.
Background
Current solutions for autonomous driving, driver assistance features, and/or shared travel services rely heavily on data-rich maps, which may be referred to as high-resolution (HD) maps. These HD maps may have extremely high accuracy, including centimeter-level geographic and object data, and provide the vehicle and/or driver with information that, together with real-time sensor data, can be used to generate precise control instructions for maneuvering in real-world space. Because of this resolution and level of detail, HD maps can be very large. When many HD maps are needed to accurately cover a route of any significant distance, a vehicle may have to load HD map data continuously as it travels. Given the bandwidth and security issues involved in continuously transferring large amounts of data to mobile systems over current wireless networks (e.g., 4G networks), it would be advantageous to have a secure system that can provide large HD maps without relying on the transmission capabilities of a wireless network to deliver the HD map at the moment the vehicle needs it.
Disclosure of Invention
The systems, methods, and devices of the present invention each have several aspects (features), no single one of which is solely responsible for their desirable attributes. Without limiting the scope of the invention as expressed by the claims that follow, some of these aspects are briefly described below.
One innovative aspect includes a method of loading vehicle information (e.g., High Definition (HD) map data) that can be implemented on a device of a moving vehicle having at least one processor, a memory, and a storage component coupled to the processor. In one example, the method includes: obtaining, by the at least one processor, a geographic location of the device; obtaining a boundary corresponding to a contiguous geographic boundary area around the geographic location of the device; and loading map data comprising a plurality of map data blocks from the storage component into the memory of the device, each of the plurality of map data blocks covering a portion of the geographic boundary area, the geographic boundary area corresponding to a portion of the loaded map data, wherein the plurality of map data blocks comprises a center block having a point corresponding to the geographic location of the device and surrounding map data blocks, and wherein the boundary is centered on the center block and sized such that the geographic boundary area intersects the surrounding map data blocks. The method may further comprise: obtaining, by the at least one processor, an updated geographic location of the device while the vehicle is in motion; determining the location of the updated geographic location relative to the boundary area; and, in response to determining that the updated geographic location is outside of the boundary area, obtaining an updated boundary corresponding to an updated geographic area centered on the updated geographic location, and loading map data from the storage component into the memory of the device such that the resulting loaded map data comprises a center block having a point corresponding to the updated geographic location of the device, and map data blocks surrounding the center block that intersect the geographic boundary area.
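For illustration only, the following minimal sketch (in Python, using hypothetical names such as blocks_to_load and boundary_for that do not appear in this disclosure) shows one way the block indexing and boundary described above could be organized, assuming square 200 m blocks and a 400 m boundary as in the examples given later in the description.

```python
from dataclasses import dataclass

BLOCK_SIZE_M = 200.0      # assumed block edge length (matches the 200 m x 200 m example below)
BOUNDARY_SIZE_M = 400.0   # assumed boundary edge length (between one and three block widths)

@dataclass(frozen=True)
class Boundary:
    cx: float    # boundary center in map coordinates (meters)
    cy: float
    half: float  # half of the boundary edge length

    def contains(self, x: float, y: float) -> bool:
        return abs(x - self.cx) <= self.half and abs(y - self.cy) <= self.half

def center_block_index(x: float, y: float) -> tuple[int, int]:
    """Index of the map data block whose area contains the point (x, y)."""
    return int(x // BLOCK_SIZE_M), int(y // BLOCK_SIZE_M)

def blocks_to_load(x: float, y: float) -> list[tuple[int, int]]:
    """Center block plus the eight surrounding blocks (a 3 x 3 arrangement)."""
    ci, cj = center_block_index(x, y)
    return [(ci + di, cj + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)]

def boundary_for(x: float, y: float) -> Boundary:
    """Boundary centered on the center block containing (x, y)."""
    ci, cj = center_block_index(x, y)
    return Boundary((ci + 0.5) * BLOCK_SIZE_M, (cj + 0.5) * BLOCK_SIZE_M, BOUNDARY_SIZE_M / 2.0)
```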
The methods described herein may have one or more other aspects (features) of the various method embodiments, some of which are mentioned herein. However, as one of ordinary skill in the art will recognize, various embodiments of such methods may have additional or fewer aspects, and the aspects disclosed herein may be used together in various embodiments, even if not specifically illustrated or described as being in a certain embodiment. For example, in one aspect, the surrounding map data blocks are adjacent to the center block. In another aspect, the map data block includes elevation information. In another aspect, the map data blocks include intensity information. In another aspect, the geographic boundary region corresponds to a region including the center block and at least a portion of the map data block that adjoins the center block. In another aspect, the boundary is rectangular. In another aspect, each map data block includes a width dimension and a length dimension, and the boundary includes a width dimension and a length dimension, the boundary width dimension being between one and three times the width dimension of each map data block, and the boundary length dimension being between one and three times the length dimension of each map data block.
In another aspect of the method, the loaded map data includes 9 map data blocks. In another aspect, each of the 9 map data blocks is of equal size. In another aspect, the 9 map data blocks include a center map data block and eight surrounding map data blocks. In another aspect, the map data blocks include a center map data block and more than eight surrounding map data blocks. In another aspect, the vehicle is an autonomous vehicle. In another aspect, the boundary may be non-rectangular. For example, the boundary may be broader (i.e., cover more area) over a region representing the direction in which the vehicle is moving, is about to move, or is expected to move. In another aspect, the sizes of the boundary and the updated boundary are predetermined. In another aspect, obtaining the updated boundary includes dynamically determining the updated boundary. In another aspect, dynamically determining the updated boundary includes obtaining a speed of the moving vehicle and determining a size of the boundary based on the speed. In another aspect, dynamically determining the updated boundary includes obtaining a velocity of the moving vehicle and determining a shape of the boundary based on the velocity. Some of these methods may further include determining a direction of motion representing the direction in which the vehicle is moving, wherein the boundary extends further from the updated geographic location of the device in the direction of motion than it extends in other directions. A variety of storage devices suitable for storing and transferring large High Definition (HD) maps may be used; in some embodiments, the storage device may include an optical disk drive or a magnetic hard disk drive, and other types of drives may also be used. In some embodiments, the storage device may comprise a non-removable storage device (e.g., RAM or DRAM).
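The dynamic-boundary aspects above (a boundary whose size grows with speed and whose shape extends further in the direction of motion) might be sketched as follows; the scaling factors and the function name are illustrative assumptions, not values taken from this disclosure.

```python
import math

def dynamic_boundary(cx: float, cy: float, speed_mps: float, heading_rad: float,
                     base_half_m: float = 200.0, lookahead_s: float = 10.0):
    """Return an axis-aligned box (xmin, ymin, xmax, ymax) around the vehicle.

    The box grows by roughly the distance covered in `lookahead_s` seconds and its
    center is shifted forward along the heading, so the boundary covers more area
    ahead of the vehicle than behind it.
    """
    extra = speed_mps * lookahead_s                # extra reach that scales with speed
    half = base_half_m + 0.5 * extra               # boundary size grows with speed
    fx = cx + 0.5 * extra * math.cos(heading_rad)  # shift the center toward the heading
    fy = cy + 0.5 * extra * math.sin(heading_rad)
    return (fx - half, fy - half, fx + half, fy + half)
```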
In another aspect of the method, each map data block represents an area having a width dimension of less than 1000 meters and a length dimension of less than 1000 meters. In another aspect, each map data block represents an area having a width dimension of less than 500 meters and a length dimension of less than 500 meters. Other sizes of map data blocks are also contemplated, including each map data block representing an area having a width dimension of less than 250 meters and/or a length dimension of less than 250 meters, each map data block representing an area having a width dimension of about 200 meters and/or a length dimension of about 200 meters, or each map data block representing an area having a width dimension of less than 100 meters and a length dimension of less than 100 meters.
In another aspect, the boundary is sized such that loading map data based on the updated geographic location includes loading three map data blocks in response to determining that the updated geographic location is outside of the boundary area. In another aspect, the boundary is sized such that loading map data based on the updated geographic location includes loading five map data blocks in response to determining that the updated geographic location is outside of the boundary area. In another aspect, obtaining the geographic location of the device includes receiving, by the at least one processor, information from a Global Positioning System (GPS). In another aspect, obtaining the geographic location of the device includes receiving geographic location information from at least one transmitter at a fixed location. In another aspect, obtaining the geographic location of the device includes sensing at least one fixed location indicator using a sensing system on the vehicle and determining the geographic location based on the sensed at least one fixed location indicator.
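Building on the illustrative blocks_to_load helper sketched earlier (200 m blocks), the following small check shows why a boundary crossing typically triggers loading of three or five map data blocks: moving into a laterally adjacent block requires three new blocks, while moving into a diagonally adjacent block requires five.

```python
old = set(blocks_to_load(x=100.0, y=100.0))     # vehicle in block (0, 0)
edge = set(blocks_to_load(x=300.0, y=100.0))    # moved into block (1, 0): edge crossing
corner = set(blocks_to_load(x=300.0, y=300.0))  # moved into block (1, 1): corner crossing
print(len(edge - old))                           # 3 blocks to load
print(len(corner - old))                         # 5 blocks to load
```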
Another innovation includes a system comprising a storage system configured to store map data comprising a plurality of map data blocks, and at least one processor coupled to a memory component and to the storage system, the memory component comprising a set of instructions that, when executed, cause the system to: obtain a geographic location of the device; obtain a boundary corresponding to a contiguous geographic boundary area around the geographic location of the device; and load map data from the storage system into the memory of the device, the map data comprising a plurality of map data blocks, each of the plurality of map data blocks covering a portion of the geographic boundary area and the geographic boundary area corresponding to a portion of the loaded map data, wherein the plurality of map data blocks comprises a center block having a point corresponding to the geographic location of the device and surrounding map data blocks, and wherein the boundary is centered on the center block and sized such that the geographic boundary area intersects the surrounding map data blocks. The system is also configured to obtain an updated geographic location of the device (e.g., while the device is in motion on the vehicle), determine the location of the updated geographic location relative to the boundary area, and, in response to determining that the updated geographic location is outside of the boundary area, obtain an updated boundary centered on the updated geographic location and load map data from the storage system into the memory of the device based on the updated boundary. The system may also include a vehicle.
The aspects disclosed above in connection with the method can also be implemented on the system. For example, the memory component may contain instructions that configure the at least one processor to perform actions related to loading map data corresponding to the method description.
Another innovation includes a non-transitory computer-readable medium storing instructions that, when executed by a computing device, cause the computing device to: obtain a geographic location of the device; obtain a boundary corresponding to a contiguous geographic boundary area around the geographic location of the device; load map data from a storage component into the memory of the device, the map data including a plurality of map data blocks, each of the plurality of map data blocks covering a portion of the geographic boundary area and the geographic boundary area corresponding to a portion of the loaded map data, wherein the plurality of map data blocks includes a center block having a point corresponding to the geographic location of the device and surrounding map data blocks, and wherein the boundary is centered on the center block and sized such that the geographic boundary area intersects the surrounding map data blocks; and, when the vehicle is in motion, obtain an updated geographic location of the device, determine the location of the updated geographic location relative to the boundary area, and, in response to determining that the updated geographic location is outside of the boundary area, obtain an updated boundary centered on the updated geographic location and load map data from the storage component into the memory of the device based on the updated boundary.
Another innovation includes a method, implementable on a device (or system) on a vehicle, for loading data from a storage system capable of storing large amounts of data to a memory component (e.g., working memory or other quickly accessible memory) in communication with at least one processor. The apparatus includes at least one processor, a memory component, and a data storage component coupled to the processor. In one embodiment, the method comprises: obtaining, by the at least one processor, a location of the vehicle, determining, by the at least one processor, data retrieval information based on the vehicle location, the data retrieval information identifying a proximal portion of stored object geometry data within a certain distance of the vehicle, retrieving, by the at least one processor, the proximal portion of the object geometry data from a data storage component, and storing, by the at least one processor, the proximal portion of the object geometry data in the memory component.
Embodiments of the systems described herein may have one or more other aspects (features) of the various system embodiments, some of which are presented herein. However, as one of ordinary skill in the art will recognize, various embodiments of such systems may have additional or fewer aspects, and the aspects disclosed herein may be used together in various embodiments, even if not specifically illustrated or described as being in a certain embodiment. In one aspect, the data storage component is configured to store the object geometry data in a data structure such that a portion of the stored object geometry data representing an area surrounding the vehicle can be retrieved. In another aspect, the vehicle is an autonomous vehicle. In another aspect, the proximal portion of the object geometry data at least partially surrounds the vehicle. In another aspect, the method further comprises obtaining, by the at least one processor, a speed and a direction of the vehicle, and determining the data retrieval information comprises determining based at least in part on the speed and the direction of the vehicle. In another aspect, the method further includes determining a route along one or more roads for the vehicle to travel from the vehicle location to a destination, obtaining road identification information indicative of a road on which the vehicle is located while the vehicle travels along the route, and determining the data retrieval information based on the vehicle location and the road identification information. In another aspect, the road identification information includes information about one or more roads along the route that the vehicle is approaching.
In another aspect of the method, the method further comprises determining a distance traveled by the vehicle along the route, and wherein determining the data retrieval information is based in part on the distance traveled by the vehicle along the route.
Another innovation comprises a system implemented on a vehicle, such as an autonomous vehicle. In one embodiment, the system includes a data storage component configured to store object geometry data in a data structure such that a portion of the stored object geometry data can be retrieved. The data storage component may be, for example, a magnetic hard drive or an optical drive, or may include one or more chips capable of storing large amounts of data (e.g., gigabytes, terabytes, petabytes, exabytes, or more) and allowing retrieval of the stored information. The system also includes at least one processor having a memory component, wherein the at least one processor is configured to obtain a location of the vehicle, determine data retrieval information based on the vehicle location, the data retrieval information identifying a proximal portion of the object geometry data within a certain distance of the vehicle, and retrieve the proximal portion of the object geometry data from the data storage component and store it in the memory component.
Embodiments of the systems described herein may have one or more other aspects (features) of the various system embodiments, some of which are presented herein. However, as one of ordinary skill in the art will recognize, various embodiments of such systems may have additional or fewer aspects, and the aspects disclosed herein may be used together in various embodiments, even if not specifically illustrated or described as being in a certain embodiment. For example, in one aspect, a proximal portion of the object geometry data at least partially surrounds the vehicle location. In another aspect, the proximal portion of the object geometry data is centered on the vehicle position. In another aspect, the proximal portion of the object geometry data extends farther from the front of the vehicle than the rear of the vehicle at the vehicle location. In another aspect, a proximal portion of the object geometry data surrounds the vehicle position.
In various embodiments, a system may further comprise a Global Positioning System (GPS), and the at least one processor is further configured to obtain the location of the vehicle from the GPS. In one aspect, the at least one processor is further configured to obtain a speed and a direction of the vehicle, and determine the data retrieval information based at least in part on the speed and the direction of the vehicle. In another aspect, the at least one processor is further configured to obtain road identification information indicative of a road on which the vehicle is located, and determine the data retrieval information based on the vehicle location and the road identification information. In another aspect, the system may further include a navigation system configured to receive an input identifying a destination, determine a route along one or more roads for the vehicle to travel from the vehicle location to the destination, determine road identification information as the vehicle travels along the route, and communicate the road identification information to the at least one processor. In another aspect, the at least one processor is further configured to obtain a speed of the vehicle and to determine the data retrieval information based in part on the speed of the vehicle. In another aspect, the system further includes an odometry device configured to determine a distance traveled by the vehicle along the route, wherein the data retrieval information is based in part on the distance traveled by the vehicle along the route. In another aspect, the road identification information includes information about roads along the route that the vehicle is approaching.
Drawings
The features and advantages of the apparatus described herein will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. These drawings depict only several embodiments in accordance with the disclosure and are not to be considered limiting of its scope. In the drawings, like reference numerals or symbols generally indicate like parts, unless the context dictates otherwise. In some instances, the drawings may not be to scale.
FIG. 1A illustrates a block diagram of a networked vehicle environment in which one or more vehicles and/or one or more user devices interact with a server via a network, according to one embodiment.
FIG. 1B illustrates a block diagram showing the vehicle of FIG. 1A communicating with one or more other vehicles and/or servers of FIG. 1A, according to one embodiment.
FIG. 2 is a schematic diagram illustrating a vehicle moving along a roadway and an example of components that the vehicle may use to determine its geographic location information.
Fig. 3 is a schematic diagram illustrating an example of map data that can be represented by a plurality of map data blocks.
FIG. 4 is a schematic diagram illustrating an example of map data that may be loaded into memory based on an initial (or first) geographic location of a vehicle, the map data including a plurality of map data blocks, wherein at least a portion of each of the plurality of map data blocks falls within a boundary that defines a bounding area that circumscribes a geographic location of the vehicle.
FIG. 5 is a schematic diagram illustrating an example of map data that may be loaded into memory based on an updated (or second) geographic location of a vehicle, the map data including a plurality of map data blocks, wherein at least a portion of each of the plurality of map blocks falls within a boundary defining a bounding area that circumscribes the updated geographic location of the vehicle.
FIG. 6 is a schematic diagram illustrating another example of map data that may be loaded into memory based on an initial (or first) geographic location of a vehicle, the map data including a plurality of map data blocks, wherein at least a portion of each of the plurality of map data blocks falls within a boundary that defines a bounding area that circumscribes a geographic location of the vehicle.
FIG. 7 is a schematic diagram illustrating an example of map data that may be loaded into memory based on an updated (or second) geographic location of a vehicle, the map data including a plurality of map data blocks, wherein at least a portion of each of the plurality of map blocks falls within a boundary defining a bounding area that circumscribes the updated geographic location of the vehicle.
FIG. 8 is a schematic diagram of an example of a computer system that may be onboard a vehicle and that may be used to perform the map data loading described herein.
Fig. 9 is a schematic diagram illustrating another example of map data that may be loaded into memory based on a geographic location of a vehicle, the map data including a plurality of map data blocks, wherein at least a portion of each of the plurality of map data blocks falls within a boundary bounding a contiguous boundary area around the geographic location of the vehicle, and wherein a motion vector (e.g., a velocity vector indicative of the velocity and direction of the vehicle) has been obtained (e.g., determined by the system based on subsequent determinations of the geographic location of the vehicle).
Fig. 10 is a schematic diagram illustrating another example of map data that may be loaded into memory based on the geographic location of the vehicle and motion vectors that determine the size or shape characteristics (e.g., dimensions) of boundaries used to determine which blocks of map data to load into memory.
Fig. 11 is a flowchart of a method of loading map data.
Detailed Description
The following detailed description is directed to certain aspects and embodiments of the invention. However, the invention can be embodied in many different forms. It should be apparent that the aspects herein may be embodied in a wide variety of forms and that any specific structure, function, or both disclosed herein merely represents one or more embodiments of the invention. The various aspects disclosed herein may be implemented independently of each other, and two or more of these aspects may be combined in various ways. For example, the various aspects/features disclosed herein may be used to implement different embodiments of a method of loading map data from a map data storage system configured to store large amounts of data (e.g., terabytes or more) into the "working" memory of a computer system on a vehicle while the vehicle moves along a route. Further, such methods or systems may be implemented using other processes, steps, structures, or functions in addition to or in place of one or more of the aspects set forth herein.
Map data used by vehicles (e.g., autonomous vehicles, vehicles providing driver assistance features, vehicles for location-based services, etc.) may be referred to as high-resolution (HD) maps, which may contain many types of information, ranging from raw data (e.g., images captured at a location and a particular orientation relative to the location) to representations of features or objects (e.g., information representing roads, signs, man-made objects near the roads, natural objects near the roads, etc.). Such data and features may represent information previously collected by another vehicle traveling along the road or information determined to be near the road (e.g., elevation data). Generally, as used herein, "near road" or "proximal portion" and the like refer to information that may be sensed by one or more sensors of a sensor system disposed on a vehicle, or information that a vehicle may use to locate the vehicle or control the vehicle.
Various embodiments of high-resolution (HD) maps may contain different information that may be provided by various data information systems on one or more storage components. For example, the information in an HD map, or in a portion of an HD map (e.g., a map data block), may include information representing one or more of elevation, intensity, natural (geographic) features, roads, signs, buildings, sidewalks, landscapes, and other man-made objects or objects artificially placed at locations. In some embodiments, an HD map includes elevation information and intensity information. In some embodiments, information representing objects (e.g., man-made objects) is stored in a data storage device separate from the elevation and/or intensity information, for example in a database that can be queried on demand for the objects around a vehicle moving along a route. Such queries may be based on a predetermined distance around the vehicle, i.e., such that all objects within a certain distance are returned by the query. In another example, such queries may be based on a distance around the vehicle that varies based on one or more factors, such as the speed, location (e.g., city or rural roads), or direction of travel of the vehicle.
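As one hedged illustration of such an on-demand query, the sketch below assumes the man-made objects are stored in a SQLite table objects(id, x, y, data) keyed by map coordinates in meters; the table layout and the speed-based radius rule are assumptions made for illustration, not part of this disclosure.

```python
import sqlite3

def query_radius_m(speed_mps: float, base_m: float = 150.0) -> float:
    """Query radius that grows with speed, so a faster vehicle sees objects further out."""
    return base_m + 10.0 * speed_mps

def objects_near(conn: sqlite3.Connection, x: float, y: float, radius_m: float):
    """Return all stored objects within radius_m of the point (x, y)."""
    rows = conn.execute(
        "SELECT id, x, y, data FROM objects "
        "WHERE x BETWEEN ? AND ? AND y BETWEEN ? AND ?",  # cheap bounding-box prefilter
        (x - radius_m, x + radius_m, y - radius_m, y + radius_m),
    ).fetchall()
    # Exact circular-distance filter on the prefiltered rows.
    return [r for r in rows if (r[1] - x) ** 2 + (r[2] - y) ** 2 <= radius_m ** 2]
```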
To be effective for use with vehicles, high-resolution (HD) maps may include information at centimeter-level resolution. In an illustrative embodiment in which raw data is constructed on a 10 cm by 10 cm grid, the resolution may be determined by the upper limit of error that the vehicle can tolerate. For each cell on the grid, three bytes may be used: one byte for intensity information and two bytes for elevation information. Thus, 1 square meter uses 300 bytes of storage space (10 × 10 cells × 3 bytes). For a 10 km by 10 km region, storing the data may require 30 GB of storage space (10,000 × 10,000 × 300 bytes). Less storage space is required if the stored data is compressed; however, decompressing data retrieved from the data storage component adds at least some overhead associated with the decompression process, and thus may increase the cost of retrieving data. Storing HD map data is one problem; transferring HD map data from a storage location to a vehicle is another. Such transfers must be reliable, efficient, and secure to ensure that the required HD map data is available when needed.
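The storage figures above can be checked with simple arithmetic; the short script below only restates the assumptions already given in the text (10 cm cells, one intensity byte plus two elevation bytes per cell, a 10 km by 10 km region).

```python
CELL_M = 0.10                                       # 10 cm x 10 cm grid cells
BYTES_PER_CELL = 1 + 2                              # 1 intensity byte + 2 elevation bytes

cells_per_m = round(1.0 / CELL_M)                   # 10 cells per meter
bytes_per_m2 = cells_per_m ** 2 * BYTES_PER_CELL    # 10 x 10 x 3 = 300 bytes per square meter

region_m = 10_000                                   # 10 km x 10 km region
region_gb = region_m * region_m * bytes_per_m2 / 1e9
print(bytes_per_m2)                                 # 300
print(region_gb)                                    # 30.0 GB, uncompressed
```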
The navigation system may provide a route that the vehicle should take when moving from a first point to a second point, for example, indicating the particular highways and streets that the vehicle is to use. For second-by-second control of the vehicle, it may only be necessary to load the data around the vehicle; as the vehicle moves, new data can be loaded into memory and old data deleted from it. In one method embodiment that enables loading new high-resolution (HD) map data into memory and removing old HD map data from memory, the information grid through which the vehicle is traveling may be divided into blocks of information. For example, each block may have a data resolution of 2000 × 2000 cells, corresponding to 200 m × 200 m in the real world.
One example of selecting the size of a map data block is based on block loading frequency and block size. Blocks should not be loaded too often, because the I/O of reading files from disk is expensive; this means the blocks cannot be too small. In urban areas, vehicles may move at speeds of 10 m/s to 20 m/s, so it takes 10 to 20 seconds for the vehicle to cross one block, which corresponds to a loading frequency of 0.05 to 0.1 Hz, an affordable cost. The block size also should not be too large, as larger blocks take up more memory. While the computers on vehicles may typically be very powerful, computing resources may in practice be quite limited because of all of the computing demands that occur on the vehicle. In an example where the memory footprint for the HD map is expected to be about 100 MB or less, one block occupies 2k × 2k × 3 B = 12 MB, so 9 blocks occupy 108 MB. Loading of an entire block should be completed within 100 ms, and faster if possible. The block size should be compatible with the loading time in different situations: larger block sizes result in longer load times, which may result in blocks not being available when they need to be accessed; conversely, smaller block sizes result in higher loading frequencies, and a similar availability problem can occur when a vehicle quickly crosses one block of information and needs the next. A block may be a compressed file. For a block size of 2000 × 2000, it takes approximately 10 ms to fully load one block, including reading and decompression, so loading a complete group of 9 blocks may take less than 100 ms, which can be done by one thread before the vehicle enters the unloaded region. To address the above issues, 2000 × 2000 is, in one embodiment, a block size that satisfies the resource constraints and optimizes the loading frequency.
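The block-size trade-off above can likewise be verified numerically; the snippet below uses only the figures stated in the text (2000 × 2000 cells per block, 3 bytes per cell, 200 m per block, 10-20 m/s urban speeds, roughly 10 ms to read and decompress one block).

```python
CELLS = 2000                 # cells per block edge
BYTES_PER_CELL = 3
BLOCK_M = 200.0              # meters per block edge

block_mb = CELLS * CELLS * BYTES_PER_CELL / 1e6          # 12 MB per block
print(block_mb, 9 * block_mb)                            # 12.0 MB, 108.0 MB for 9 blocks

for speed_mps in (10.0, 20.0):                           # urban speeds
    crossing_s = BLOCK_M / speed_mps                     # 20 s or 10 s to cross one block
    print(f"{speed_mps} m/s -> load every {crossing_s:.0f} s ({1.0 / crossing_s:.2f} Hz)")

print(9 * 0.010 * 1000)                                  # ~90 ms to load a full 9-block group
```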
Certain embodiments described herein use high-resolution (HD) map data arranged in blocks. One or more blocks are loaded, as required, from the storage component into the memory of the device controlling the vehicle in real time. For example, at any time, 9 blocks may be loaded in memory, representing a bounded area around the vehicle. To provide HD map data seamlessly, blocks that the vehicle may travel into can be preloaded in a background thread, so the block loading is completely hidden from the client. For example, a boundary defining an area (or region) of size 4000 × 4000 cells may be arranged around the center block. Block loading does not change while the vehicle moves within this bounded area. When the vehicle moves out of the bounded area, new blocks are loaded to form a new 9-block arrangement around the new center block.
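A hedged sketch of this tracking behavior, reusing the illustrative boundary_for and blocks_to_load helpers from the earlier sketch, might look like the following; the class and its background-thread loading are an assumption about one possible implementation, not a description of the claimed one.

```python
import threading

class MapTracker:
    """Keep a 9-block arrangement loaded around the vehicle, reloading only on boundary exit."""

    def __init__(self, x: float, y: float):
        self.boundary = boundary_for(x, y)          # boundary around the initial center block
        self.loaded = set(blocks_to_load(x, y))     # indices of the 9 blocks held in memory

    def update(self, x: float, y: float) -> None:
        if self.boundary.contains(x, y):
            return                                  # still inside the bounded area: no loading
        new_blocks = set(blocks_to_load(x, y))
        to_load = new_blocks - self.loaded          # typically 3 or 5 new blocks
        to_evict = self.loaded - new_blocks
        # Preload in a background thread so the client never waits on block I/O.
        threading.Thread(target=self._reload, args=(to_load, to_evict), daemon=True).start()
        self.loaded = new_blocks
        self.boundary = boundary_for(x, y)

    def _reload(self, to_load, to_evict) -> None:
        for idx in sorted(to_load):
            pass  # placeholder: read and decompress block `idx` from the storage component
        for idx in sorted(to_evict):
            pass  # placeholder: release the memory held by evicted block `idx`
```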
A method for loading map data may be implemented on a computing device of a vehicle. The method may include obtaining a geographic location of the vehicle, for example using GPS and an inertial navigation system, fixed location indicators along the roadway that are sensed by one or more sensors of the vehicle, and/or a transmission (e.g., radio or optical) received from a transmitter disposed at a location where the vehicle is able to receive its signal. A boundary corresponding to a geographic boundary area around the vehicle location may be obtained (e.g., computed). The method may then load map data containing a plurality of map data blocks from the storage component into a memory of the device. Each of the plurality of map data blocks covers a portion of the geographic boundary area around the vehicle, and the geographic boundary area corresponds to a portion of the loaded map data. That is, based on the block size, the total loaded map data covers and extends outside of the geographic boundary area. The plurality of map data blocks may include a center block having a point corresponding to the geographic location of the device, and surrounding blocks arranged around the center block (and around the vehicle location). The boundary is centered on the center block and sized such that the geographic boundary area intersects the surrounding map data blocks. For example, the loaded map data may include 9 blocks arranged in a rectangle, with the position of the vehicle corresponding to a point in the center block.
While the vehicle is in motion, an updated geographic location of the device is obtained (e.g., using GPS, roadside location markers, inertial positioning systems, etc.) and the system determines where the updated geographic location lies relative to the boundary area. In response to determining that the updated position of the vehicle is outside of the boundary area, an updated boundary is determined, corresponding to an updated area centered on the updated position of the vehicle. Map data in the form of map data blocks may then be loaded from the storage component into the memory of the device such that the resulting loaded map data includes a center block having a point corresponding to the updated location of the device (vehicle), and map data blocks surrounding the center block and intersecting the boundary area. In other words, when the position of the vehicle is determined to be beyond the boundary area bounded by the most recently determined boundary, additional map data blocks are loaded.
Illustrative embodiments
Embodiments of a system and method for loading map data are described below with reference to the drawings. Those skilled in the art will recognize that the described techniques may be practiced in other ways without departing from their scope; many modifications and variations are possible, and such modifications and variations are intended to fall within the scope of the embodiments. Those of skill in the art will also recognize that components included in one embodiment may be interchanged with those of other embodiments: one or more components from an illustrated embodiment may be included with other illustrated embodiments in any combination. For example, any of the various components described herein and/or shown in the figures may be combined, interchanged, or eliminated in other embodiments.
FIG. 1A shows a block diagram of a networked vehicle environment 100 in which one or more vehicles 120 and/or one or more user devices 102 interact with a server 130 via a network 110, according to one embodiment. For example, the vehicle 120 may be equipped to provide ride sharing and/or other location-based services, to help the driver control vehicle operation (e.g., through various driver assistance features such as adaptive and/or conventional cruise control, adaptive headlamp control, anti-lock braking, auto parking, night vision, blind spot monitoring, collision avoidance, crosswind stabilization, driver fatigue detection, driver monitoring systems, emergency driver assistance, intersection assistance, hill descent control, intelligent speed adaptation, lane centering, lane departure warning, forward, rear, and/or side parking sensors, pedestrian detection, rain sensors, surround-view systems, tire pressure monitors, traffic sign recognition, steering assistance, wrong-way driving warnings, traffic condition alerts, etc.), and/or to fully control vehicle operation. Thus, the vehicle 120 may be a conventional gasoline, natural gas, biofuel, electric, hydrogen, etc. vehicle configured to provide ride sharing and/or other location-based services, a vehicle providing driver assistance functionality (e.g., one or more of the driver assistance features described herein), or an automated or Autonomous Vehicle (AV). Vehicle 120 may be an automobile, truck, van, bus, motorcycle, scooter, bicycle, and/or any other motorized vehicle.
Server 130 may communicate with vehicle 120 to obtain vehicle data, such as route data, sensor data, perception data, vehicle 120 control data, vehicle 120 component failure and/or fault data, and so forth. Server 130 may process and store the vehicle data for use in other operations performed by server 130 and/or another computing system (not shown). Such operations may include running a diagnostic model to identify vehicle 120 running problems (e.g., causes of navigation errors of vehicle 120, abnormal sensor readings, unidentified objects, vehicle 120 component failures, etc.); running the model to simulate vehicle 120 performance given a set of variables; identify objects that vehicle 120 cannot identify, generate control instructions that, when executed by vehicle 120, cause vehicle 120 to travel and/or move in some manner along a specified path; and/or the like.
Server 130 may also transmit data to vehicle 120. For example, server 130 may transmit map data, firmware and/or software updates, vehicle 120 control instructions, identification results of objects that are not recognized by vehicle 120, passenger access information, traffic data, and/or the like.
In addition to communicating with one or more vehicles 120, server 130 may also be capable of communicating with one or more user devices 102. In particular, server 130 may provide a web service to enable a user to request a location-based service (e.g., a shipping service, such as a shared travel service) through an application running on user device 102. For example, the user device 102 may correspond to a computing device, such as a smartphone, tablet, laptop, smartwatch, or any other device that may communicate with the server 130 over the network 110. In this embodiment, the user device 102 executes an application, such as a mobile application, that the user operating the user device 102 may use to interact with the server 130. For example, the user device 102 may communicate with the server 130 to provide location data and/or queries to the server 130, receive map-related data and/or directions from the server 130, and/or the like.
Server 130 may process the request and/or other data received from user device 102 to identify a service provider (e.g., vehicle 120 driver) to provide the requested service to the user. Further, server 130 may receive data, such as user trip access or destination data, user location query data, and the like, based on which server 130 identifies areas, addresses, and/or other locations associated with various users. The server 130 may then use the identified location to provide directions to the service provider and/or user to the determined access location.
Applications running on user device 102 may be created and/or manufactured by the same entity responsible for server 130. Alternatively, the application running on the user device 102 may be a third party application that includes features (e.g., an application programming interface or a software development kit) that enable communication with the server 130.
For simplicity and ease of explanation, one server 130 is illustrated in FIG. 1A. However, it should be appreciated that server 130 may be a single computing device, or may include a plurality of different computing devices logically or physically grouped into a set that collectively operates as a server system. The components of the server 130 may be implemented in dedicated hardware (e.g., a server computing device with one or more ASICs) without software, or as a combination of hardware and software. Additionally, the modules and components of server 130 may be combined on one server computing device or separated or grouped separately on several server computing devices. In some embodiments, server 130 may include more or fewer components than shown in FIG. 1A.
The network 110 includes any wired network, wireless network, or combination thereof. For example, the network 110 may be a personal area network, a local area network, a wide area network, an over-the-air broadcast network (e.g., a network for broadcast or television), a cable network, a satellite network, a cellular telephone network, or a combination thereof. As another example, the network 110 may be a publicly accessible network linking networks, possibly operated by various different parties, such as the internet. In some embodiments, the network 110 may be a private or semi-private network, such as a corporate or university intranet. Network 110 may include one or more wireless networks, such as for a global system for mobile communications (GSM) network, a Code Division Multiple Access (CDMA) network, a Long Term Evolution (LTE) network, or any other type of wireless network. The network 110 may use protocols and components for communicating over the internet or any other of the networks described above. For example, the protocols used by the network 110 may include hypertext transfer protocol (HTTP), hypertext transfer security protocol (HTTPs), Message Queue Telemetry Transport (MQTT), restricted application protocol (CoAP), and the like. Protocols and components for communicating via the internet or any other type of communication network as previously described are well known to those skilled in the art and are therefore not described in detail herein.
The server 130 may include a navigation unit 140, a vehicle data processing unit 145, and a data store 150. The navigation unit 140 may assist in location-based services. For example, the navigation unit 140 may assist a user (also referred to herein as a "driver") in transporting another user (also referred to herein as a "lift") and/or object (e.g., food, packages, etc.) from a first location (also referred to herein as an "pickup location") to a second location (also referred to herein as a "destination location"). The navigation unit 140 may assist in enabling user and/or object transport by providing maps and/or navigation instructions to applications running on the driver's user device 102, to applications running on the lift's user device 102, and/or to a navigation system running on the vehicle 120.
As an example, the navigation unit 140 may include a matching service (not shown) that pairs a lift requesting a trip from an pickup location to a destination location with a driver who is able to complete the trip. The matching service may interact with an application running on the lift's user device 102 and/or an application running on the driver's user device 102 to establish the lift's itinerary and/or to process money paid by the lift to the driver.
The navigation unit 140 may also communicate with an application running on the driver's user device 102 during the trip to obtain trip location information from the user device 102 (e.g., via Global Positioning System (GPS) components coupled to and/or embedded in the user device 102) and provide navigation directions to the application, which assists the driver in driving from the current location to the destination location. The navigation unit 140 may also indicate a number of different geographical locations or points of interest to the driver, whether or not the driver is carrying a lift.
The vehicle data processing unit 145 may be configured to support driver assistance features of the vehicle 120 and/or to support autonomous driving. For example, vehicle data processing unit 145 may generate and/or transmit map data to vehicle 120, run a diagnostic model to identify an operational issue with vehicle 120, run a model to simulate vehicle 120 performance given a set of variables, identify objects using vehicle data provided by vehicle 120 and transmit an identification of the objects to vehicle 120, generate and/or transmit vehicle 120 control instructions to vehicle 120, and/or the like.
The data store 150 may store various types of data used by the navigation unit 140, the vehicle data processing unit 145, the user device 102, and/or the vehicle 120. For example, the data store 150 may store user data 152, map data 154, search data 156, and log data 158.
The user data 152 may include information about some or all users registered for the location-based service, such as drivers and lift riders. The information may include, for example, a username, password, name, address, billing information, data associated with a previous trip taken or serviced by the user, user rating information, user loyalty rating information, and/or the like.
Map data 154 may include high-resolution (HD) maps generated from sensors (e.g., light detection and ranging (LiDAR) sensors, radio detection and ranging (RADAR) sensors, infrared cameras, visible light cameras, stereo cameras, Inertial Measurement Units (IMU), etc.), satellite images, Optical Character Recognition (OCR) performed on captured street images (e.g., recognizing street names, recognizing street sign text, recognizing point of interest names, etc.), and so forth; information for calculating a route; information for rendering a two-dimensional (2D) and/or three-dimensional (3D) graphical map; and/or the like. For example, the map data 154 may include a number of elements: such as street and intersection layouts, bridges (e.g., including information about the height and/or width of the bridge on a street), exit ramps, buildings, parking lot entrances and exits (e.g., including information about the height and/or width of vehicle entrances and/or exits), locations of signboards and stop lights, emergency crossings, points of interest (e.g., parks, restaurants, gas stations, sights, landmarks, etc., and associated names), road markings (e.g., center line markings separating opposing lanes, lane markings, stop lines, left turn guide lines, right turn guide lines, pedestrian crossings, bus lane markings, bicycle lane markings, safety island markings, road text, highway exit and entrance markings, etc.), curbs, railway lines, waterways, turn radii and/or angles of left and right turns, distances and dimensions of road features, road signs, traffic lights, parking ramps, safety islands, the locations of dividers between two-way traffic, and/or the like, along with the associated geographic locations (e.g., geographic coordinates) of these elements. The map data 154 may also include reference data such as real-time and/or historical traffic information, current and/or predicted weather conditions, road work information, information regarding laws and regulations (e.g., speed limits, whether to allow or disallow a right turn at a red light, whether to allow or disallow a turn around, allowed travel directions, and/or the like), news events, and/or the like.
Although the map data 154 is illustrated as being stored in the data store 150 of the server 130, this is not meant to be limiting. For example, server 130 may transmit map data 154 to vehicle 120 for storage therein (e.g., in data store 129, as described below).
The search data 156 may include searches that were entered by a number of different users in the past. For example, the search data 156 may include a text search for access and/or destination locations. The search may be for a particular address, geographic location, name associated with the geographic location (e.g., name of park, restaurant, gas station, attraction, landmark, etc.), and so forth.
The log data 158 may include vehicle data provided by one or more vehicles 120. For example, the vehicle data may include route data, sensor data, perception data, vehicle 120 control data, vehicle 120 component failure and/or fault data, and the like.
FIG. 1B illustrates a block diagram showing the vehicle 120 of FIG. 1A communicating with one or more other vehicles 170A-N of FIG. 1A and/or the server 130, according to one embodiment. As shown in fig. 1B, vehicle 120 may include various components and/or data storage. For example, the vehicle 120 may include a sensor array 121, a communication array 122, a data processing system 123, a communication system 124, an internal interface system 125, a vehicle control system 126, an operating system 127, a map engine 128, and/or a data store 129.
Communications 180 may be sent and/or received between vehicle 120, one or more vehicles 170A-N, and/or server 130. Server 130 may transmit and/or receive data from vehicle 120, as described above in connection with FIG. 1A. For example, server 130 may transmit vehicle control instructions or commands to vehicle 120 (e.g., as communication 180). The vehicle control instructions may be received by a communication array 122 (e.g., an array of one or more antennas configured to transmit and/or receive wireless signals) operated by a communication system 124 (e.g., a transceiver). The communication system 124 may communicate the vehicle control commands to a vehicle control system 126 that may operate the acceleration, steering, braking, lights, signals, and other operating systems 127 of the vehicle 120 to drive and/or steer the vehicle 120 and/or assist the driver in driving and/or steering the vehicle 120 along a path toward a destination location specified by the vehicle control commands.
As an example, the vehicle control instructions may include route data 163 that may be processed by the vehicle control system 126 to maneuver the vehicle 120 and/or assist a driver in maneuvering the vehicle 120 along a given route (e.g., an optimized route calculated by the server 130 and/or the map engine 128) toward a specified destination location. In processing the route data 163, the vehicle control system 126 may generate control commands 164 for execution by the operating system 127 (e.g., to accelerate, steer, brake, steer, reverse, etc.) to cause the vehicle 120 to travel along the route to the destination location and/or to assist the driver in steering the vehicle 120 along the route toward the destination location.
Destination location 166 may be specified by server 130 based on a user request (e.g., an access request, a delivery request, etc.) transmitted from an application running on user device 102. Alternatively or additionally, the lift and/or driver of the vehicle 120 may provide user input 169 via the internal interface system 125 (e.g., a vehicle navigation system) to provide the destination location 166. In some embodiments, vehicle control system 126 may transmit the input destination location 166 and/or the current location of vehicle 120 (e.g., as a GPS data packet) as communication 180 to server 130 via communication system 124 and communication array 122. The server 130 (e.g., navigation unit 140) may perform an optimization operation using the current location of the vehicle 120 and/or the input destination location 166 to determine an optimal route for the vehicle 120 to travel to the destination location 166. Route data 163, including the optimal route, may be communicated from server 130 to vehicle control system 126 via communication array 122 and communication system 124. As a result of receiving the route data 163, the vehicle control system 126 can cause the operating system 127 to maneuver the vehicle 120 along the optimal route directly to the destination location 166, assist the driver in maneuvering the vehicle 120 along the optimal route directly to the destination location 166, and/or cause the internal interface system 125 to display and/or present instructions for maneuvering the vehicle 120 along the optimal route directly to the destination location 166.
Alternatively or additionally, the route data 163 includes an optimal route and the vehicle control system 126 automatically inputs the route data 163 into the map engine 128. The map engine 128 may generate map data 165 using the optimal route (e.g., generate a map to display the optimal route and/or take instructions for the optimal route) and provide the map data 165 to the internal interface system 125 (e.g., via the vehicle control system 126) for display. The map data 165 may include information derived from the map data 154 stored in the data store 150 on the server 130. The displayed map data 165 may indicate an estimated time of arrival and/or display the progress of the journey of the vehicle 120 along the optimal route. The displayed map data 165 may also include indicators such as diversion commands, emergency notifications, road work information, real-time traffic data, current weather conditions, information about laws and regulations (e.g., speed limits, whether or not to allow or prohibit a right turn at a red light, where to allow or prohibit a turn around, allowed directions of travel, etc.), news events, and/or the like.
User input 169 may also be a request to access a network (e.g., network 110). In response to such a request, the internal interface system 125 can generate an access request 168, which can be processed by the communication system 124 to configure the communication array 122 to send and/or receive data corresponding to user interaction with the internal interface system 125 and/or user device 102 interaction with the internal interface system 125 (e.g., user device 102 connected to the internal interface system 125 via a wireless connection). For example, the vehicle 120 may include an onboard Wi-Fi network that passengers and/or drivers may access to send and/or receive email and/or text messages, stream audio and/or video content, browse content pages (e.g., web pages, etc.), and/or access network-based applications. Based on the user interaction, internal interface system 125 can receive content 167 via network 110, communication array 122, and/or communication system 124. Communication system 124 may dynamically manage network access to avoid or minimize disruption of the transmission of content 167.
The sensor array 121 may include any number of one or more types of sensors, such as a satellite radio navigation system (e.g., GPS), light detection and ranging (LiDAR) sensors, radio detection and ranging (RADAR) sensors, Inertial Measurement Units (IMU), cameras (e.g., infrared cameras, visible light cameras, stereo cameras, etc.), Wi-Fi detection systems, cellular communication systems, inter-vehicle communication systems, road sensor communication systems, feature sensors, proximity sensors (e.g., infrared, electromagnetic, photoelectric, etc.), distance sensors, depth sensors, and/or the like. The satellite radio navigation system may calculate the current position of vehicle 120 (e.g., within a range of 1-10 meters) based on analyzing signals received from a constellation of satellites.
Light detection and ranging (LiDAR) sensors, radio detection and ranging sensors, and/or any other similar type of sensor may be used to detect the environment around the vehicle 120 when the vehicle 120 is in motion or is about to begin motion. For example, light detection and ranging (LiDAR) sensors may be used to reflect multiple laser beams from approaching objects to assess their distance and provide accurate three-dimensional (3D) information about the surrounding environment. Data obtained from light detection and ranging (LiDAR) sensors may be used to perform object identification, motion vector determination, collision prediction, and/or implement accident avoidance procedures. Alternatively, a light detection and ranging (LiDAR) sensor may use a rotating scanning mirror assembly to provide a 360 degree viewing angle. Light detection and ranging (LiDAR) sensors may optionally be mounted on the roof of the vehicle 120.
An Inertial Measurement Unit (IMU) may include an X, Y, Z-oriented gyroscope and/or accelerometer. An Inertial Measurement Unit (IMU) provides data regarding rotational and linear motion of vehicle 120, which may be used to calculate motion and position of vehicle 120.
The camera may be used to capture a visual image of the environment surrounding the vehicle 120. Depending on the configuration and number of cameras, the cameras may provide a 360 degree view of the surroundings of vehicle 120. The images from the camera may be used to read road markings (e.g., lane markings), read street signs, detect objects, and/or the like.
A Wi-Fi detection system and/or a cellular communication system may be used to triangulate Wi-Fi hotspots or cell towers, respectively, to determine the location of the vehicle 120 (optionally in conjunction with a satellite radio navigation system).
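By way of a non-limiting illustration of how a position might be estimated from ranges to known transmitters (e.g., Wi-Fi hotspots or cell towers), the following sketch performs a simple least-squares trilateration. The tower coordinates, measured ranges, and function name are hypothetical values chosen only for illustration, not part of any particular positioning system.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares 2D position estimate from ranges to known anchors.

    anchors:   (N, 2) array of known anchor positions (e.g., cell towers).
    distances: (N,) array of measured ranges to each anchor.
    Linearizes the circle equations against the first anchor and solves
    the resulting overdetermined linear system.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x0, y0 = anchors[0]
    A = 2 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - (x0 ** 2 + y0 ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Hypothetical towers (meters, local frame) and measured ranges.
towers = [(0.0, 0.0), (500.0, 0.0), (0.0, 400.0)]
ranges = [283.0, 361.0, 283.0]
print(trilaterate(towers, ranges))  # approximately (200, 200) in this example
```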
The inter-vehicle communication system (which may include a Wi-Fi detection system, a cellular communication system, and/or the communication array 122) may be used to receive data from and/or transmit data to other vehicles 170A-N, such as the current speed and/or position coordinates of the vehicle 120, time and/or position coordinates corresponding to when deceleration is planned and a planned deceleration rate, time and/or position coordinates when stopping operations are planned, time and/or position coordinates when lane changes are planned and a lane change direction, time and/or position coordinates when turning operations are planned, time and/or position coordinates when parking operations are planned, and/or the like.
A road sensor communication system (which may include a Wi-Fi detection system and/or a cellular communication system) may be used to read information from road sensors (e.g., indicating traffic speed and/or traffic congestion) and/or to read information from traffic control devices (e.g., traffic lights).
When a user requests a ride (e.g., through an application running on the user device 102), the user may specify a particular destination location. The originating location may be the current location of the vehicle 120, which may be determined using a satellite radio navigation system (e.g., GPS, Galileo, BeiDou/COMPASS, DORIS, GLONASS, and/or other satellite radio navigation systems) installed in the vehicle, a Wi-Fi positioning system, cell tower triangulation, and/or the like. Alternatively, the originating location may be specified by the user through a user interface (e.g., internal interface system 125) provided by vehicle 120 or through the user device 102 running the application. Alternatively, the originating location may be automatically determined based on location information obtained from the user device 102. In addition to an originating location and a destination location, one or more waypoints may be specified, enabling multiple destination locations.
Raw sensor data 161 from sensor array 121 may be processed by an on-board data processing system 123. The processed data 162 may then be transmitted by the data processing system 123 to the vehicle control system 126 and optionally to the server 130 via the communication system 124 and the communication array 122.
Data store 129 may store map data (e.g., map data 154) and/or a subset of map data 154 (e.g., a portion of map data 154 corresponding to an approximate area in which vehicle 120 is currently located). In some embodiments, the vehicle 120 may record updated map data along the travel route using the sensor array 121 and transmit the updated map data to the server 130 via the communication system 124 and the communication array 122. The server 130 may then transmit the updated map data to one or more of the vehicles 170A-N and/or further process the updated map data.
The data processing system 123 may provide continuously or near continuously processed data 162 to the vehicle control system 126 in response to point-to-point activity in the environment surrounding the vehicle 120. Processed data 162 may include a comparison between raw sensor data 161, which represents the operating environment of vehicle 120 and is continuously collected by sensor array 121, and map data stored in data store 129. In one example, the data processing system 123 is programmed with machine learning or other artificial intelligence capabilities to enable the vehicle 120 to identify and respond to conditions, events, and/or potential hazards. In variations, the data processing system 123 may continuously or near continuously compare the raw sensor data 161 to stored map data in order to perform positioning, i.e., to continuously or near continuously determine the position and/or orientation of the vehicle 120. Positioning enables the vehicle 120 to know its immediate location and/or orientation relative to the stored map data in order to maneuver the vehicle 120, and/or assist a driver in maneuvering the vehicle 120, through a flow of traffic on surface streets, and to identify and respond to potential hazards (e.g., pedestrians) or local conditions, such as weather or traffic conditions.
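As a non-limiting sketch of such positioning, the following compares sensed obstacle points against an occupancy-style representation of stored map data over a small set of candidate poses and keeps the best-scoring pose. The grid representation, the candidate poses, and the sample values are hypothetical simplifications; a production system would use a more sophisticated matching and filtering approach.

```python
import numpy as np

def localize(candidate_poses, sensed_points, occupied_cells, cell_size=1.0):
    """Pick the candidate pose whose transformed sensor points best match the map.

    candidate_poses: iterable of (x, y, heading) hypotheses near the last known pose.
    sensed_points:   (N, 2) obstacle points in the vehicle frame (e.g., from LiDAR).
    occupied_cells:  set of (ix, iy) grid cells marked occupied in the stored map.
    """
    best_pose, best_score = None, -1
    pts = np.asarray(sensed_points, dtype=float)
    for x, y, heading in candidate_poses:
        c, s = np.cos(heading), np.sin(heading)
        # Rotate points by the candidate heading and translate to the candidate position.
        world = pts @ np.array([[c, s], [-s, c]]) + np.array([x, y])
        cells = np.floor(world / cell_size).astype(int)
        score = sum((ix, iy) in occupied_cells for ix, iy in map(tuple, cells))
        if score > best_score:
            best_pose, best_score = (x, y, heading), score
    return best_pose, best_score

# Hypothetical map cells and sensed points; the zero pose matches all three points.
occupied = {(10, 0), (0, 10), (-10, 0)}
points = [(10.5, 0.2), (0.1, 10.4), (-9.8, 0.3)]
candidates = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.1)]
print(localize(candidates, points, occupied))
```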
Further still, positioning may enable vehicle 120 to tune or beam steer communication array 122 to maximize communication link quality and/or minimize interference from other communications of other vehicles 170A-N. For example, communication system 124 may beam steer the radiation pattern of communication array 122 in response to network configuration commands received from server 130. Data store 129 may store current network resource map data that identifies network base stations and/or other network sources that provide network connectivity. The network resource map data may indicate the location of the base stations and/or available network types (e.g., 3G, 4G, LTE, Wi-Fi, etc.) within the area in which the vehicle 120 is located.
Although fig. 1B describes certain operations as being performed by the vehicle 120 or the server 130, this is not meant to be limiting. The operations performed by the vehicle 120 and the server 130 as described herein may be performed by any entity. For example, certain operations typically performed by the server 130 (e.g., transmitting updated map data to the vehicles 170A-N) may be performed by the vehicle 120 for load balancing purposes (e.g., reducing the processing load of the server 130, utilizing idle processing power on the vehicle 120, etc.).
Still further, any of the vehicles 170A-N may include some or all of the components of the vehicle 120 described herein. For example, vehicles 170A-N may include communication array 122 to communicate with vehicle 120 and/or server 130.
Fig. 2 is a schematic diagram illustrating an example of the vehicle 120 moving along the road 241. The road 241 may be a portion of a route that the vehicle 120 is controlled to traverse from a first point to a second point in a manual manner, a semi-autonomous manner (e.g., by assisting the driver), and/or an autonomous manner. In fig. 2, vehicle 120 is moving along road 241 at a speed and direction indicated by motion vector 230. Fig. 2 also illustrates an example of a positioning component that passively or actively provides vehicle 120 with geographic location information that vehicle 120 may use to determine a location (e.g., geographic location) of vehicle 120.
Positioning components along road 241 or in communication with sensors on vehicle 120 may be used to help control vehicle 120 as vehicle 120 moves along road 241. Fig. 2 illustrates several examples of such positioning components. Proximal positioning components 250A, 250B may be disposed along the roadway 241. In various embodiments, such components may be contiguous or closely spaced, and may be passive (sensed by sensors on vehicle 120, such as reflective, Infrared (IR), or optical sensors) or active (e.g., emitting radiation sensed by vehicle 120). One or more distal positioning components 225 may be disposed beside the roadway or some distance away from the roadway. In various embodiments, the distal positioning components 225 may likewise be active or passive. In some embodiments, the GPS transmitter 215 may provide GPS signals that are received by the vehicle 120. In some embodiments, one or more stationary transmitters 220 may be disposed along roadway 241 and provide vehicle 120 with transmissions or communications that assist the vehicle in determining its location.
In various embodiments, the vehicle 120 may include a sensor system as part of the computer system 105 or may include a sensor system on the vehicle that interfaces with the computer system 105. Computer system 105 may include any of the components of vehicle 120 described above in connection with FIG. 1B. In various embodiments, the sensor system may include one or more sensors configured to sense information about the environment in which the vehicle 120 is located. In various embodiments, the one or more sensors may include one or more of a Global Positioning System (GPS) module, an Inertial Measurement Unit (IMU), a radio detection and ranging (RADAR) unit, a laser rangefinder and/or light detection and ranging (LIDAR) unit, an Infrared (IR) camera, and/or an optical camera. The GPS module may be any sensor configured to estimate the geographic location of the vehicle 120. To this end, the GPS module may include a transceiver configured to estimate the position of the vehicle 120 relative to the earth from satellite-based positioning data. In one example, the computer system 105 may be configured to use the GPS module in conjunction with map data to estimate the location of lane boundaries on a road on which the vehicle 120 may travel.
The Inertial Measurement Unit (IMU) may be any combination of sensors configured to sense changes in position and orientation of the vehicle 120 based on inertial acceleration. In some examples, the combination of sensors may include, for example, an accelerometer and a gyroscope. Other combinations of sensors are also possible.
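By way of a non-limiting illustration, rotational-rate and acceleration samples of the kind provided by such an Inertial Measurement Unit (IMU) can be integrated to propagate a planar pose estimate between absolute position fixes. The sample rate, sample values, and function name below are hypothetical, and a full system would fuse GPS or map-based fixes to bound the drift of pure inertial integration.

```python
import math

def propagate_pose(x, y, heading, speed, yaw_rate, accel, dt):
    """Advance a simple planar pose estimate by one IMU sample interval.

    yaw_rate: gyroscope z-axis rate (rad/s); accel: longitudinal acceleration (m/s^2).
    """
    heading += yaw_rate * dt
    speed += accel * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading, speed

# Hypothetical 100 Hz samples: a gentle left turn while accelerating.
pose = (0.0, 0.0, 0.0, 10.0)  # x, y, heading, speed
for _ in range(100):  # one second of data
    pose = propagate_pose(*pose, yaw_rate=0.05, accel=0.5, dt=0.01)
print(pose)
```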
A radio detection and ranging (RADAR) unit may be considered an object detection system that may be configured to use radio waves to determine characteristics of an object, such as the range, height, direction, or speed of the object. A radio detection and ranging (RADAR) unit may be configured to emit pulses of radio waves or microwaves that may bounce off any object in the wave path. The object may return a portion of the energy of the wave to a receiver (e.g., a dish or antenna), which may also be part of a radio detection and ranging (RADAR) unit. A radio detection and ranging (RADAR) unit may also be configured to digitally signal process the received signal (bouncing off the object) and may be configured to identify the object.
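As a non-limiting illustration, the range to a reflecting object follows directly from the round-trip travel time of the emitted pulse; the timing value used below is hypothetical.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def radar_range(round_trip_time_s):
    """One-way distance to the object that reflected the pulse."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

print(radar_range(1.0e-6))  # roughly 150 m for a 1 microsecond round trip
```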
Other systems similar to radio detection and ranging (RADAR) have been applied in other parts of the electromagnetic spectrum. One example is light detection and ranging (LIDAR), which may be configured to use visible light from a laser rather than radio waves.
A light detection and ranging (LIDAR) unit may include a sensor configured to sense or detect objects in the environment in which the vehicle 120 is located using light. In general, light detection and ranging (LIDAR) is an optical remote sensing technique that can measure the distance or other properties to a target by illuminating the target with light. As an example, a light detection and ranging (LIDAR) unit may include a laser source and/or a laser scanner configured to emit laser pulses and a detector configured to receive reflections of the laser pulses. For example, a light detection and ranging (LIDAR) unit may include a laser rangefinder that is reflected by a rotating mirror, and scans the laser in one or two dimensions around the scene being digitized, collecting range measurements at specified angular intervals. In various examples, a light detection and ranging (LIDAR) unit may include components such as a light (e.g., laser) source, scanners and optics, photodetector and receiver electronics, and a positioning and navigation system.
In one example, a light detection and ranging (LIDAR) unit may be configured to image objects using Ultraviolet (UV), visible, or infrared light, and may be used with a wide target range including non-metallic objects. In one example, a narrow laser beam may be used to map physical features of an object at high resolution.
In an example, wavelengths in the range from about 10 microns (infrared) to about 250nm (ultraviolet, UV) may be used. Typically the light is reflected by backscattering. Different types of scattering are used for different light detection and ranging (LiDAR) applications, such as rayleigh scattering, mie scattering, and raman scattering, as well as fluorescence. Based on the different kinds of backscattering, light detection and ranging (LiDAR) may be referred to as rayleigh light detection and ranging (LiDAR), mie light detection and ranging (LiDAR), raman light detection and ranging (LiDAR), and sodium/iron/potassium fluorescence light detection and ranging (LiDAR), respectively, for example. For example, a suitable wavelength combination may enable remote mapping of an object by looking for wavelength-dependent changes in reflected signal intensity.
Three-dimensional (3D) imaging may be achieved using scanning and non-scanning light detection and ranging (LiDAR) systems. "three-dimensional (3D) gated imaging light detection and ranging" is an example of a non-scanning laser ranging system that employs a pulsed laser and a fast gated camera. Imaging light detection and ranging (LiDAR) can also be performed using a high-speed detector array and a modulation-sensitive detector array, which are typically built on a single chip using CMOS (complementary metal oxide semiconductor) and hybrid CMOS/CCD (charge coupled device) fabrication technologies. In these devices, each pixel can be processed locally by demodulation or high speed gating so that the array can be processed to represent the image from the camera. Using this technique, thousands of pixels may be acquired simultaneously to create a three-dimensional (3D) point cloud representing an object or scene detected by a light detection and ranging (LIDAR) unit.
The point cloud may include a set of vertices in a three-dimensional (3D) coordinate system. For example, these vertices may be defined by X, Y and Z coordinates, and may represent the outer surface of an object. A light detection and ranging (LiDAR) unit may be configured to create a point cloud by measuring a large number of points on the surface of an object, and may output the point cloud as a data file. As a result of a three-dimensional (3D) scanning process of an object by a light detection and ranging (LIDAR) unit, a point cloud may be used to identify and visualize the object. In one example, the point cloud may be rendered directly to visualize the object. In another example, the point cloud may be converted to a polygonal or triangular mesh model by a process that may be referred to as surface reconstruction. Example techniques for converting a point cloud to a three-dimensional (3D) surface include Delaunay triangulation, alpha shapes, and ball pivoting. These techniques involve building a network of triangles over the existing vertices of the point cloud. Other example techniques may include converting the point cloud to a volumetric distance field and reconstructing an implicit surface defined by a marching cubes algorithm.
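As a non-limiting sketch, range measurements taken at known scan angles can be converted into the X, Y, Z vertices of a point cloud, over which a triangulated surface can then be built. The scan geometry below is hypothetical, and scipy's Delaunay triangulation over the X-Y footprint is used only as a stand-in for the surface-reconstruction techniques mentioned above.

```python
import numpy as np
from scipy.spatial import Delaunay

def scan_to_point_cloud(azimuths, elevations, ranges):
    """Convert LiDAR ranges at known azimuth/elevation angles into XYZ vertices."""
    az, el, r = (np.asarray(a, dtype=float) for a in (azimuths, elevations, ranges))
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.column_stack((x, y, z))

# Hypothetical scan: 360 azimuth steps at each of two elevation angles.
az = np.tile(np.linspace(0, 2 * np.pi, 360, endpoint=False), 2)
el = np.repeat([-0.05, 0.05], 360)
rng = 20.0 + np.random.default_rng(0).normal(0.0, 0.1, az.size)
cloud = scan_to_point_cloud(az, el, rng)

# Triangulate over the X-Y footprint as a simplified surface reconstruction.
mesh = Delaunay(cloud[:, :2])
print(cloud.shape, mesh.simplices.shape)
```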
The camera may be any camera (e.g., a still camera, a video camera, etc.) configured to capture images of the environment in which the vehicle 120 is located. To this end, the camera may be configured to detect visible light, or may be configured to detect light from other parts of the spectrum, such as infrared or ultraviolet light. Other types of cameras are also possible. The camera may be a two-dimensional detector, or may have a three-dimensional spatial extent. In some examples, the camera may be, for example, a range detector configured to generate a two-dimensional image indicative of distances from the camera to a plurality of points in the environment. To this end, the camera may use one or more ranging techniques. For example, the camera may be configured to use structured light technology, in which the vehicle 120 illuminates an object in the environment with a predetermined light pattern (e.g., a grid or checkerboard pattern) and uses the camera to detect a reflection of the predetermined light pattern from the object. Based on the deformation of the reflected light pattern, the vehicle 120 may be configured to determine the distance to a point on the object. The predetermined light pattern may comprise infrared light or light of another wavelength. The sensor system may additionally or alternatively include components other than those described herein.
Fig. 3 is a schematic diagram illustrating an example of map data that may be represented by a plurality of map data blocks 310. The map data may be represented in a variety of ways. As shown in fig. 3, a location on the earth 305 may be referenced by lines of latitude and longitude. For a particular location of vehicle 120 on the earth, the latitude and longitude information may be represented by a plurality of map data blocks 310 arranged around vehicle 120 in a grid. Given the latitude/longitude frame of reference and the shape of the earth, the geographic distance between incremental lines of latitude is consistent. However, the geographic distance between incremental lines of longitude depends on the location on the earth, the lines being closer together near the two poles. Therefore, the grid patterns of map data representing portions of the earth specified with reference to longitude and latitude (as is customary) may not be perfectly rectangular, and they will not be rectangular near the poles. For purposes of this disclosure, a map data block 310 depicting map data representing a portion of the earth will be assumed to be rectangular or substantially rectangular, due in part to the relatively small size of the map data block 310. The map data described herein need not be referenced in longitude and latitude; other coordinate reference systems may be used instead.
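As a non-limiting illustration of how a geographic location can be mapped to one block in such a grid, the following sketch assumes rectangular blocks of a fixed angular size keyed by integer indices. The block size, the coordinate values, and the function names are hypothetical choices made only for illustration.

```python
import math

BLOCK_SIZE_DEG = 0.01  # hypothetical block size in degrees of latitude/longitude

def block_index(latitude, longitude, block_size=BLOCK_SIZE_DEG):
    """Integer (row, col) index of the map data block containing a location."""
    return (math.floor(latitude / block_size), math.floor(longitude / block_size))

def block_bounds(row, col, block_size=BLOCK_SIZE_DEG):
    """(lat_min, lon_min, lat_max, lon_max) of a block, useful for intersection tests."""
    return (row * block_size, col * block_size,
            (row + 1) * block_size, (col + 1) * block_size)

print(block_index(37.77493, -122.41942))  # hypothetical coordinates
```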
FIG. 4 is a schematic diagram illustrating an example of map data that may be loaded into memory based on an initial (or first) geographic location of a vehicle, the map data including a plurality of map data blocks, wherein at least a portion of each of the plurality of map data blocks falls within a boundary that defines a contiguous boundary area around the geographic location of the vehicle. The vehicle 120 is shown to be located in a location 430 for which there is corresponding available map data information surrounding the vehicle 120 on all sides, as shown by the plurality of map data blocks 310. Referring to the location of vehicle 120 at location 430, computer system 105 or a device including computer system 105 is understood to be located at the same location 430 of vehicle 120, and thus the terms may have the same meaning and may be used interchangeably unless the context indicates otherwise.
At any time, although a plurality of map data blocks 310 are available from the storage component, only some of the map data blocks are loaded into the memory of the vehicle, illustrated in fig. 4 as map data blocks 401-409. As used herein, "memory of a vehicle" refers to memory locations from which a processor may retrieve stored information, such as chip-based memory (e.g., RAM, DRAM, cache memory on a processor, etc.) and non-disk-based memory/storage locations.
A determined boundary 420 is established around the initial position 430 of the vehicle. The boundary 420 represents a certain distance and shape around the vehicle and encloses a geographic boundary area 425 around the vehicle. In the illustrated embodiment, boundary 420 is a square centered on the initial position 430 of the vehicle. In this embodiment, at least the 9 map data blocks 401-409 are loaded into the memory of the vehicle 120. In other embodiments, boundary 420 may have other shapes and extend outward from vehicle 120 by different distances. For example, the shape of boundary 420 may be rectangular, circular, or an asymmetric shape. For example, boundary 420 may extend farther from vehicle 120 in the direction of movement of vehicle 120 than it extends to the sides and/or rear of vehicle 120.
In this embodiment, the boundary 420 intersects or surrounds the 9 map data blocks 401-409. In other embodiments, the boundary 420 may intersect or enclose a greater or lesser number of map data blocks, depending on the size of the map data blocks and the size and shape of the boundary 420.
Fig. 5 is a schematic diagram illustrating an example of map data that may be loaded into memory based on an updated (or second) geographic location of vehicle 120, the map data including a plurality of map data blocks 402, 403, 405, 406, 410, 411, 412, 413, 414, wherein at least a portion of each of the plurality of map data blocks 402, 403, 405, 406, 410, 411, 412, 413, 414 falls within an updated boundary 422 that defines a boundary area surrounding an updated geographic location 435 of vehicle 120. FIG. 5 illustrates the initial boundary 420, which intersects or surrounds map data blocks 401-409 and is centered on the initial geographic location 430 of the vehicle, and the updated boundary 422, which intersects or surrounds map data blocks 402, 403, 405, 406, 410, 411, 412, 413, 414 and is centered on the updated geographic location 435 of the vehicle 120. The updated boundary 422 surrounds and defines an updated geographic boundary area 440 around the vehicle 120 at the updated geographic location 435.
In practice, as the vehicle moves along the route, the system 105 determines an updated geographic location using one or more components, such as those described with reference to fig. 1B and/or fig. 2. The system 105 determines whether the updated geographic location of the vehicle 120 corresponds to a location outside of the boundary region bounded by the initial boundary 420 (i.e., outside of the initial boundary 420). If so, the system 105 loads additional map data blocks into memory and determines an updated boundary 422 that bounds an updated geographic boundary region 440. In this example, system 105 loads map data blocks as needed so that map data blocks 402, 403, 405, 406, 410, 411, 412, 413, 414 are loaded into memory and can be used to control vehicle 120. In this example, map data blocks 402, 403, 405, and 406 are already in memory based on the initial geographic location 430. Thus, while retaining the map data blocks 402, 403, 405, and 406 in memory, the system 105 loads the map data blocks 410, 411, 412, 413, and 414 into memory. In some embodiments, map data blocks loaded into memory that do not intersect and are not encompassed by the updated boundary 422 are removed from memory (e.g., the memory space allocated to map data blocks 404, 407, 408, and 409 is marked as memory space that can be overwritten). The method may continue to be performed for subsequent updated geographic locations of the device along its entire route, with additional map data blocks loaded each time the vehicle location is determined to be outside the then-current geographic boundary region (e.g., outside the region bounded by the updated boundary 422).
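As a non-limiting sketch of the loading and eviction just described, the following computes which block indices intersect an axis-aligned boundary around the vehicle, loads the blocks not already in memory, and drops the rest so their memory may be overwritten. The block size, boundary dimensions, loader callable, and function names are hypothetical and used only for illustration.

```python
import math

def blocks_in_boundary(center_x, center_y, half_width, half_height, block_size):
    """Indices of all map data blocks intersecting a rectangular boundary."""
    min_col = math.floor((center_x - half_width) / block_size)
    max_col = math.floor((center_x + half_width) / block_size)
    min_row = math.floor((center_y - half_height) / block_size)
    max_row = math.floor((center_y + half_height) / block_size)
    return {(r, c) for r in range(min_row, max_row + 1)
                   for c in range(min_col, max_col + 1)}

def refresh_loaded_blocks(loaded, updated_x, updated_y, half_width, half_height,
                          block_size, load_block):
    """Keep blocks intersecting the updated boundary; load the missing ones.

    loaded:     dict mapping (row, col) -> block data already in memory.
    load_block: callable fetching one block from the storage component (hypothetical).
    Blocks outside the updated boundary are simply dropped from the returned dict,
    i.e., their memory may be reclaimed or overwritten.
    """
    needed = blocks_in_boundary(updated_x, updated_y, half_width, half_height, block_size)
    kept = {idx: data for idx, data in loaded.items() if idx in needed}
    for idx in needed - kept.keys():
        kept[idx] = load_block(idx)
    return kept

# Hypothetical usage: 100 m blocks, 150 m boundary half-width, stub loader.
loaded = refresh_loaded_blocks({}, 0.0, 0.0, 150.0, 150.0, 100.0,
                               load_block=lambda idx: f"block {idx}")
print(len(loaded))  # 16 blocks around the origin in this configuration
```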
FIG. 6 is a schematic diagram illustrating another example of map data that may be loaded into memory based on an initial (or first) geographic location of a vehicle, the map data including a plurality of map data blocks, wherein at least a portion of each of the plurality of map data blocks falls within a boundary that defines a contiguous boundary area around the geographic location of the vehicle. In the example shown in fig. 4 and 5, vehicle 120 crosses boundary 420 at map data block 403, located at a corner of the 9 map data blocks 401-409. Thus, as a result of the updated boundary 422 being established, 5 map data blocks 410, 411, 412, 413, 414 need to be loaded into memory such that, after these map data blocks are loaded, an arrangement of 9 map data blocks encompasses the updated geographic location 435.
Fig. 6 shows an initial starting position of vehicle 120 and a configuration of map data blocks similar to that shown in fig. 4, except that vehicle 120 is oriented toward map data block 406, such that when the position of vehicle 120 passes through boundary 420, the vehicle will be located at a position corresponding to map data block 406 (as shown in fig. 7). An example of map data that may be loaded into memory based on an updated (or second) geographic location of vehicle 120 is illustrated in fig. 7. When the geographic location of the vehicle 120 crosses the boundary 420 at a location corresponding to a map data block on a side of the 9-map-data-block arrangement, only 3 new map data blocks 413, 414, 415 need to be loaded into memory. In other words, because the updated boundary 422 is laterally displaced from the initial boundary 420, the updated boundary 422 only intersects map data blocks 413, 414, 415 that are not already in memory, and therefore only these three map data blocks 413, 414, 415 are loaded into memory.
The techniques described herein may be implemented by one or more special-purpose computing devices. A special-purpose computing device may be hardwired to perform the techniques, may include digital electronic devices such as one or more Application Specific Integrated Circuits (ASICs) or Field Programmable Gate Arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more hardware processors programmed to perform the techniques according to program instructions in firmware, memory, other storage, or a combination thereof. Such special-purpose computing devices may also combine custom hardwired logic, ASICs, or FPGAs with custom programming to implement these techniques. A special-purpose computing device may be a desktop computer system, a server computer system, a portable computer system, a handheld device, a networked device, or any other device or combination of devices that incorporates hardwired and/or program logic to implement the techniques. One or more computing devices are typically controlled and coordinated by operating system software. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file systems, networking, and I/O services, and provide user interface functions such as graphical user interfaces ("GUIs"), among others.
FIG. 8 is a block diagram that illustrates a computer system 800, such as computer system 105 shown in FIG. 1, on which any of the embodiments described herein may be implemented. The system 800 may correspond exactly to the system 105 described above, or may have one or more different components. Computer system 800 includes a bus 802 or other communication mechanism for communicating information, and one or more hardware processors 804 coupled with bus 802 for processing information. For example, hardware processor 804 may be one or more general purpose microprocessors. Processor 804 may correspond to the processor described above with reference to computer system 105.
Computer system 800 also includes a main memory 806, such as a Random Access Memory (RAM), cache memory, and/or other dynamic storage device, coupled to bus 802 for storing information and instructions to be executed by processor 804. Main memory 806 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 804. These instructions, when stored in a storage medium accessible to processor 804, render computer system 800 a special-purpose machine that is customized to perform the operations specified in the instructions. In some embodiments, the instructions may cause the computer system 800 to obtain a geographic location of the device, obtain a boundary corresponding to a contiguous geographic boundary region around the geographic location of the device, and load map data from the storage 810 into the memory 806 of the device, the map data including a plurality of map data blocks, each of the plurality of map data blocks covering a portion of the geographic boundary region. The plurality of map data blocks loaded into the memory 806 includes a center block having a point corresponding to the geographic location of the device and map data blocks surrounding the center block. The boundary is centered on the center block and sized such that the geographic boundary region intersects the surrounding map data blocks. The instructions may also cause the computer system to obtain an updated geographic location of the device while the vehicle is in motion, determine the location of the updated geographic location relative to the boundary region, and, in response to determining that the updated geographic location is outside the boundary region, obtain an updated boundary centered on the updated geographic location and load map data from the storage 810 to the memory 806 based on the updated boundary, such that the resulting loaded map data includes a center block having a point corresponding to the updated geographic location of the device and map data blocks surrounding the center block that intersect the geographic boundary region.
Computer system 800 further includes a Read Only Memory (ROM) 808 or other static storage device coupled to bus 802 for storing static information and instructions for processor 804. A storage device 810, such as a magnetic disk, optical disk, or USB thumb drive (flash drive), is provided and coupled to bus 802 for storing information and instructions. The main memory 806, the ROM 808, and/or the storage 810 may correspond to the memory 106 for storing map data described above. In some embodiments, the main memory 806 is a memory for storing map data blocks when they are used to control the vehicle 120. For example, one or more map data blocks may be initially stored on storage device 810, and then loaded into memory 806 and used to control vehicle 120 as needed based on the methods and systems described herein.
Computer system 800 may implement the techniques described herein using custom hardwired logic, one or more ASICs or FPGAs, firmware, and/or program logic that, in combination with the computer system, render computer system 800 into a special-purpose machine or program the system into a special-purpose machine. According to one embodiment, the techniques of this invention are performed by computer system 800 in response to processor 804 executing one or more sequences of one or more instructions contained in main memory 806. Such instructions may be read into main memory 806 from another storage medium, such as storage device 810. Execution of the sequences of instructions contained in main memory 806 causes one or more processors 804 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
Main memory 806, ROM 808, and/or storage 810 may include non-transitory storage media. The term "non-transitory medium" and similar terms, as used herein, refers to a medium that stores data and/or instructions that cause a machine to operate in a specific manner, wherein the medium does not contain transitory signals. Such non-transitory media may include non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 810. Volatile media includes dynamic memory, such as main memory 806. For example, common forms of non-transitory media include: floppy disk, hard disk, solid state disk, magnetic tape, or any other magnetic data storage medium, CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, RAM, PROM, and EPROM, FLASH-EPROM, NVRAM, any other memory chip or cartridge, and network versions thereof.
Computer system 800 also includes a communication interface 818 coupled to bus 802. Communication interface 818 provides a two-way data communication coupling to one or more network links that are connected to one or more local networks. For example, communication interface 818 may be an Integrated Services Digital Network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 818 may be a Local Area Network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 818 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
Computer system 800 can send messages and receive data, including program code, through the network 825, the network link 819, and the communication interface 818. The network 825 may be connected to one or more servers 830. In the Internet example, a server might transmit a requested code for an application program through the Internet, an ISP, local network and communication interface 818. The received code may be executed by processor 804 as it is received, and/or may be stored in storage device 810, or other non-volatile storage for later execution.
FIG. 9 is a schematic diagram illustrating another example of map data that may be loaded into memory based on the geographic location of the vehicle. In fig. 9, the map data includes a plurality of map data blocks 901-909, wherein at least a portion of each of the plurality of map data blocks 901-909 falls within an initial boundary 923 that defines a boundary area 925 around the vehicle 120. In fig. 9, a motion vector 930 indicative of the speed and direction of the vehicle has been obtained (e.g., determined by the system based on subsequent determinations of the geographic location of the vehicle). The example shown in fig. 9 is similar to the example shown in fig. 4, except that information relating to the speed and direction of the vehicle has been obtained and can therefore be used to determine an updated boundary 922 (fig. 10).
Fig. 10 is a schematic diagram illustrating another example of map data that may be loaded into memory based on the vehicle's geographic location and a motion vector used to determine the size or shape features (e.g., dimensions) of the boundary that determines which blocks of map data to load into memory. In this embodiment, the speed of the vehicle 120 has been determined. Based on the speed of the vehicle, the system 105 may determine that a larger boundary 922 is needed to ensure that the map data blocks loaded in memory cover the area through which the vehicle 120 will move in the near future. In this case, the higher the speed indicated by the motion vector 930, the larger the boundary 922, which is determined to extend a greater distance in the direction in which the vehicle moves than toward the rear or sides of the vehicle. In this embodiment, rather than 9 map data blocks, 20 map data blocks 901-920 are loaded into memory based on the boundary 922. The boundary 922 defines a larger boundary area 940 than the boundary area 425 shown in fig. 4. The map data blocks 901-920 each cover a portion of the boundary area 940 and are surrounded by or intersect the boundary 922.
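As a non-limiting sketch of such speed-dependent sizing, the boundary below grows with speed and is extended farther ahead of the vehicle than behind it, so that more map data blocks ahead of a fast-moving vehicle fall within the boundary. The base half-width, look-ahead time, rear margin, and function name are hypothetical values chosen only for illustration.

```python
import math

def dynamic_boundary(x, y, heading, speed,
                     base_half=150.0, lookahead_s=10.0, rear_margin=50.0):
    """Axis-aligned boundary enlarged with speed and biased toward the heading.

    Returns (min_x, min_y, max_x, max_y). The boundary extends roughly
    speed * lookahead_s ahead of the vehicle and only rear_margin behind it.
    """
    ahead = base_half + speed * lookahead_s
    dx, dy = math.cos(heading), math.sin(heading)
    forward_x, forward_y = x + dx * ahead, y + dy * ahead
    rear_x, rear_y = x - dx * rear_margin, y - dy * rear_margin
    min_x = min(forward_x, rear_x) - base_half
    max_x = max(forward_x, rear_x) + base_half
    min_y = min(forward_y, rear_y) - base_half
    max_y = max(forward_y, rear_y) + base_half
    return (min_x, min_y, max_x, max_y)

# Hypothetical: vehicle heading east at 20 m/s.
print(dynamic_boundary(0.0, 0.0, 0.0, 20.0))  # boundary biased toward +x
```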
Fig. 11 is a flow chart 1100 of a method of loading map data. The method may be implemented on a device of a moving vehicle having at least one processor and a memory component coupled to the processor. At block 1105, the method includes obtaining, by the at least one processor, a geographic location of the device/vehicle. In various embodiments, the geographic location of the device may be obtained using one or more sensors or other suitable components described with reference to fig. 2. At block 1110, the method obtains a boundary corresponding to a geographic boundary region around the geographic location of the device. The geographic boundary region may be contiguous. In various embodiments, the boundary may have a predetermined size and shape, or it may be determined dynamically, for example, based on the direction and speed of the vehicle 120.
At block 1115, the method loads map data containing a plurality of map data blocks from a storage component into a memory of the device. For example, the storage 810 may store hundreds or thousands of map data blocks. When needed, a plurality of the stored map data blocks may be loaded into the memory 806, the map data blocks being loaded based on the boundary and on which map data blocks are already in the memory 806. The plurality of map data blocks that are ultimately in memory include a center block having a point corresponding to the geographic location of the device and surrounding map data blocks, and the boundary is centered on the center block and sized such that the geographic boundary region intersects the surrounding map data blocks.
At block 1120 of the method, the at least one processor obtains an updated geographic location of the vehicle while the vehicle is in motion. Similar to block 1105, the updated geographic location of the vehicle may be obtained using one or more sensors described with reference to fig. 2. At block 1125, the at least one processor may determine the location of the updated geographic location relative to the boundary (or geographic boundary region). At block 1130, in response to determining that the updated geographic location of the vehicle is outside of the boundary region, the method obtains an updated boundary centered on the updated geographic location and loads map data from the storage component into the memory of the device based on the updated boundary. The updated boundary corresponds to an updated geographic area centered on the updated geographic location. Map data is loaded from the storage component into the memory of the device such that the resulting loaded map data includes a center block having a point corresponding to the updated location of the device, and map data blocks surrounding the center block and intersecting the geographic boundary region. As the vehicle continues to move, blocks 1120, 1125, and 1130 may be repeated to provide map data in a geographic area surrounding the moving vehicle.
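As a non-limiting sketch tying the blocks of flow chart 1100 together, the loop below obtains a location, checks it against the current boundary region, and reloads map data when the boundary is crossed. The location source, loader callable, boundary half-width, block size, and helper names are hypothetical, and the simple block-index arithmetic mirrors the earlier sketches rather than any specific implementation.

```python
import math

def blocks_for(x, y, half, block_size):
    """Indices of all map data blocks intersecting the square boundary around (x, y)."""
    lo_c, hi_c = math.floor((x - half) / block_size), math.floor((x + half) / block_size)
    lo_r, hi_r = math.floor((y - half) / block_size), math.floor((y + half) / block_size)
    return {(r, c) for r in range(lo_r, hi_r + 1) for c in range(lo_c, hi_c + 1)}

def map_loading_loop(get_location, load_block, keep_running,
                     half=150.0, block_size=100.0):
    """One realization of flow chart 1100: obtain location, check boundary, reload blocks."""
    x, y = get_location()                                        # block 1105
    boundary = (x - half, y - half, x + half, y + half)          # block 1110
    loaded = {idx: load_block(idx)
              for idx in blocks_for(x, y, half, block_size)}     # block 1115
    while keep_running():
        x, y = get_location()                                    # block 1120
        min_x, min_y, max_x, max_y = boundary                    # block 1125
        if not (min_x <= x <= max_x and min_y <= y <= max_y):
            boundary = (x - half, y - half, x + half, y + half)  # block 1130
            needed = blocks_for(x, y, half, block_size)
            loaded = {idx: (loaded[idx] if idx in loaded else load_block(idx))
                      for idx in needed}
    return loaded

# Hypothetical drive: three location fixes heading east, then the loop stops.
fixes = iter([(0.0, 0.0), (120.0, 0.0), (260.0, 0.0)])
remaining = [3]
def next_fix():
    remaining[0] -= 1
    return next(fixes)
blocks = map_loading_loop(next_fix, lambda idx: f"block {idx}",
                          keep_running=lambda: remaining[0] > 0)
print(len(blocks))
```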
The foregoing description details certain embodiments of the systems, devices, and methods disclosed herein. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the systems, devices, and methods may be practiced in many ways. Also as stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to the specific characteristics of the features or aspects of the technology with which that terminology is associated.
Each of the processes, methods, and algorithms described in the preceding sections may be implemented by, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may be implemented in part or in whole in application specific circuitry.
The various features and processes described above may be used independently of one another or may be combined in a variety of ways. All possible combinations and sub-combinations are deemed to fall within the scope of the present disclosure. Additionally, in some embodiments, certain method or process blocks may be omitted. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states associated therewith may be performed in other sequences as appropriate. For example, described blocks or states may be performed in an order different than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The exemplary blocks or states may be performed serially, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed exemplary embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added, removed, or rearranged as compared to the disclosed exemplary embodiments.
Various operations of the example methods described herein may be performed at least in part by algorithms. The algorithm may be embodied in program code or instructions stored in a memory (e.g., the non-transitory computer-readable storage medium described above). Such algorithms may include machine learning algorithms or models. In some embodiments, the machine learning algorithm or model may not explicitly program the computer to perform a function, but may learn from training data to produce a predictive model (trained machine learning model) that performs the function.
Various operations of the example methods described herein may be performed, at least in part, by one or more processors that are temporarily configured (e.g., via software) or permanently configured to perform the relevant operations. Whether temporarily configured or permanently configured, such a processor may constitute a processor-implemented engine operable to perform one or more operations or functions described herein.
Similarly, the methods described herein may be implemented at least in part by a processor, where one or more particular processors are examples of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented engines. In addition, one or more processors may also run in a "cloud computing" environment or as a "software as a service" (SaaS) to support the performance of related operations. For example, at least some of the operations may be performed by a group of computers (e.g., machines including multiple processors) that are accessible via a network (e.g., the Internet) and one or more appropriate interfaces (e.g., Application Program Interfaces (APIs)).
The execution of certain operations may be distributed among multiple processors, and may reside not only within a single machine, but may be deployed across multiple machines. In some example embodiments, the processor or processor-implemented engine may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processor or processor-implemented engine may be distributed across multiple geographic locations.
Throughout the specification, various examples may implement various components, operations, or structures described as single examples. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in the example configurations may be implemented as a combined structure or component. Similarly, structures and functions presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject invention.
Although the subject matter has been described with reference to specific exemplary embodiments, various modifications and changes may be made to the disclosed embodiments without departing from the broader scope of the embodiments. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or concept if more than one is in fact disclosed.
The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The detailed description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
Any process descriptions, elements, or blocks in the flowcharts described herein and/or depicted in the figures should be understood as potentially representing modules, segments, or portions of code that include one or more executable instructions for implementing the specified logical functions or steps. Alternative implementations are included within the scope of the embodiments described herein, in which elements or functions may be deleted or executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art.
As used herein, the term "or" may be construed in either an inclusive or exclusive sense. Further, multiple instances may be provided for resources, operations, or structures described herein as a single instance. In addition, boundaries between various resources, operations, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of various embodiments of the disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements may fall within the scope of the disclosed embodiments of the invention as represented by the claims that follow. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Conditional language, such as "can," "could," "might," or "may," unless specifically stated otherwise or otherwise understood in the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are required in any way for one or more embodiments, or that one or more embodiments necessarily include logic for determining, with or without user input or prompting, whether such features, elements, and/or steps are included or are to be performed in any particular embodiment.
Furthermore, certain terminology has been used to describe the disclosed embodiments of the invention. For example, the terms "one embodiment," "an embodiment," and "some embodiments" mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be appreciated that two or more references to "one embodiment" or "an alternative embodiment" in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as desired in one or more embodiments of the disclosure. Similarly, it should be appreciated that in the foregoing description of embodiments of the present disclosure, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, the claimed subject matter may lie in less than all features of a single foregoing disclosed embodiment.
Moreover, those skilled in the art will recognize that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of software and hardware implementations that may be referred to herein collectively as a "module," "unit," "component," "device," or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, and the like; conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer via any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider), or the program code may be provided in a cloud computing environment or offered as a service, such as software as a service (SaaS).
Headings are included herein for reference and to aid in locating the various parts. These headings are not intended to limit the scope of the concepts described in connection therewith. Such concepts have applicability throughout the present specification.
Unless specifically stated otherwise, disjunctive language such as the phrase "at least one of X, Y, or Z" is understood, in the context in which it is conventionally used, to convey that an item, term, etc., may be X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is generally not intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present. Those skilled in the art will recognize that the described techniques may be varied without departing from the scope of the described techniques. Many modifications and variations are possible. Such modifications and alterations are intended to fall within the scope of the embodiments. Those skilled in the art will also recognize that components included in one embodiment may be interchanged with other embodiments; one or more components from the illustrated embodiments may be included with other illustrated embodiments in any combination. For example, any of the various components described herein and/or shown in the figures may be combined, interchanged, or eliminated in other embodiments.
The phrase "based on" does not mean "based only on," unless expressly specified otherwise. In other words, the phrase "based on" encompasses both "based only on" and "based at least on." Articles such as "a" or "an" should generally be construed to include one or more of the described items unless explicitly stated otherwise. Thus, phrases such as "a device configured to ..." are intended to include one or more of the recited devices.
The above description discloses several methods and materials of the present invention. The invention is intended to cover modifications in the method and materials, and alterations in the manufacturing method and apparatus. Such modifications will become apparent to those skilled in the art from the disclosure or practice of the invention disclosed herein. It is therefore intended that the invention be not limited to the particular embodiments disclosed herein, but that it cover all modifications and alternatives falling within the true scope and spirit of the invention as encompassed by the appended claims. Applicants reserve the right to submit claims to combinations and subcombinations of the disclosed inventions that are regarded as novel and nonobvious. Inventions embodied in other combinations and subcombinations of features, functions, elements, and/or properties may be claimed through amendment of these claims or presentation of new claims in this or a related application. Such amended or new claims, whether they are directed to the same invention or directed to a different invention, and whether they are different, broader, narrower or equal in scope to the original claims, are also regarded as included within the subject matter of the inventions of the present disclosure.

Claims (32)

1. A method implemented on a device on a vehicle, the device having at least one processor and a storage component coupled to the processor, the method comprising:
obtaining, by the at least one processor, a geographic location of the device;
obtaining a boundary corresponding to a contiguous geographic boundary region around the geographic location of the device;
loading map data comprising a plurality of map data blocks from the storage component into a memory of the device, each of the plurality of map data blocks comprising a portion of the geographic boundary region, and the geographic boundary region corresponding to a portion of the loaded map data, wherein the plurality of map data blocks comprises a center block having a point corresponding to the geographic location of the device and surrounding map data blocks, and wherein the boundary is centered on the center block and sized such that the geographic boundary region intersects the surrounding map data blocks; and
while the vehicle is in motion:
obtaining, by the at least one processor, an updated geographic location of the device,
determining a location of the updated geographic location relative to the boundary region, and
in response to determining that the updated geographic location is outside of the boundary region, obtaining an updated boundary corresponding to an updated geographic region centered around the updated geographic location, and loading map data from the storage component into the memory of the device such that the resulting loaded map data comprises:
a center block having a point corresponding to the updated geographic location of the device, and
map data blocks surrounding the center block that intersect the updated geographic region.
2. The method of claim 1, wherein the surrounding map data block is adjacent to the center block.
3. The method of claim 1, wherein the map data blocks include elevation information.
4. The method of claim 1, wherein the map data blocks include intensity information.
5. The method of claim 1, wherein the geographic boundary region corresponds to a region including the center block and at least a portion of the map data blocks adjacent to the center block.
6. The method of claim 1, wherein the boundary is rectangular.
7. The method of claim 6,
wherein each map data block includes a width dimension and a length dimension, and the boundary includes a width dimension and a length dimension, and
wherein the boundary width dimension is between one and three times a width dimension of each map data block and the boundary length dimension is between one and three times a length dimension of each map data block.
8. The method of claim 1, wherein the loaded map data comprises 9 map data blocks.
9. The method of claim 8, wherein each of the 9 map data blocks is of equal size.
10. The method of claim 8, wherein the 9 map data blocks include a center map data block and eight surrounding map data blocks.
11. The method of claim 1, wherein the map data blocks include a center map data block and more than eight surrounding map data blocks.
12. The method of claim 1, wherein the vehicle is an autonomous vehicle.
13. The method of claim 1, wherein the boundary is non-rectangular.
14. The method of claim 1, wherein the sizes of the boundary and the updated boundary are predetermined.
15. The method of claim 1, wherein obtaining the updated boundary comprises dynamically determining the updated boundary.
16. The method of claim 15, wherein dynamically determining the updated boundary comprises obtaining a speed of the moving vehicle and determining a size of the boundary based on the speed.
17. The method of claim 15, wherein dynamically determining the updated boundary comprises obtaining a speed of the moving vehicle and determining a shape of the boundary based on the speed.
18. The method of claim 1, further comprising determining a direction of motion representative of the direction of travel of the vehicle, wherein the boundary extends further in the direction of motion from the updated geographic location of the device than the boundary extends in other directions.
19. The method of claim 1, wherein the storage component comprises an optical or magnetic hard drive.
20. The method of claim 1, wherein each map data block represents an area having a width dimension of less than 1000 meters and a length dimension of less than 1000 meters.
21. The method of claim 1, wherein each map data block represents an area having a width dimension of less than 500 meters and a length dimension of less than 500 meters.
22. The method of claim 1, wherein each map data block represents an area having a width dimension of less than 250 meters and a length dimension of less than 250 meters.
23. The method of claim 1, wherein each map data block represents an area having a width dimension of about 200 meters and a length dimension of about 200 meters.
24. The method of claim 1, wherein each map data block represents an area having a width dimension of less than 100 meters and a length dimension of less than 100 meters.
25. The method of claim 1, wherein the boundary is sized such that, in response to determining that the updated geographic location is outside of the boundary region, loading map data based on the updated geographic location comprises loading three map data blocks.
26. The method of claim 1, wherein the boundary is sized such that, in response to determining that the updated geographic location is outside of the boundary region, loading map data based on the updated geographic location comprises loading five map data blocks.
27. The method of claim 1, wherein obtaining the geographic location of the device comprises receiving, by the at least one processor, information from a Global Positioning System (GPS).
28. The method of claim 1, wherein obtaining the geographic location of the device comprises receiving geographic location information from at least one transmitter at a fixed location.
29. The method of claim 1, wherein obtaining the geographic location of the device comprises sensing at least one fixed location indicator using a sensing system on the vehicle, and determining the geographic location based on the sensed at least one fixed location indicator.
30. A system, the system comprising:
a storage system configured to store map data, the map data including a plurality of map data blocks;
at least one processor coupled to a memory component comprising a set of instructions and to the storage system, the at least one processor configured to, when executing the set of instructions, cause the system to:
obtaining a geographic location of the device;
obtaining a boundary corresponding to a contiguous geographic boundary region around the geographic location of the device;
loading map data into the memory of the device from the storage system, the map data comprising a plurality of map data blocks, each of the plurality of map data blocks comprising a portion of the geographic boundary region and the geographic boundary region corresponding to a portion of the loaded map data, wherein the plurality of map data blocks comprises a center block having a point corresponding to the geographic location of the device and surrounding map data blocks, and wherein the boundary is centered on the center block and sized such that the geographic boundary region intersects the surrounding map data blocks; and
while the vehicle is in motion:
obtaining an updated geographic location of the device,
determining a location of the updated geographic location relative to the boundary region, and
in response to determining that the updated geographic location is outside of the boundary region, obtaining an updated boundary centered on the updated geographic location, and loading map data from the storage system into the memory of the device based on the updated boundary.
31. The system of claim 30, further comprising a vehicle.
32. A non-transitory computer-readable medium storing instructions that, when executed by a computing device, cause the computing device to:
obtaining a geographic location of the device;
obtaining a boundary corresponding to a contiguous geographic boundary region around the geographic location of the device;
loading map data into the memory of the device from the storage component, the map data comprising a plurality of map data blocks, each of the plurality of map data blocks comprising a portion of the geographic boundary region and the geographic boundary region corresponding to a portion of the loaded map data, wherein the plurality of map data blocks comprises a center block having a point corresponding to the geographic location of the device and surrounding map data blocks, and wherein the boundary is centered on the center block and sized such that the geographic boundary region intersects the surrounding map data blocks; and
while the vehicle is in motion:
obtaining an updated geographic location of the device,
determining a location of the updated geographic location relative to the boundary region, and
in response to determining that the updated geographic location is outside of the boundary region, obtaining an updated boundary centered on the updated geographic location, and loading map data from the storage component into the memory of the device based on the updated boundary.
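Informally, the method of claims 1, 7, 10, and 16 can be sketched in Python as follows. The sketch is illustrative only and does not reproduce the claimed implementation: every identifier (BLOCK_SIZE_M, Boundary, make_boundary, load_blocks, track, and the plain dictionary standing in for the storage component) is hypothetical, and the 200-meter block size, the 1.5-block boundary width, and the speed-based widening heuristic are editorial assumptions chosen merely to fall within the ranges the claims recite.

# Illustrative sketch only; not the patented implementation. All names are hypothetical.
import math
from dataclasses import dataclass
from typing import Dict, Iterable, Tuple

BLOCK_SIZE_M = 200.0   # assumed block edge of about 200 m (cf. claim 23)
GRID_RADIUS = 1        # center block plus eight surrounding blocks (cf. claim 10)

BlockKey = Tuple[int, int]


@dataclass(frozen=True)
class Boundary:
    # Rectangular boundary centered on the center block (cf. claims 6-7).
    center_x: float
    center_y: float
    half_width: float
    half_length: float

    def contains(self, x: float, y: float) -> bool:
        return (abs(x - self.center_x) <= self.half_width
                and abs(y - self.center_y) <= self.half_length)


def block_index(x: float, y: float) -> BlockKey:
    # Index of the map data block containing a planar position.
    return math.floor(x / BLOCK_SIZE_M), math.floor(y / BLOCK_SIZE_M)


def make_boundary(key: BlockKey, speed_mps: float = 0.0) -> Boundary:
    # Boundary between one and three block widths wide (cf. claim 7); the
    # speed term is an assumed heuristic for the dynamic sizing of claim 16.
    cx = (key[0] + 0.5) * BLOCK_SIZE_M
    cy = (key[1] + 0.5) * BLOCK_SIZE_M
    half = 0.75 * BLOCK_SIZE_M + min(2.0 * speed_mps, 0.75 * BLOCK_SIZE_M)
    return Boundary(cx, cy, half, half)


def load_blocks(storage: Dict[BlockKey, bytes], key: BlockKey) -> Dict[BlockKey, bytes]:
    # Load the center block and its surrounding blocks from storage into memory.
    loaded = {}
    for dx in range(-GRID_RADIUS, GRID_RADIUS + 1):
        for dy in range(-GRID_RADIUS, GRID_RADIUS + 1):
            neighbor = (key[0] + dx, key[1] + dy)
            if neighbor in storage:
                loaded[neighbor] = storage[neighbor]
    return loaded


def track(storage: Dict[BlockKey, bytes],
          positions: Iterable[Tuple[float, float]],
          speed_mps: float = 0.0) -> Dict[BlockKey, bytes]:
    # Reload map data only when the vehicle leaves the current boundary.
    pos_iter = iter(positions)
    x, y = next(pos_iter)
    center = block_index(x, y)
    boundary = make_boundary(center, speed_mps)
    in_memory = load_blocks(storage, center)
    for x, y in pos_iter:                       # while the vehicle is in motion
        if boundary.contains(x, y):
            continue                            # still inside the boundary region
        center = block_index(x, y)              # updated center block
        boundary = make_boundary(center, speed_mps)   # updated boundary
        in_memory = load_blocks(storage, center)
    return in_memory

One plausible reading of the oversized boundary in claim 1 is that it provides hysteresis: because the boundary extends beyond the center block into surrounding blocks that are already loaded, a vehicle drifting back and forth across a block edge does not trigger repeated reloads; new blocks are loaded only after the vehicle has moved a meaningful distance toward them.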
CN201880100669.4A 2018-12-26 2018-12-26 System and method for loading and tracking maps on a vehicle Active CN113748418B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2018/067555 WO2020139330A1 (en) 2018-12-26 2018-12-26 Systems and methods for loading and tracking maps on a vehicle

Publications (2)

Publication Number Publication Date
CN113748418A 2021-12-03
CN113748418B CN113748418B (en) 2024-04-30

Family

ID=71128364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880100669.4A Active CN113748418B (en) 2018-12-26 2018-12-26 System and method for loading and tracking maps on a vehicle

Country Status (2)

Country Link
CN (1) CN113748418B (en)
WO (1) WO2020139330A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112734918B (en) * 2020-12-31 2023-05-23 潍柴动力股份有限公司 Dynamic updating method, device, equipment and medium of platform-end three-dimensional electronic map

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030078724A1 (en) * 2001-10-19 2003-04-24 Noriyuki Kamikawa Image display
WO2009027161A1 (en) * 2007-08-29 2009-03-05 Wayfinder Systems Ab Pre-fetching navigation maps
CN105571608A (en) * 2015-12-22 2016-05-11 苏州佳世达光电有限公司 Navigation system, vehicle and navigation map transmission method
US20180189323A1 (en) * 2016-12-30 2018-07-05 DeepMap Inc. High definition map and route storage management system for autonomous vehicles

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005020152A1 (en) * 2005-04-29 2006-11-02 Volkswagen Ag Method for controlling map display in vehicle involves display device which is controlled in such manner that section of geographical map is displayed in three-dimensionally non-linear scale

Also Published As

Publication number Publication date
CN113748418B (en) 2024-04-30
WO2020139330A1 (en) 2020-07-02

Similar Documents

Publication Publication Date Title
US10876844B2 (en) Systems and methods for loading and tracking maps on a vehicle
US11080216B2 (en) Writing messages in a shared memory architecture for a vehicle
US20200209005A1 (en) Systems and methods for loading object geometry data on a vehicle
US11423677B2 (en) Automatic detection and positioning of pole-like objects in 3D
US10747597B2 (en) Message buffer for communicating information between vehicle components
US11616737B2 (en) Reading messages in a shared memory architecture for a vehicle
US20210347378A1 (en) Method and system for generating an importance occupancy grid map
US11327489B2 (en) Shared memory architecture for a vehicle
CN113874803A (en) System and method for updating vehicle operation based on remote intervention
US20210304607A1 (en) Collaborative perception for autonomous vehicles
EP4085442A1 (en) Identification of proxy calibration targets for a fleet of vehicles
US11673581B2 (en) Puddle occupancy grid for autonomous vehicles
CN114072784A (en) System and method for loading object geometric data on a vehicle
CN113748448B (en) Vehicle-based virtual stop-line and yield-line detection
WO2022178738A1 (en) Method and system for generating a topological graph map
CN113748418B (en) System and method for loading and tracking maps on a vehicle
WO2020139396A1 (en) Writing messages in a shared memory architecture for a vehicle
US20220179082A1 (en) Methods and system for analyzing dynamic lidar point cloud data
WO2020139395A1 (en) Reading messages in a shared memory architecture for a vehicle
CN113767376A (en) Message buffer for transmitting information between vehicle components
US20220067399A1 (en) Autonomous vehicle system for performing object detections using a logistic cylinder pedestrian model
US20240219199A1 (en) Non-semantic map layer in crowdsourced maps
WO2020139389A1 (en) Shared memory architecture for a vehicle
WO2024144948A1 (en) Non-semantic map layer in crowdsourced maps

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant