WO2024052392A1 - Circuitry and method - Google Patents

Circuitry and method

Info

Publication number
WO2024052392A1
Authority
WO
WIPO (PCT)
Prior art keywords
radar
radar signal
candidate
phase
sar
Prior art date
Application number
PCT/EP2023/074416
Other languages
French (fr)
Inventor
Timo GREBNER
Christian Waldschmidt
Original Assignee
Sony Group Corporation
Sony Europe B.V.
Application filed by Sony Group Corporation, Sony Europe B.V.
Publication of WO2024052392A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S 13/90 Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S 13/9004 SAR image acquisition techniques
    • G01S 13/9017 SAR image acquisition techniques with time domain processing of the SAR signals in azimuth
    • G01S 13/904 SAR modes
    • G01S 13/9054 Stripmap mode
    • G01S 13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S 2013/9327 Sensor installation details
    • G01S 2013/93271 Sensor installation details in the front of the vehicles
    • G01S 2013/93272 Sensor installation details in the back of the vehicles
    • G01S 2013/93274 Sensor installation details on the side of the vehicles

Definitions

  • the present disclosure generally pertains to a circuitry and a method, and, more particularly, to a circuitry and a method for generating a synthetic-aperture radar image.
  • SAR synthetic-aperture radar
  • the disclosure provides a circuitry for generating a synthetic aperture radar image, wherein the circuitry is configured to obtain, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; map the received power of the radar signal to a measurement probability; generate a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; map the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determine a target phase value based on the set of PC-SAR images.
  • the disclosure provides a method for generating a synthetic aperture radar image, wherein the method includes obtaining, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; mapping the received power of the radar signal to a measurement probability; generating a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; mapping the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determining a target phase value based on the set of PC-SAR images.
  • Fig. 1 shows a block diagram of a circuitry for generating a SAR image according to an embodiment
  • Fig. 2 shows a flow diagram of a method for generating a SAR image according to an embodiment
  • Fig. 3 shows a diagram of a probability distribution of a received phase according to an embodiment
  • Fig. 4 illustrates a mapping of a set of data tuples to a set of PC-SAR images according to an embodiment
  • Fig. 5 is a block diagram depicting an example of schematic configuration of a vehicle control system.
  • Fig. 6 is a diagram of assistance in explaining an example of installation positions of an outside-vehicle information detecting section and an imaging section.
  • SAR synthetic-aperture radar
  • SAR images can be generated in multiple different ways based on the received reflected radar signals.
  • such approaches include approaches in the frequency domain as well as approaches in the time domain.
  • these approaches generate SAR images based on an amplitude and phase of a time signal or on a frequency signal.
  • using various algorithms, for example a backprojection algorithm, the reception phases of each range cell in a grid map are corrected and then summed up.
  • a constructive superposition of all pointers leads to a high-power cell, indicating that the cell represents a target, whereas a destructive superposition of all pointers leads to a low-power cell which does not represent a target.
  • this approach allows a simple and efficient way to generate SAR mappings of the environment.
  • this approach causes the resulting map to be directly dependent on the received power of each target. However, this may not indicate the actual occupancy state of the cell.
  • a currently existing map $G$ contains the complex entries $G(m_i \mid z_{1:t}, x_{1:t})$, wherein $G(m_i \mid z_{1:t}, x_{1:t})$ describes a complex occupancy amplitude of the cell $m_i$ at a current time frame $t$, and the currently measured complex amplitude of a reflected signal corresponds to a distance $r$ and a ramp $k$ that has a time stamp $t_k$ among $K$ ramps emitted into the environment as radar signals in the frame $t$.
  • the complex amplitude is phase-corrected by a factor of the form $\exp\!\left(j\,2\pi f_\mathrm{start}\,\frac{2\,r_i(t_k)}{c_0}\right)$ with the use of the distance $r_i(t_k)$ between the cell $m_i$ and the sensor position at the time stamp $t_k$, wherein $f_\mathrm{start}$ is the start frequency of the ramp and $c_0$ is the speed of light. This procedure may be repeated for all received ramps and cells of the resulting map $G$.
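  • As a rough illustration of this classic back-projection accumulation, a minimal Python sketch is given below. It is not taken from the disclosure; the helper measured_amplitude_at(r), the flat list of cell coordinates and the start frequency argument are assumptions made only for the example.

```python
import numpy as np

C0 = 299_792_458.0  # speed of light in m/s


def backprojection_step(grid, cell_xy, sensor_xy, measured_amplitude_at, f_start_hz):
    """Accumulate one received ramp into the complex map G (illustrative sketch).

    grid                  -- 1D complex array, one entry G(m_i) per cell
    cell_xy               -- list of (x, y) cell-centre positions in metres
    sensor_xy             -- (x, y) sensor position at the ramp's time stamp t_k
    measured_amplitude_at -- assumed helper: complex amplitude measured at range r
    f_start_hz            -- start frequency of the emitted frequency ramp
    """
    for i, (cx, cy) in enumerate(cell_xy):
        r = np.hypot(cx - sensor_xy[0], cy - sensor_xy[1])   # cell-to-sensor distance
        # distance-proportional phase correction; the factor 2 covers the round trip
        correction = np.exp(1j * 2.0 * np.pi * f_start_hz * 2.0 * r / C0)
        grid[i] += measured_amplitude_at(r) * correction     # constructive/destructive superposition
    return grid
```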
  • there are many different approaches to correct the received phase, but all are based on the same principle: the distance-proportional phase correction.
  • such an approach has the disadvantage that only the amplitude information and the phase information of the complex measured values are taken into account.
  • the dynamics of the resulting map may depend significantly on the strongest target and the weakest target.
  • Targets which have a small radar cross section (RCS) but, due to their phase, lead to a constructive overlapping of the complex pointers after application of the SAR algorithm may therefore not be shown in maps although the measured phases may represent a target (object in the environment).
  • RCS radar cross section
  • measurement inaccuracies such as noise or position inaccuracies are not taken into account, although they may have a significant impact on the measurement results.
  • some embodiments pertain to a circuitry for generating a synthetic aperture radar (SAR) image, wherein the circuitry is configured to obtain, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; map the received power of the radar signal to a measurement probability; generate a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; map the set of data tuples to a set of phase candidate synthetic aperture radar (PC-SAR) images; and determine a target phase value based on the set of PC-SAR images.
  • PC-SAR phase candidate synthetic aperture radar
  • the radar sensor may include a chirped-sequence radar sensor that may emit a radar signal into an environment, wherein a frequency of the emitted radar signal is modulated.
  • the frequency may vary according to a frequency ramp, starting at a predetermined start frequency, with a predetermined slope.
  • the slope may be constant (linear chirp) or may vary for realizing a non-linear frequency variation (e.g., an exponential chirp).
  • the radar sensor may emit radar signals repeatedly, e.g., with a predetermined repetition rate. The repeated radar signals may be equally or differently modulated.
  • the emitted radar signals may be reflected by objects in the environment.
  • the objects reflecting the radar signals may include cars, trucks, bicycles, other vehicles, pedestrians, animals, walls, road signs, traffic barriers, bridges, trees, stones, buildings, fences, or the like. Such objects may also be referred to as targets.
  • the radar sensor may receive radar signals reflected by an object in the environment and may generate the radar measurement data based on information extracted from the received reflected radar signals. For example, the radar sensor may determine a power of the received radar signal. The radar sensor may determine to which emitted radar signal, e.g. to which ramp, the received radar signal corresponds. This determination may be based on a timing of emitting radar signals into the environment, on a timing of receiving the reflected radar signal, on a start frequency, a slope and/or a shape (e.g., linear chirp, exponential chirp, etc.) of the radar signal, or the like. The radar sensor may further determine, e.g. based on a delay between receiving the reflected radar signal and emitting the corresponding radar signal, a phase of the received radar signal.
  • the radar measurement data may be generated by any radar sensor that is capable of determining a received power and a received phase of a radar signal reflected from an environment.
  • the radar measurement data may be generated by a time-division multiplexing (TDM) radar, a frequency-division multiplexing (FDM) radar, a code-division multiplexing (CDM) radar, or the like.
  • TDM time-division multiplexing
  • FDM frequency-division multiplexing
  • CDM code-division multiplexing
  • the circuitry may include any entity that is capable of performing information processing, such as generating a SAR image.
  • the circuitry may include a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a microprocessor or the like.
  • the circuitry may obtain the radar measurement data directly from the radar sensor or indirectly via another information processing apparatus or via a magnetic, optical or semiconductor-based storage medium.
  • the circuitry may further include a storage unit that may, for example, include a dynamic random-access memory (DRAM), a flash memory, an electrically erasable programmable read-only memory (EEPROM), a hard disk drive or the like.
  • the circuitry may further include an interface for receiving the radar measurement data and for outputting the generated SAR image, e.g., via serial peripheral interface (SPI), peripheral component interconnect (PCI), controller area network (CAN), universal serial bus (USB), Ethernet, IEEE 802.11 (Wi-Fi), Bluetooth, or the like.
  • the circuitry may receive the radar measurement data via the interface from the radar sensor, from another information processing apparatus or from a storage medium.
  • the circuitry may output the generated SAR image via the interface to another information processing apparatus or to a storage medium.
  • the mapping of the received power of the radar signal to the measurement probability may, for example, map the received power to a value in an interval from zero to one, without limiting the disclosure to these values.
  • the mapping may, for example, depend on environmental conditions and/or on a configuration of the radar sensor.
  • the mapping may depend on a radar cross section (RCS) of an object reflecting the radar signal (e.g., on a material, a size, a shape and/or a distance of the object), on a power of a radar signal emitted by the radar sensor, on an expected received power of the radar signal, on a power of electromagnetic interference reducing a quality of the radar signal or the like.
  • RCS radar cross section
  • the measurement probability may, for example, correspond to a probability that the radar signal is not a measurement noise or an artefact but is reflected by a real object.
  • the set of candidate phase values may, for example, be predetermined or may be generated based on the received phase of the radar signal.
  • the candidate phase values of the set of candidate phase values may be evenly distributed in the interval from 0° to 360° to cover a full period, without limiting the present disclosure to these values.
  • the candidate phase values of the set of candidate phase values may be generated in an interval around the received phase of the radar signal and may be distributed evenly in the interval or may be distributed with a higher density in a subinterval that includes the received phase of the radar signal than in a subinterval that does not include the received phase.
  • the set of candidate measurement probability values may be determined based on the set of candidate phase values. For example, for each candidate phase value of the set of candidate phase values, a candidate measurement probability value of the set of candidate measurement probability values may be determined.
  • the candidate measurement probability values may, for example, be computed for each candidate phase value of the set of candidate phase values based on a probability density function or may be read from a predetermined table, wherein candidate measurement probability values that lie between entries of the table may be interpolated.
  • the distribution of the measurement probability may represent the phase uncertainty of the received phase.
  • a standard deviation of the distribution may be determined based on properties of the radar sensor and/or based on a quality of the radar signal.
  • the set of data tuples may be generated by associating, for each data tuple of the set of data tuples, a candidate phase value of the set of candidate phase values with a candidate measurement probability value of the set of candidate measurement probability values.
  • the set of PC-SAR images may, for example, include an associated PC-SAR image for each candidate phase value of the set of candidate phase values.
  • the mapping of the set of data tuples to the set of PC-SAR images may assign the respective candidate measurement probability values of the data tuples of the set of data tuples to the respective PC-SAR images of the set of PC-SAR images that are associated with the respective candidate phase values of the data tuples of the set of data tuples.
  • the determining of the target phase value may, for example, select, as the target phase value, a candidate phase value of the set of candidate phase values based on a candidate measurement probability value that is assigned to the PC-SAR image to which the candidate phase value is mapped.
  • the circuitry may then generate a SAR image based on the candidate probability value that has been assigned to the PC-SAR image associated with the target phase value.
  • the target phase value may be represented by an index of a PC-SAR image, of the set of PC-SAR images, that is associated with the target phase value.
  • the mapping of the received power includes determining a signal-to-noise ratio of the radar signal based on the received power; and determining the measurement probability based on the signal-to-noise ratio of the radar signal.
  • the signal-to-noise ratio of the radar signal may be determined as a quotient of the received power of the radar signal and a noise level of the radar sensor.
  • the noise level may be determined in any suitable way.
  • the noise level may be determined based on a calibration measurement of a known environment that may, for example, include no objects (targets) in a predetermined portion.
  • the noise level may be determined based on the radar measurement data, e.g., based on a mean value, a median or another suitable quantile of a received power of a plurality of radar signals, or based on clustering (such as hierarchical clustering or density-based spatial clustering of applications with noise (DBSCAN)), wherein the noise level may be determined based on a largest or densest cluster of values of the received power, e.g., based on a maximum, a mean, a quantile or a density distribution of the cluster.
  • DBSCAN density-based spatial clustering of applications with noise
  • the measurement probability may, for example, be determined based on an exponential function, wherein an exponent includes the signal-to-noise ratio.
  • the exponent may further include a scaling factor multiplied with the signal-to-noise ratio for determining a limited growth of the measurement probability.
  • the scaling factor may, for example, be empirically determined based on a configuration of the radar sensor and/or on expected conditions of the environment.
  • the measurement probability may, for example, be chosen from an interval between a minimum probability and a maximum probability, based on the received power of the radar signal.
  • the minimum probability may be 0, 0.01, 0.1 or 0.5 and may be chosen if the received power equals or falls below the noise level.
  • the maximum probability may be 1, 0.99 or 0.9 and may be chosen if the received power equals an expected maximum received power, wherein the expected maximum received power may correspond to a power of an emitted radar signal reduced according to the inverse-square law with respect to a distance (range) determined based on the radar signal.
  • the maximum probability may further be reduced according to a difference between the expected maximum received power and the noise level.
  • the minimum probability may also be chosen if the received power is at least a certain, e.g. predetermined, amount lower than the noise level to account for an overlap between a distribution of a received power of noise signals and a distribution of a received power of radar signals reflected from an object.
  • the minimum probability and the maximum probability are not limited to the values described herein, and the skilled person may find other values for the minimum probability and the maximum probability that are suitable for determining the measurement probability.
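  • A minimal sketch of such a power-to-probability mapping is given below. The limited-growth form, the scaling factor and the minimum/maximum probabilities are assumptions chosen for illustration; the text above only requires that the probability grows with the signal-to-noise ratio between a minimum and a maximum value.

```python
import math


def measurement_probability(received_power, noise_level,
                            scale=0.05, p_min=0.5, p_max=0.99):
    """Map a received power to a measurement probability (illustrative sketch).

    The SNR is the quotient of the received power and the noise level; the
    probability saturates towards p_max for large SNR via an exponential whose
    exponent contains the SNR multiplied by a scaling factor, and stays at
    p_min when the received power is at or below the noise level.
    """
    snr = received_power / noise_level
    if snr <= 1.0:                       # at or below the noise level
        return p_min
    # limited growth: exponent contains -scale * SNR; result clamped to [p_min, p_max]
    p = p_max - (p_max - p_min) * math.exp(-scale * snr)
    return min(max(p, p_min), p_max)
```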
  • the distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein at least one of a mean and a mode of the probability distribution corresponds to the received phase of the radar signal and has a probability value that corresponds to the received power of the radar signal.
  • the probability distribution may correspond to a Gaussian distribution (normal distribution).
  • the standard deviation of the probability distribution may correspond to the phase uncertainty of the received phase, e.g., to a limited precision of the radar sensor for measuring the received phase.
  • the candidate measurement probability values of the set of candidate measurement probability values may be based on evaluating a probability density function of the probability distribution for the candidate phase values of the set of candidate phase values.
  • the candidate measurement probability values may further be scaled with respect to the probability density function corresponding to the measurement probability.
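  • The generation of the data tuples can be sketched as follows. The number of candidate phases, the phase uncertainty (standard deviation) and the use of a wrapped Gaussian are assumptions made for the example; the peak of the distribution is placed at the received (corrected) phase and scaled to the measurement probability, as described above.

```python
import numpy as np


def phase_candidate_tuples(received_phase_deg, measurement_prob,
                           phase_sigma_deg=10.0, n_candidates=36):
    """Associate candidate phase values with candidate measurement probabilities.

    Candidate phases are spread evenly over 0..360 degrees; a Gaussian centred
    on the received phase (evaluated on the wrapped phase difference) models
    the phase uncertainty and is scaled so that its peak equals the measurement
    probability derived from the received power.
    """
    candidates = np.linspace(0.0, 360.0, n_candidates, endpoint=False)
    # wrapped difference between each candidate and the received phase, in (-180, 180]
    diff = (candidates - received_phase_deg + 180.0) % 360.0 - 180.0
    weights = np.exp(-0.5 * (diff / phase_sigma_deg) ** 2)   # 1.0 at the received phase
    probs = measurement_prob * weights
    return list(zip(candidates.tolist(), probs.tolist()))
```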
  • the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest candidate measurement probability value among the set of PC-SAR images.
  • the circuitry may generate the SAR image based on the PC-SAR image associated with the target phase value, e.g., based on the highest candidate measurement probability value, which is associated with the PC-SAR image.
  • the circuitry is further configured to obtain, based on at least one further radar signal detected by the radar sensor, further radar measurement data that indicate a received power and a received phase of the at least one further radar signal; perform the mapping of the received power, the generating of the set of data tuples and the mapping of the set of data tuples accordingly for the at least one further radar signal; update, based on the set of data tuples generated for the at least one further radar signal, the candidate measurement probability values associated with the respective PC-SAR images for the at least one further radar signal; and determine the target phase value based on the updated candidate measurement probability values.
  • the at least one further radar signal may include one further radar signal, 100 further radar signals or 10,000 further radar signals, without limiting the present disclosure to these numbers or to these orders of magnitude.
  • the circuitry may map the received power of each further radar signal to a corresponding measurement probability, as described above.
  • the circuitry may further generate a set of data tuples, as described above, by associating candidate phase values of the set of candidate phase values with candidate measurement probability values of a set of candidate measurement probability values, wherein the candidate measurement probability values may, for example, be generated by evaluating, at the candidate phase values of the set of candidate phase values, a distribution of the measurement probability according to a phase uncertainty of the received phase, as described above.
  • the circuitry may then map, for each further radar signal, the generated set of data tuples to the set of PC-SAR images. For example, each PC-SAR image of the set of PC-SAR images may be associated with a candidate phase value of the set of candidate phase values.
  • the circuitry may map each data tuple of the set of data tuples generated for the further radar signals to the PC-SAR image, of the set of PC-SAR images, that is associated with the candidate phase value of the respective data tuple.
  • the number of PC-SAR images in the set of PC-SAR images may correspond to a number of candidate phase values in the set of candidate phase values, and for the radar signal as well as for each further radar signal a candidate measurement probability value, for the set of candidate measurement probability values, may be generated for each candidate phase value of the set of candidate phase values.
  • each PC-SAR image of the set of PC-SAR images may be associated with one data tuple, which may associate a candidate phase value and a candidate measurement probability value with each other, for the radar signal and with one data tuple per further radar signal.
  • the present disclosure is not limited to providing a same number of elements in the set of candidate phase values, in the set of candidate measurement probability values, in the set of data tuples and/or in the set of PC-SAR images.
  • the circuitry may generate fewer candidate measurement probability values than there are candidate phase values in the set of candidate phase values, e.g., the circuitry may generate candidate measurement probability values only for candidate phase values at which the distribution of the measurement probability yields a value that exceeds a predetermined threshold, e.g., for candidate phase values that lie within a predetermined number of standard deviations around the received phase.
  • the circuitry may generate fewer data tuples than there are candidate measurement probability values in the set of candidate measurement probability values, e.g., the circuitry may generate data tuples only for candidate measurement probability values that exceed a predetermined threshold.
  • the circuitry may generate fewer data tuples, which associate a candidate phase value and a candidate measurement probability value, than there are PC-SAR images in the set of PC-SAR images, e.g., if the circuitry generates candidate measurement probability values or corresponding data tuples only for candidate measurement probability values that exceed a predetermined threshold.
  • a PC-SAR image of the set of PC-SAR images may be associated with fewer data tuples than there are reflected radar signals received by the radar sensor.
  • the updating of the candidate measurement probability values associated with the respective PC-SAR images for the at least one further radar signal may include updating, for each data tuple generated for the at least one further radar signal, the candidate measurement probability value assigned to the respective PC-SAR image that is associated with the candidate phase value of the respective data tuple, wherein the circuitry updates the candidate measurement probability value assigned to the respective PC-SAR image such that an updated candidate measurement probability value, which the circuitry assigns to the respective PC-SAR image, depends on the candidate measurement probability value of the respective data tuple.
  • the circuitry may make the candidate measurement probability values associated with the respective PC-SAR images dependent on the candidate measurement probability values associated with the respective candidate phase values by any data tuple generated for the radar signal or for the at least one further radar signal.
  • the circuitry may, for example, determine, as the target phase value, a candidate phase value associated with a PC-SAR image, of the set of PC-SAR images, that is associated with a highest candidate measurement probability value. The circuitry may then set the highest candidate measurement probability value, which is assigned to the PC-SAR image associated with the target phase value, as a measurement probability value, or as a target probability indicating a probability of a presence of a target, in a SAR image which the circuitry generates.
  • the updating is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the at least one further radar signal.
  • the updating of the candidate measurement probability values associated with the respective PC-SAR images may be based on an odds ratio and may, for example, be based on a candidate measurement probability value of a current measurement (e.g., a currently processed radar signal), on one or more candidate probability values (“previous candidate probability values”) from which the candidate PC-SAR image has already been made dependent, and on a priori information.
  • the circuitry may iteratively update the candidate measurement probability values associated with the respective PC-SAR images for each respective data tuple.
  • the updating may be based on an update formula for occupancy grid maps (OGM), such as equation (2), which is based on Thrun, Sebastian: “Learning occupancy grids with forward models”, Proceedings 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2001, Digital Object Identifier (DOI): 10.1109/IROS.2001.977219:

$$p(m_i \mid z_{1:t}, x_{1:t}) = \left[ 1 + \underbrace{\frac{1 - p(m_i \mid z_t, x_t)}{p(m_i \mid z_t, x_t)}}_{\text{current measurement}} \cdot \underbrace{\frac{1 - p(m_i \mid z_{1:t-1}, x_{1:t-1})}{p(m_i \mid z_{1:t-1}, x_{1:t-1})}}_{\text{recursive term}} \cdot \underbrace{\frac{p(m_i)}{1 - p(m_i)}}_{\text{a priori information}} \right]^{-1} \tag{2}$$
  • OGM occupancy grid maps
  • $p(m_i \mid z_{1:t}, x_{1:t})$ represents a probability associated with a cell $m_i$ of a grid map, e.g. of a PC-SAR image, wherein the probability is based on sensor measurements $z_{1:t}$ and corresponding sensor positions $x_{1:t}$ of time frames 1 to $t$, i.e., of a currently processed time frame $t$ and all previously processed time frames.
  • $p(m_i \mid z_t, x_t)$ represents a probability associated with the cell $m_i$ based on a sensor measurement $z_t$ and a corresponding sensor position $x_t$ of the currently processed time frame $t$.
  • $p(m_i \mid z_{1:t-1}, x_{1:t-1})$ represents a probability associated with the cell $m_i$ based on sensor measurements $z_{1:t-1}$ and corresponding sensor positions $x_{1:t-1}$ of time frames from 1 to $t-1$, i.e., without the currently processed time frame $t$.
  • at time frame 1, i.e., when a first probability is inserted into a cell, the recursive term may be omitted (or set to 1).
  • $p(m_i)$ represents a prior for an occupancy of the cell $m_i$ and may be initialized to, e.g., 0.5, without limiting the present disclosure thereto.
  • the term marked as “current measurement” in equation (2) may correspond to a currently processed measurement, e.g., to a candidate measurement probability of a data tuple based on which the circuitry updates a candidate measurement probability assigned to a PC-SAR image.
  • the term marked as “recursive term” in equation (2) may correspond to previously processed measurements, e.g., to a candidate measurement probability value that has previously been assigned to the PC-SAR image.
  • the term marked as “a priori information” in equation (2) may correspond to a predetermined a priori probability.
  • the resulting probability $p(m_i \mid z_{1:t}, x_{1:t})$ of equation (2) may be assigned to the PC-SAR image as an updated candidate measurement probability value.
  • the circuitry may evaluate the product in the logarithmic domain, wherein the product of the first factor and the second factor may be represented as a sum of the logarithm of the first factor and a logarithm of the second factor.
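  • In code, one iteration of this update for a single cell of a PC-SAR image can be sketched as below, using the odds form of equation (2); the clamping of the probabilities is an added numerical safeguard, not part of the disclosure. Evaluating the same update as a sum of log-odds, as mentioned above, is numerically equivalent.

```python
def update_cell_probability(prev_prob, meas_prob, prior=0.5):
    """Recursive occupancy-style update of one cell (sketch of equation (2)).

    prev_prob -- probability currently assigned to the cell of the PC-SAR image
    meas_prob -- candidate measurement probability of the data tuple being processed
    prior     -- a priori occupancy probability (0.5 as suggested in the text)
    """
    eps = 1e-9
    meas_prob = min(max(meas_prob, eps), 1.0 - eps)
    prev_prob = min(max(prev_prob, eps), 1.0 - eps)
    odds = ((1.0 - meas_prob) / meas_prob        # current measurement
            * (1.0 - prev_prob) / prev_prob      # recursive term
            * prior / (1.0 - prior))             # a priori information
    return 1.0 / (1.0 + odds)
```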
  • the radar signal and the at least one further radar signal are detected by the radar sensor from different positions.
  • the radar sensor may be provided on a mobile platform and perform radar measurements while the mobile platform (including the radar sensor) is moving through the environment.
  • the performing of radar measurements may include emitting radar signals into the environment and receiving radar signals from the environment that have been reflected by objects in the environment.
  • the mobile platform may, for example, include a car, a truck, a motorcycle, a bicycle, a tractor, an excavator, a train, a boat, a ship, a helicopter, an airplane, a drone or the like.
  • While moving through the environment, the radar sensor may emit radar signals from different positions into the environment and may receive radar signals from the environment from different positions. Thus, the radar signals received by the radar sensor may correspond to different views of the environment, wherein an object in the environment may be sensed from different directions and/or from different distances.
  • the circuitry may determine a position of the object in the environment.
  • the circuitry may determine the position of the radar sensor in the environment, for example, based on the movement of the mobile platform, e.g., based on a speed sensor, an acceleration sensor, a rotation sensor, a global navigation satellite system (GNSS) such as Global Positioning System (GPS), Galileo, Michibiki, Beidou or GLONASS, or the like, and/or based on simultaneous localization and mapping (SLAM).
  • GNSS global navigation satellite system
  • GPS Global Positioning System
  • SLAM simultaneous localization and mapping
  • the circuitry may determine a position of the object in the environment even if only a distance to the object but no angle can be determined based on reflected radar signals received from the object.
  • the radar measurement data further indicate a range of the radar signal; wherein the mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the radar signal.
  • the radar sensor and/or the circuitry may determine the range of the radar signal based on the received phase of the radar signal.
  • the received phase of the radar signal may indicate a delay between emitting a radar signal and receiving a corresponding reflected radar signal.
  • the range of the radar signal may be determined based on a Fourier transform (e.g., a Fast Fourier Transform (FFT)) of the radar signal.
  • FFT Fast Fourier Transform
  • the range of the radar signal may correspond to a distance between the radar sensor and an object in the environment at which the radar signal is reflected.
  • the circuitry may map the set of data tuples to portions of the respective PC-SAR images that correspond to the range of the radar signal, e.g., to portions that correspond to a distance from the radar sensor according to the range of the radar signal at the time of receiving the radar signal.
  • the portions of a PC-SAR image that correspond to the range of the radar signal may be arranged on a circular arc, wherein a radius of the circular arc may correspond to the range of the radar signal and a center associated with the circular arc may correspond to the position of the radar sensor in the environment at the time of receiving the radar signal.
  • the respective candidate measurement probability values may be mapped to portions of the PC-SAR images that may be arranged in circular arcs shifted against each other.
  • the corresponding circular arcs of updated portions of the PC-SAR images may overlap at portions that correspond to positions of corresponding objects (targets) in the environment, such that the updated candidate measurement probability values of the respective portions may add up or accumulate (positively interfere) at portions that correspond to positions of corresponding objects (targets) in the environment and may cancel each other out or average each other out (negatively interfere) at portions that do not correspond to the portions of the corresponding objects (targets) in the environment.
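  • A simple way to find the cells of a grid map that lie on such a circular arc is sketched below; the grid geometry, the coordinate convention and the tolerance of half a cell are assumptions made for the example.

```python
import numpy as np


def cells_on_range_arc(grid_shape, cell_size, sensor_xy, range_m, tol=None):
    """Indices of grid cells whose centres lie at the measured range (sketch).

    grid_shape -- (rows, cols) of the grid map
    cell_size  -- edge length of a quadratic cell in metres
    sensor_xy  -- sensor position in map coordinates at the time of reception
    range_m    -- range of the radar signal
    tol        -- tolerance around the arc; defaults to half a cell
    """
    if tol is None:
        tol = cell_size / 2.0
    rows, cols = grid_shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    centres_x = (xs + 0.5) * cell_size
    centres_y = (ys + 0.5) * cell_size
    dist = np.hypot(centres_x - sensor_xy[0], centres_y - sensor_xy[1])
    on_arc = np.abs(dist - range_m) <= tol        # circular arc around the sensor position
    return np.argwhere(on_arc)                    # array of (row, col) indices
```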
  • the PC-SAR images of the set of PC-SAR images include grid maps, and the portions of the PC-SAR-images correspond to cells of the grid maps.
  • the cells of the grid maps may have a rectangular or a quadratic shape.
  • the present disclosure is not limited thereto.
  • the cells may instead have a triangular or a hexagonal shape or any other suitable shape.
  • each cell may have an assigned candidate measurement probability value that may be updated independently from candidate measurement probability values assigned to other cells of the grid maps.
  • Each cell of a grid map may correspond to a portion of the environment.
  • a size of a cell (e.g., a side length of a rectangular or quadratic cell, a diameter of a hexagonal cell, an altitude of a triangular cell or a diameter of a circular cell) may, for example, correspond to 1 centimeter, 1 decimeter or the like.
  • All cells of a grid map may have a same size and shape.
  • a grid map may include cells of different sizes and/or of different shapes.
  • all PC-SAR images of the set of PC-SAR images may include grid maps, wherein the grid maps of all PC-SAR images of the set of PC-SAR images may be equally configured, e.g., the grid maps of all PC-SAR images may have a same number of cells, a same number of rows and columns of cells, a same shape of cells, a same size of cells, and a same mapping of cells to the environment.
  • a grid map of the SAR image to be generated by the circuitry may be equally configured as the grid maps of the PC-SAR images.
  • the present disclosure is not limited to equally configured grid maps.
  • the grid maps of the set of PC-SAR images and/or the grid map of the SAR image to be generated may be configured differently from each other.
  • a grid map of each PC-SAR image of the set of PC-SAR images as well as a grid map of the SAR image to be generated by the circuitry may have a defined mapping from its cells to positions in the environment, and a grid map of each PC-SAR image of the set of PC-SAR images may have a defined mapping to a grid map of the SAR image to be generated by the circuitry.
  • cells of a grid map of the SAR image to be generated by the circuitry may correspond to pixels of the SAR image.
  • the pixels of the SAR image that correspond to the cells of the grid map of the SAR image may be displayed with a color that corresponds to a probability, e.g., a measurement probability or a target probability, that is assigned to the corresponding cell.
  • the circuitry is further configured to correct the received phase of the radar signal based on the range of the radar signal; and perform the generating and the mapping of the set of data tuples based on the corrected phase.
  • the range of the radar signal may represent a distance between the radar sensor and an object in the environment that has reflected the radar signal.
  • the circuitry may correct the received phase based on the range of the radar signal.
  • the circuitry may, for example, obtain the distance based on the radar signal (e.g., based on a phase difference between the received phase of the radar signal and a phase of a previously emitted radar signal that corresponds to the radar signal received from the environment).
  • the correcting of the received phase may be based on multiplying a complex pointer, which represents, with its absolute value, a received power (or a measurement probability based on the received power) and, with its argument, a received phase of the radar signal, by a phase correction factor.
  • the phase correction factor may be based on a complex exponential function, as shown in equation (3):

$$\exp\!\left(j\,2\pi f_\mathrm{start}\,\frac{2r}{c_0}\right) \tag{3}$$
  • $j$ is the imaginary unit
  • $f_\mathrm{start}$ is a start frequency of a frequency ramp of a radar signal emitted into the environment
  • $c_0$ is the speed of light
  • $r$ is the determined distance between the radar sensor and an object that reflects the radar signal.
  • the factor 2 is inserted for taking into account a round trip from the sensor to the reflecting object and back to the sensor.
  • a corrected phase may correspond to a phase at a position of the radar sensor. Correcting the received phase in such a way allows, in some embodiments, comparing received phases of radar signals received at different sensor positions in the environment, e.g., at different distances from the reflecting object.
  • the circuitry may correct the received phase before generating the set of data tuples and before mapping the set of data tuples to the set of PC-SAR images.
  • the circuitry may correct the received phase before generating the set of candidate phase values.
  • the circuitry may generate the set of candidate phase values based on the corrected phase, and may perform the generating of the set of data tuples and the mapping of the set of data tuples to the set of PC-SAR images based on the candidate phase values that are based on the corrected phase.
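  • The distance-proportional correction of a received phase can be sketched as follows; the start frequency value and the sign convention of the correction are assumptions made for the example, not values taken from the disclosure.

```python
import math

C0 = 299_792_458.0  # speed of light in m/s


def corrected_phase(received_phase_rad, range_m, f_start_hz=76.5e9):
    """Correct a received phase by the distance-proportional term (sketch of eq. (3)).

    The phase is shifted by 2*pi*f_start*2r/c0, where the factor 2 accounts for
    the round trip from the sensor to the reflecting object and back, so that
    phases measured from different sensor positions become comparable.
    """
    phi_corr = 2.0 * math.pi * f_start_hz * 2.0 * range_m / C0
    return (received_phase_rad + phi_corr) % (2.0 * math.pi)
```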
  • Some embodiments pertain to a method for generating a SAR image, wherein the method includes obtaining, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; mapping the received power of the radar signal to a measurement probability; generating a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; mapping the set of data tuples to a set of PC-SAR images; and determining a target phase value based on the set of PC-SAR images.
  • the method may be configured as described above with respect to the circuitry. Thus, all features of the circuitry may correspond to features of the method.
  • the methods as described herein are also implemented in some embodiments as a computer program causing a computer and/or a processor to perform the method, when being carried out on the computer and/or processor.
  • a non-transitory computer-readable recording medium is provided that stores therein a computer program product, which, when executed by a processor, such as the processor described above, causes the methods described herein to be performed.
  • Fig. 1 shows a block diagram of a circuitry 1 for generating a SAR image according to an embodiment.
  • the circuitry 1 includes a processor 2, a storage unit 3 and a communication unit 4.
  • the processor 2 controls an operation of the circuitry 1 based on instructions stored in the storage unit 3.
  • the storage unit 3 stores the instructions for the processor 2 and further data for generating a SAR image, including radar measurement data, temporary data (including PC-SAR images) and the generated SAR image.
  • the communication unit 4 includes a universal serial bus (USB) interface, receives, via the USB interface, radar measurement data and outputs, via the USB interface, a SAR image generated based on the received radar measurement data.
  • USB universal serial bus
  • the circuitry 1 further includes a measurement data obtaining unit 5, a phase correcting unit 6, a power mapping unit 7, a data tuple generating unit 8, a data tuple mapping unit 9, an updating unit 10 and a target phase determining unit 11.
  • the measurement data obtaining unit 5 obtains, based on radar signals detected by a radar sensor, radar measurement data that indicate a received power of the radar signals, a received phase of the radar signals and a range of the radar signals.
  • the radar signals are detected by the radar sensor from different positions within an environment of the radar sensor.
  • the radar signals correspond to radar signals emitted by the radar sensor into the environment and are reflected by objects (targets) in the environment.
  • the radar sensor includes a chirped-sequence radar sensor that is capable of measuring a received power and a received phase of received radar signals.
  • the range of the radar signals is determined based on the received phase of the radar signals.
  • the measurement data obtaining unit 5 obtains the radar measurement data via the communication unit 4 and stores the radar measurement data in the storage unit 3.
  • the phase correcting unit 6 corrects the received phase of the radar signals based on a distance of the radar signals.
  • the correcting of the received phase is based on equation (3).
  • the phase correcting unit 6 obtains the radar measurement data from the storage unit 3 and stores the corrected phase in the storage unit 3.
  • the power mapping unit 7 maps the received power of the radar signals to a measurement probability.
  • the power mapping unit 7 receives the received power from the storage unit 3 and stores the measurement probability, to which the received power is mapped, in the storage unit 3.
  • the power mapping unit 7 includes a signal-to-noise ratio (SNR) determining unit 12 and a probability determining unit 13.
  • SNR signal-to-noise ratio
  • the SNR determining unit 12 determines a SNR of the radar signals based on a quotient of the received power and a predetermined noise level.
  • the probability determining unit 13 determines the measurement probability based on an exponential function, wherein the exponent of the exponential function includes a product of the SNR determined by the SNR determining unit 12 with a predetermined scaling factor.
  • the power mapping unit 7 outputs, to the storage unit 3, the measurement probability determined by the probability determining unit 13.
  • the data tuple generating unit 8 generates, for each radar signal of the radar measurement data, a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase.
  • the data tuple generating unit 8 receives the measurement probability and the received phase (represented by the corrected phase corrected by the phase correcting unit 6) from the storage unit 3 and stores the generated set of data tuples in the storage unit 3.
  • the distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein a mean and a mode of the probability distribution correspond to the received phase of the radar signal and have a probability value that corresponds to the received power of the radar signal.
  • the distribution of the measurement probability corresponds to a Gaussian distribution (normal distribution).
  • a mean and a mode of the distribution are located at the corrected phase of the corresponding radar signal (which is corrected by the phase correcting unit 6 and which represents the received phase of the radar signal), and a probability value of the distribution at the mean and at the mode corresponds to the measurement probability determined by the power mapping unit 7 based on the received power of the corresponding radar signal.
  • a standard deviation of the distribution corresponds to a phase uncertainty of the received phase.
  • the data tuple generating unit 8 evaluates a probability density function of the distribution of the measurement probability at the candidate phase values of the set of candidate phase values.
  • the candidate phase values of the set of candidate phase values are predetermined and are equally distributed in an interval from 0° to 360°, i.e., the set of candidate phase values covers one period.
  • the values obtained by evaluating the probability density function at the candidate phase values of the set of candidate phase values are the candidate measurement probability values of the set of candidate measurement probability values.
  • the data tuple generating unit 8 associates each candidate phase value of the set of candidate phase values with the corresponding candidate measurement probability value of the set of candidate measurement probability values, wherein each pair of a candidate phase value and a candidate measurement probability value associated with each other is a data tuple of the set of data tuples.
  • the data tuple generating unit 8 generates a set of data tuples for each radar signal indicated by the radar measurement data and stores each generated set of data tuples in the storage unit 3.
  • the data tuple mapping unit 9 maps the sets of data tuples generated by the data tuple generating unit 8 to a set of PC-SAR images.
  • the data tuple mapping unit 9 receives the sets of data tuples from the storage unit 3 and stores a generated mapping in the storage unit 3.
  • Each PC-SAR image of the set of PC-SAR images is associated with a candidate phase value of the set of candidate phase values.
  • the data tuple mapping unit 9 maps each data tuple generated by the data tuple generating unit 8 to the PC-SAR image of the set of PC-SAR images that is associated with the candidate phase value of the data tuple.
  • each PC-SAR image of the set of PC-SAR images includes a grid map, wherein each cell of the grid map is quadratic and corresponds to a region of one centimeter by one centimeter in the environment.
  • the grid maps of the PC-SAR images have a same number of rows of cells and have a same number of columns of cells, wherein cells at equal positions in the grid maps correspond to a same region in the environment for all grid maps.
  • the mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the respective radar signal, wherein the portions of the respective PC-SAR images correspond to cells of the grid maps of the PC-SAR images.
  • the data tuple mapping unit 9 determines which cells correspond to the radar signal based on the range of the radar signal and based on a position of the radar sensor in the environment at a time of receiving the radar signal. Then, the data tuple mapping unit 9 maps each respective data tuple to the determined cells, of the PC-SAR image associated with the data tuple, that correspond to the range of the radar signal around the position of the radar sensor in the environment at the time of receiving the radar signal.
  • the data tuple mapping unit 9 performs the mapping for each set of data tuples generated by the data tuple generating unit 8 for each radar signal indicated by the radar measurement data.
  • the updating unit 10 updates, based on the sets of data tuples generated for the radar signals, the candidate measurement probability values associated with the respective PC-SAR images for the radar signals.
  • the updating unit 10 receives the mapping generated by the data tuple mapping unit 9 from the storage unit 3 and stores updated candidate measurement probability values in the storage unit 3.
  • the updating of the candidate measurement probability values is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the respective radar signals. More concretely, the updating is based on equation (2) and is performed iteratively for each data tuple.
  • the target phase determining unit 11 determines a target phase value based on the set of PC-SAR images, wherein the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest updated candidate measurement probability value among the set of PC-SAR images.
  • the target phase determining unit 11 receives the set of PC-SAR images and the updated candidate measurement probability values determined by the updating unit 10 from the storage unit 3 and stores the determined target phase value in the storage unit 3.
  • the target phase determining unit 11 determines for each cell of the grid maps of the set of PC-SAR images a target phase value that indicates a candidate phase value associated with the PC-SAR image from which the candidate measurement probability value of the corresponding cell should be used for the SAR image to be generated by the circuitry 1.
  • the target phase determining unit 11 selects, as the target phase value, the candidate phase value associated with the PC-SAR image of the set of PC-SAR images in which a largest candidate measurement probability value is assigned to the corresponding cell.
  • the circuitry 1 further generates a SAR image that includes a grid map with a same number of columns of cells and with a same number of rows of cells as the grid maps of the set of PC-SAR images, and wherein the cells of the grid map of the SAR image correspond to a same respective region in the environment as the corresponding cells in the grid maps of the set of PC-SAR images.
  • the circuitry 1 assigns to each cell of the grid map of the SAR image a value that corresponds to the candidate measurement probability value that is assigned to the corresponding cell of the PC-SAR image associated with the target phase value determined by the target phase determining unit 11.
  • the circuitry 1 then outputs the generated SAR image via the communication unit 4.
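  • Put together, the final step performed by the circuitry 1 can be sketched as a per-cell selection over the stack of PC-SAR grids; the array layout with the phase axis first is an assumption made only for the example.

```python
import numpy as np


def sar_from_pc_sar(pc_sar_stack, candidate_phases):
    """Collapse a stack of PC-SAR probability grids into a SAR image (sketch).

    pc_sar_stack     -- array of shape (n_phases, rows, cols) holding the updated
                        candidate measurement probability values per cell
    candidate_phases -- array of the candidate phase values, one per PC-SAR image
    """
    target_idx = np.argmax(pc_sar_stack, axis=0)              # PC-SAR image with the highest value per cell
    target_phase = np.asarray(candidate_phases)[target_idx]   # target phase value per cell
    sar_image = np.max(pc_sar_stack, axis=0)                  # probability written into the SAR image
    return sar_image, target_phase
```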
  • Fig. 2 shows a flow diagram of a method 20 for generating a SAR image according to an embodiment. The method 20 is performed by the circuitry 1 of Fig. 1.
  • at S21, the method 20 obtains radar measurement data.
  • the obtaining of radar measurement data at S21 is performed by the measurement data obtaining unit 5 of Fig. 1.
  • the obtaining of measurement data at S21 obtains, based on radar signals detected by a radar sensor, radar measurement data that indicate a received power of the radar signals, a received phase of the radar signals and a range of the radar signals.
  • the radar signals are detected by the radar sensor from different positions within an environment of the radar sensor.
  • the radar signals correspond to radar signals emitted by the radar sensor into the environment and are reflected by objects (targets) in the environment.
  • the radar sensor includes a chirped-sequence radar sensor that is capable of measuring a received power and a received phase of received radar signals.
  • the range of the radar signals is determined based on the received phase of the radar signals.
  • the obtaining of measurement data obtains the radar measurement data via the communication unit 4 of Fig. 1 and stores the radar measurement data in the storage unit 3 of Fig. 1.
  • the method 20 corrects the received phase of the radar signals.
  • the correcting of the received phase at S22 is performed by the phase correcting unit 6 of Fig. 1.
  • the correcting of the received phase at S22 corrects the received phase of the radar signals based on a distance of the radar signals.
  • the correcting of the received phase is based on equation (3).
  • the correcting of the received phase obtains the radar measurement data from the storage unit 3 of Fig. 1 and stores the corrected phase in the storage unit 3 of Fig. 1.
  • the method 20 maps the received power of the radar signals to a measurement probability.
  • the mapping of the received power at S23 is performed by the power mapping unit 7 of Fig. 1.
  • the mapping of the received power receives the received power from the storage unit 3 of Fig. 1 and stores the measurement probability, to which the received power is mapped, in the storage unit 3 of Fig. 1.
  • the mapping of the received power at S23 includes determining a signal-to-noise ratio (SNR) at S24 and determining a measurement probability at S25.
  • the determining of the SNR at S24 is performed by the SNR determining unit 12 of Fig. 1 and determines a SNR of the radar signals based on a quotient of the received power and a predetermined noise level.
  • the determining of the measurement probability at S25 is performed by the probability determining unit 13 of Fig. 1 and determines the measurement probability based on an exponential function, wherein the exponent of the exponential function includes a product of the SNR determined at S24 with a predetermined scaling factor.
  • the mapping of the received power at S23 outputs, to the storage unit 3 of Fig. 1, the measurement probability determined at S25.
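  • As an illustration of the mapping at S23 to S25 only, the short sketch below assumes a bounded-growth exponential mapping of the form 1 − e^(−λ·SNR); the function name and the value of the scaling factor λ are placeholders of this example and are not part of the disclosure.

```python
import numpy as np

def power_to_probability(received_power, noise_level, scaling_factor=0.05):
    """Illustrative sketch only (assumed form, not the claimed implementation).

    S24: signal-to-noise ratio as the quotient of the received power and a
         predetermined noise level.
    S25: measurement probability from an exponential function whose exponent is
         the product of the SNR and a predetermined scaling factor; the
         bounded-growth form keeps the result in the interval [0, 1).
    """
    snr = received_power / noise_level
    return 1.0 - np.exp(-scaling_factor * snr)
```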
  • the method 20 generates a set of data tuples.
  • the generating of the set of data tuples at S26 is performed by the data tuple generating unit 8 of Fig. 1 and generates, for each radar signal of the radar measurement data, a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase.
  • the generating of the set of data tuples at S26 receives the measurement probability and the received phase (represented by the corrected phase corrected at S22) from the storage unit 3 of Fig. 1 and stores the generated set of data tuples in the storage unit 3 of Fig. 1.
  • the distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein a mean and a mode of the probability distribution correspond to the received phase of the radar signal, and the probability value at the mean and the mode corresponds to the received power of the radar signal.
  • the distribution of the measurement probability corresponds to a Gaussian distribution (normal distribution).
  • a mean and a mode of the distribution are located at the corrected phase of the corresponding radar signal (which is corrected at S22 and which represents the received phase of the radar signal), and a probability value of the distribution at the mean and at the mode corresponds to the measurement probability determined at S23 based on the received power of the corresponding radar signal.
  • a standard deviation of the distribution corresponds to a phase uncertainty of the received phase.
  • the generating of the set of data tuples at S26 evaluates a probability density function of the distribution of the measurement probability at the candidate phase values of the set of candidate phase values.
  • the candidate phase values of the set of candidate phase values are predetermined and are equally distributed in an interval from 0° to 360°, i.e., the set of candidate phase values covers one period.
  • the values obtained by evaluating the probability density function at the candidate phase values of the set of candidate phase values are the candidate measurement probability values of the set of candidate measurement probability values.
  • the generating of the set of data tuples at S26 associates each candidate phase value of the set of candidate phase values with the corresponding candidate measurement probability value of the set of candidate measurement probability values, wherein each pair of a candidate phase value and a candidate measurement probability value associated with each other is a data tuple of the set of data tuples.
  • the generating of the set of data tuples at S26 generates a set of data tuples for each radar signal indicated by the radar measurement data and stores each generated set of data tuples in the storage unit 3 of Fig. 1.
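  • Purely as an illustration of S26, the sketch below evaluates a Gaussian centred at the corrected phase at 18 equally spaced candidate phase values and pairs each candidate phase with the resulting candidate measurement probability; the 20° spacing matches the example of Fig. 3, whereas the wrap-around handling, the default standard deviation and the function name are assumptions of this sketch.

```python
import numpy as np

def generate_data_tuples(corrected_phase_deg, measurement_prob,
                         phase_sigma_deg=20.0, n_candidates=18):
    """Illustrative sketch of S26 (assumed details, not the claimed implementation).

    Returns one (candidate phase, candidate measurement probability) tuple per
    candidate phase value. The candidate probabilities follow a Gaussian whose
    mean/mode lies at the corrected received phase, whose peak equals the
    measurement probability and whose standard deviation models the phase
    uncertainty of the radar sensor.
    """
    candidate_phases = np.arange(n_candidates) * (360.0 / n_candidates)
    # wrap the angular difference into (-180°, 180°] before evaluating the Gaussian
    delta = (candidate_phases - corrected_phase_deg + 180.0) % 360.0 - 180.0
    candidate_probs = measurement_prob * np.exp(-0.5 * (delta / phase_sigma_deg) ** 2)
    return list(zip(candidate_phases.tolist(), candidate_probs.tolist()))
```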
  • the method 20 maps the sets of data tuples generated at S26 to a set of PC-SAR images.
  • the mapping of the set of data tuples at S27 is performed by the data tuple mapping unit 9.
  • the mapping of the sets of data tuples at S27 receives the sets of data tuples from the storage unit 3 of Fig. 1 and stores a generated mapping in the storage unit 3 of Fig. 1.
  • Each PC-SAR image of the set of PC-SAR images is associated with a candidate phase value of the set of candidate phase values.
  • the mapping of the sets of data tuples at S27 maps each data tuple generated at S26 to the PC-SAR image of the set of PC-SAR images that is associated with the candidate phase value of the data tuple.
  • each PC-SAR image of the set of PC-SAR images includes a grid map, wherein each cell of the grid map is square and corresponds to a region of one centimeter by one centimeter in the environment.
  • the grid maps of the PC-SAR images have a same number of rows of cells and have a same number of columns of cells, wherein cells at equal positions in the grid maps correspond to a same region in the environment for all grid maps.
  • the mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the respective radar signal, wherein the portions of the respective PC-SAR images correspond to cells of the grid maps of the PC-SAR images.
  • the mapping of the sets of data tuples at S27 determines which cells correspond to the radar signal based on the range of the radar signal and based on a position of the radar sensor in the environment at a time of receiving the radar signal. Then, the mapping of the sets of data tuples at S27 maps each respective data tuple to the determined cells, of the PC-SAR image associated with the data tuple, that correspond to the range of the radar signal around the position of the radar sensor in the environment at the time of receiving the radar signal.
  • the mapping of the sets of data tuples at S27 performs the mapping for each set of data tuples generated at S26 for each radar signal indicated by the radar measurement data.
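  • As an illustration of the cell selection in S27 (a sketch under assumed coordinate conventions, tolerance and naming), the snippet below marks the grid cells whose centres lie approximately at the measured range around the sensor position; the set of data tuples of the radar signal would then be written into these cells of the respective PC-SAR images.

```python
import numpy as np

def cells_at_range(sensor_xy, signal_range, grid_shape, cell_size=0.01):
    """Illustrative sketch of the cell selection in S27 (assumed conventions).

    Returns the (row, col) indices of all cells of a grid map whose centre lies
    within half a cell of 'signal_range' around the sensor position, i.e. the
    arc of cells that corresponds to one radar signal. Cells are 1 cm x 1 cm
    by default, matching the grid maps of the PC-SAR images.
    """
    rows, cols = np.indices(grid_shape)
    centre_x = (cols + 0.5) * cell_size
    centre_y = (rows + 0.5) * cell_size
    dist = np.hypot(centre_x - sensor_xy[0], centre_y - sensor_xy[1])
    return np.argwhere(np.abs(dist - signal_range) <= cell_size / 2.0)
```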
  • the method 20 updates candidate measurement probability values associated with PC-SAR images.
  • the updating at S28 is performed by the updating unit 10 and updates, based on the sets of data tuples generated for the radar signals, the candidate measurement probability values associated with the respective PC-SAR images for the radar signals.
  • the updating at S28 receives the mapping generated at S27 from the storage unit 3 of Fig. 1 and stores updated candidate measurement probability values in the storage unit 3 of Fig. 1.
  • the updating of the candidate measurement probability values is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the respective radar signals. More concretely, the updating is based on equation (2) and is performed iteratively for each data tuple.
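  • Equation (2) is not reproduced in this excerpt. As an assumed illustration only, a standard occupancy-grid (binary Bayes) update that is consistent with the described product of two factors is
$$p_{\mathrm{cell}}^{\mathrm{new}} = \frac{o}{1+o}, \qquad o = \frac{p_{\mathrm{cell}}}{1-p_{\mathrm{cell}}}\cdot\frac{p_{\mathrm{meas}}}{1-p_{\mathrm{meas}}},$$
where p_cell denotes the candidate measurement probability currently stored in the corresponding cell of the PC-SAR image (first factor) and p_meas denotes the candidate measurement probability of the respective data tuple (second factor).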
  • the method 20 determines a target phase value.
  • the determining of the target phase value at S29 is performed by the target phase determining unit 11 of Fig. 1 and determines a target phase value based on the set of PC-SAR images, wherein the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest updated candidate measurement probability value among the set of PC-SAR images.
  • the determining of the target phase value at S29 receives the set of PC-SAR images and the updated candidate measurement probability values determined at S28 from the storage unit 3 of Fig. 1 and stores the determined target phase value in the storage unit 3 of Fig. 1.
  • the determining of the target phase value at S29 determines for each cell of the grid maps of the set of PC-SAR images a target phase value that indicates a candidate phase value associated with the PC-SAR image from which the candidate measurement probability value of the corresponding cell should be used for the SAR image to be generated by the method 20.
  • the determining of the target phase value at S29 selects, as the target phase value, the candidate phase value associated with the PC-SAR image of the set of PC-SAR images in which a largest candidate measurement probability value is assigned to the corresponding cell.
  • the method 20 further generates a SAR image that includes a grid map with a same number of columns of cells and with a same number of rows of cells as the grid maps of the set of PC-SAR images, and wherein the cells of the grid map of the SAR image correspond to a same respective region in the environment as the corresponding cells in the grid maps of the set of PC-SAR images.
  • the method 20 assigns to each cell of the grid map of the SAR image a value that corresponds to the candidate measurement probability value that is assigned to the corresponding cell of the PC-SAR image associated with the target phase value determined at S29.
  • the method 20 then outputs the generated SAR image via the communication unit 4 of Fig. 1.
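  • As a compact illustration of S29 together with the final image generation (a sketch with assumed array shapes and names), the maximum search over the stack of PC-SAR images can be written as follows.

```python
import numpy as np

def fuse_pc_sar_images(pc_sar_stack, candidate_phases_deg):
    """Illustrative sketch of S29 and the SAR-image generation (assumed shapes).

    pc_sar_stack        : array of shape (n_phases, rows, cols) holding the updated
                          candidate measurement probability values of all PC-SAR images.
    candidate_phases_deg: candidate phase value associated with each PC-SAR image.

    Per cell, the largest candidate measurement probability over the stack becomes
    the value of the SAR image, and the candidate phase of the winning PC-SAR image
    is the target phase value of that cell.
    """
    winner = np.argmax(pc_sar_stack, axis=0)  # index of the winning PC-SAR image per cell
    sar_image = np.take_along_axis(pc_sar_stack, winner[None], axis=0)[0]
    target_phase = np.asarray(candidate_phases_deg)[winner]
    return sar_image, target_phase
```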
  • the present disclosure enables a processing of a synthetic aperture radar (SAR) using probabilities.
  • in conventional SAR processing, complex numbers (e.g., with an absolute value that corresponds to a received power of a radar signal and with an argument that corresponds to a received phase of the radar signal) are summed up.
  • probabilistic assumptions are made based on the radar measurement data and models are created which, in some embodiments, lead to a more robust SAR image of the environment.
  • the resulting image is no longer power-dependent but probability-dependent. This may allow a direct state description of each cell of a grid map of a SAR image.
  • Some embodiments differ from a conventional SAR algorithm in the range evaluation and in the actual SAR processing, which is based on an ordinary backprojection algorithm.
  • a one-dimensional Fast Fourier Transform (FFT) is calculated to transform time samples acquired by SAR into the frequency domain for determining range data. Since this is amplitude-dependent, it is calculated on the basis of the noise level according to equation (4):
  • FFT_Range,prob represents the Fourier transform of the radar signal after mapping the received power of the radar signal to a measurement probability, and ∠FFT_Range,prob represents the complex argument of the Fourier transform of the radar signal and corresponds to the received phase of the radar signal.
  • the measured angles remain unchanged, but the complex pointers have only a length in the interval between 0 and 1. λ marks a factor with which a limited growth of the measurement probability can be determined. Since this step does not depend significantly on the noise level, but is only oriented on it, the noise level is determined sufficiently exactly by a median calculation, i.e., the noise level corresponds to a median of the received power of the radar signals received from the environment.
  • the signal-to- noise ratio SNR of the radar signal is determined as a quotient of the received power of the radar signal and the noise level.
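  • Equation (4) is not legible in this excerpt. One form that is consistent with the description (magnitudes limited to the interval between 0 and 1, unchanged angles, limited growth controlled by λ, SNR as the quotient of the received power and the median-based noise level) would be, as an assumption for illustration,
$$\mathrm{FFT}_{\mathrm{Range,prob}} = \bigl(1 - e^{-\lambda\,\mathrm{SNR}}\bigr)\, e^{\,j\,\angle \mathrm{FFT}_{\mathrm{Range}}},$$
where FFT_Range denotes the conventional range FFT of the radar signal.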
  • a backprojection is applied based on the probabilistic range FFT FFT_Range,prob. Since a backprojection does not exist for probabilistic complex pointers, a corresponding model is used, which is presented in the following.
  • a phase correction is applied to a measured complex pointer A_r according to equation (5).
  • the phase correction is based on multiplying by the phase correction factor of equation (3). Therefore, for a description of the phase correction factor, reference is made to the description regarding equation (3) above; an assumed form of the resulting correction is sketched below.
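  • Equation (5) is likewise not reproduced here. Based on the phase correction factor used in the backprojection of equation (1), the phase-corrected pointer can be assumed, for illustration, to have the form
$$A'_r = A_r \cdot e^{\,j\,4\pi f_{\mathrm{start}}\, r_{m_i}(t_k)/c_0},$$
where r_{m_i}(t_k) is the distance between the considered cell m_i and the sensor position at the time stamp t_k, f_start is the start frequency of the ramp, and c_0 is the speed of light.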
  • the phase-corrected complex pointer A'_r includes, like the measured complex pointer A_r, an absolute value that corresponds to the received power of the corresponding radar signal and an argument that corresponds to the received phase of the corresponding radar signal, wherein the value of the received phase indicated by A'_r is corrected with respect to the value indicated by A_r.
  • since A'_r corresponds to a complex number, it is divided into the two components |A'_r| and ∠A'_r, wherein the absolute value |A'_r| corresponds to the received power of the corresponding radar signal and is represented by the measurement probability determined in the first step.
  • the expected value μ corresponds to the (corrected) received phase of the radar signal, and thus μ = ∠A'_r applies.
  • the standard deviation σ corresponds to a phase uncertainty of the radar sensor and is chosen accordingly.
  • different candidate phase values generated based on the (corrected) received phase ∠A'_r of the radar signal have different probabilities, as indicated in Fig. 3.
  • Fig. 3 shows a diagram 30 of a probability distribution of a received phase according to an embodiment.
  • the diagram 30 indicates, in a horizontal direction, one period of a received phase of a radar signal and, in a vertical direction, a probability of a corresponding phase value.
  • the diagram 30 shows a probability density function 31 of a Gaussian distribution.
  • the maximum value of the probability density function 31 corresponds to the magnitude of FFT_Range,prob, i.e., to |FFT_Range,prob|, which means that stronger targets have a higher probability.
  • the probability distribution of Fig. 3 has a standard deviation σ that corresponds to a phase uncertainty of the radar sensor.
  • the probability density function 31 is evaluated at candidate phase values of a set of candidate phase values.
  • the set of candidate phase values includes 18 candidate phase values that are evenly distributed over one period, such that the candidate phase values are 20° apart from each other (without limiting the disclosure to these values).
  • the set of candidate measurement probability values includes the values obtained by evaluating the probability density function 31 at the candidate phase values.
  • a set of data tuples is generated, wherein the data tuples of the set of data tuples each associate a respective candidate phase value of the set of candidate phase values with a candidate measurement probability value of the set of candidate measurement probability values obtained by evaluating the probability density function 31 at the respective candidate phase value.
  • a data tuple of the set of data tuples, indicated by an arrow 33, associates a candidate phase value of 20° with a candidate measurement probability value obtained by evaluating the probability density function 31 at 20°.
  • a data tuple of the set of data tuples, indicated by an arrow 34, associates a candidate phase value of 40° with a candidate measurement probability value obtained by evaluating the probability density function 31 at 40°. Likewise, a data tuple indicated by an arrow 35 associates a candidate phase value of 60° with a candidate measurement probability value obtained by evaluating the probability density function 31 at 60°.
  • the arrows 33, 34 and 35 are only shown for illustrative purposes; as mentioned, the set of data tuples includes a data tuple for each candidate phase value of the set of candidate phase values.
  • the set of data tuples is mapped to a set of PC-SAR images.
  • Each PC-SAR image of the set of PC-SAR images includes a grid map, and the set of data tuples generated for a radar signal is mapped to cells of the grid maps of the PC-SAR images that correspond to a region in an environment from which the radar signal is received.
  • a candidate measurement probability value of the respective cells of the grid maps is updated based on an update formula for occupancy grid maps (OGM) according to equation (2). Since the update formula for OGM is based on scalars, accordingly only scalar update steps can be performed.
  • Each PC-SAR image of the set of PC-SAR images is associated with a respective candidate phase value of the set of candidate phase values and represents a certain phase value of a resulting image (i.e., of a SAR image to be generated).
  • the set of data tuples is mapped to the set of PC-SAR images such that the candidate measurement probability value of each data tuple is assigned to a cell corresponding to the radar signal (i.e., corresponding to a region in the environment in which an object has reflected the radar signal) in a PC-SAR image that is associated with the candidate phase value of the respective data tuple.
  • Fig. 4 illustrates a mapping of the set of data tuples to the set of PC-SAR images according to an embodiment. Fig. 4 shows how the three data tuples indicated by the arrows 33, 34 and 35 of Fig. 3 are mapped to the corresponding PC-SAR images.
  • a PC-SAR image 41 of the set of PC-SAR images is associated with a candidate phase value of 20°.
  • the data tuple indicated by the arrow 33 of Fig. 3 which also corresponds to a candidate phase value of 20°, is mapped to a cell 42, of the PC-SAR image 41, that corresponds to the radar signal. This is illustrated by an arrow 43 in the cell 42.
  • the PC-SAR image 44 of the set of PC-SAR images is associated with a candidate phase value of 40°, and the data tuple indicated by the arrow 34 of Fig. 3, which also corresponds to a candidate phase value of 40°, is mapped to a cell 45, of the PC-SAR image 44, that corresponds to the radar signal. This is illustrated by an arrow 46 in the cell 45.
  • the PC-SAR image 47 of the set of PC-SAR images is associated with a candidate phase value of 60°, and the data tuple indicated by the arrow 35 of Fig. 3, which also corresponds to a candidate phase value of 60°, is mapped to a cell 48, of the PC-SAR image 47, that corresponds to the radar signal. This is illustrated by an arrow 49 in the cell 48.
  • the candidate measurement probability values associated with the respective cells 42, 45 and 48 are then updated in a probabilistic updating based on the update formula according to equation (2) for the candidate measurement probability values of the corresponding data tuples of the set of data tuples.
  • the probabilistic updating in the PC-SAR images of the set of PC-SAR images makes it possible to compensate for the different candidate phase values, which cannot be directly accounted for by the update formula according to equation (2).
  • the first step and the second step are repeated for further radar signals that are received from different positions within the environment, wherein candidate measurement probability values of the same set of PC-SAR images are updated for all radar signals.
  • a final result is determined using a maximum search over all PC-SAR images of the set of PC-SAR images, generating a probabilistic SAR image of the environment.
  • the probabilistic SAR image is generated with a grid map. For each cell of the grid map of the probabilistic SAR image, a largest candidate measurement probability value is selected from the cells of the grid maps of the set of PC-SAR images that correspond to a same region in the environment as the cell in the probabilistic SAR image, and the selected candidate measurement probability value is assigned to the cell in the probabilistic SAR image.
  • the first step and the second step may, for example, be performed by the circuitry 1 of Fig. 1 and/or by the method 20 of Fig. 2.
  • the present technology enables robust and nearly power-independent SAR processing.
  • the resulting map is no longer power dependent but based on probabilities.
  • strong and weak targets that lead to constructive overlap of the complex pointers may be represented as “1”, and targets whose overlap is destructive may be represented as “0”. This may allow an environment mapping of all targets and not only of such targets which have a high radar cross-section (RCS).
  • the present technology enables probabilistic SAR processing based on radar raw data.
  • conventionally, the SAR processing is based on an amplitude-based summation of complex numbers.
  • in such approaches, the resulting environment representation is strongly amplitude-dependent, wherein weak targets can no longer be identified due to strong targets and the dynamics of the map.
  • each cell has an occupancy probability which can be determined as a function of the phases and amplitudes. In some embodiments, this results in a map in which weak and strong targets have similar occupancy probabilities as long as the corrected phase leads to a constructive superposition of all complex pointers. In some embodiments, this leads to a significant robustness gain due to the amplitude independence.
  • the algorithm may be used for any SAR processing and is not limited, e.g., to a certain hardware such as a certain sensor configuration.
  • the technology according to an embodiment of the present disclosure is applicable to various products.
  • the technology according to an embodiment of the present disclosure may be implemented as a device included in a mobile body that is any of kinds of automobiles, electric vehicles, hybrid electric vehicles, motorcycles, bicycles, personal mobility vehicles, airplanes, drones, ships, robots, construction machinery, agricultural machinery (tractors), and the like.
  • Fig. 5 is a block diagram depicting an example of schematic configuration of a vehicle control system 7000 as an example of a mobile body control system to which the technology according to an embodiment of the present disclosure can be applied.
  • the vehicle control system 7000 includes a plurality of electronic control units connected to each other via a communication network 7010.
  • the vehicle control system 7000 includes a driving system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside-vehicle information detecting unit 7400, an in-vehicle information detecting unit 7500, and an integrated control unit 7600.
  • the communication network 7010 connecting the plurality of control units to each other may, for example, be a vehicle-mounted communication network compliant with an arbitrary standard such as controller area network (CAN), local interconnect network (LIN), local area network (LAN), FlexRay (registered trademark), or the like.
  • Each of the control units includes: a microcomputer that performs arithmetic processing according to various kinds of programs; a storage section that stores the programs executed by the microcomputer, parameters used for various kinds of operations, or the like; and a driving circuit that drives various kinds of control target devices.
  • Each of the control units further includes: a network interface (I/F) for performing communication with other control units via the communication network 7010; and a communication I/F for performing communication with a device, a sensor, or the like within and without the vehicle by wire communication or radio communication.
  • in Fig. 5, the integrated control unit 7600 includes a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning section 7640, a beacon receiving section 7650, an in-vehicle device I/F 7660, a sound/image output section 7670, a vehicle-mounted network I/F 7680, and a storage section 7690.
  • the other control units similarly include a microcomputer, a communication I/F, a storage section, and the like.
  • the driving system control unit 7100 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs.
  • the driving system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the driving system control unit 7100 may have a function as a control device of an antilock brake system (ABS), electronic stability control (ESC), or the like.
  • the driving system control unit 7100 is connected with a vehicle state detecting section 7110.
  • the vehicle state detecting section 7110 includes at least one of a gyro sensor that detects the angular velocity of axial rotational movement of a vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting an amount of operation of an accelerator pedal, an amount of operation of a brake pedal, the steering angle of a steering wheel, an engine speed or the rotational speed of wheels, and the like.
  • the driving system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detecting section 7110, and controls the internal combustion engine, the driving motor, an electric power steering device, the brake device, and the like.
  • the body system control unit 7200 controls the operation of various kinds of devices provided to the vehicle body in accordance with various kinds of programs.
  • the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various kinds of lamps such as a headlamp, a backup lamp, a brake lamp, a turn signal, a fog lamp, or the like.
  • radio waves transmitted from a mobile device as an alternative to a key or signals of various kinds of switches can be input to the body system control unit 7200.
  • the body system control unit 7200 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like of the vehicle.
  • the battery control unit 7300 controls a secondary battery 7310, which is a power supply source for the driving motor, in accordance with various kinds of programs.
  • the battery control unit 7300 is supplied with information about a battery temperature, a battery output voltage, an amount of charge remaining in the battery, or the like from a battery device including the secondary battery 7310.
  • the battery control unit 7300 performs arithmetic processing using these signals, and performs control for regulating the temperature of the secondary battery 7310 or controls a cooling device provided to the battery device or the like.
  • the outside-vehicle information detecting unit 7400 detects information about the outside of the vehicle including the vehicle control system 7000.
  • the outside-vehicle information detecting unit 7400 is connected with at least one of an imaging section 7410 and an outside-vehicle information detecting section 7420.
  • the imaging section 7410 includes at least one of a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • the outside-vehicle information detecting section 7420 includes at least one of an environmental sensor for detecting current atmospheric conditions or weather conditions and a peripheral information detecting sensor for detecting another vehicle, an obstacle, a pedestrian, or the like on the periphery of the vehicle including the vehicle control system 7000.
  • the environmental sensor may be at least one of a rain drop sensor detecting rain, a fog sensor detecting a fog, a sunshine sensor detecting a degree of sunshine, and a snow sensor detecting a snowfall.
  • the peripheral information detecting sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR device (Light detection and Ranging device, or Laser imaging detection and ranging device).
  • Each of the imaging section 7410 and the outside-vehicle information detecting section 7420 may be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • Imaging sections 7910, 7912, 7914, 7916, and 7918 are, for example, disposed at at least one of positions on a front nose, sideview mirrors, a rear bumper, and a back door of the vehicle 7900 and a position on an upper portion of a windshield within the interior of the vehicle.
  • the imaging section 7910 provided to the front nose and the imaging section 7918 provided to the upper portion of the windshield within the interior of the vehicle obtain mainly an image of the front of the vehicle 7900.
  • the imaging sections 7912 and 7914 provided to the sideview mirrors obtain mainly an image of the sides of the vehicle 7900.
  • the imaging section 7916 provided to the rear bumper or the back door obtains mainly an image of the rear of the vehicle 7900.
  • the imaging section 7918 provided to the upper portion of the windshield within the interior of the vehicle is used mainly to detect a preceding vehicle, a pedestrian, an obstacle, a signal, a traffic sign, a lane, or the like.
  • Fig. 6 depicts an example of photographing ranges of the respective imaging sections 7910, 7912, 7914, and 7916.
  • An imaging range a represents the imaging range of the imaging section 7910 provided to the front nose.
  • Imaging ranges b and c respectively represent the imaging ranges of the imaging sections 7912 and 7914 provided to the sideview mirrors.
  • An imaging range d represents the imaging range of the imaging section 7916 provided to the rear bumper or the back door.
  • a bird’s-eye image of the vehicle 7900 as viewed from above can be obtained by superimposing image data imaged by the imaging sections 7910, 7912, 7914, and 7916, for example.
  • Outside-vehicle information detecting sections 7920, 7922, 7924, 7926, 7928, and 7930 provided to the front, rear, sides, and corners of the vehicle 7900 and the upper portion of the windshield within the interior of the vehicle may be, for example, an ultrasonic sensor or a radar device.
  • the outside-vehicle information detecting sections 7920, 7926, and 7930 provided to the front nose of the vehicle 7900, the rear bumper, the back door of the vehicle 7900, and the upper portion of the windshield within the interior of the vehicle may be a LIDAR device, for example.
  • These outside- vehicle information detecting sections 7920 to 7930 are used mainly to detect a preceding vehicle, a pedestrian, an obstacle, or the like.
  • the outside-vehicle information detecting unit 7400 causes the imaging section 7410 to capture an image of the outside of the vehicle, and receives the captured image data.
  • the outside-vehicle information detecting unit 7400 receives detection information from the outside-vehicle information detecting section 7420 connected to the outside-vehicle information detecting unit 7400.
  • in a case where the outside-vehicle information detecting section 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the outside-vehicle information detecting unit 7400 transmits an ultrasonic wave, an electromagnetic wave, or the like, and receives information of a received reflected wave.
  • the outside-vehicle information detecting unit 7400 may perform processing of detecting an object such as a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto.
  • the outside-vehicle information detecting unit 7400 may perform environment recognition processing of recognizing a rainfall, a fog, road surface conditions, or the like on the basis of the received information.
  • the outside-vehicle information detecting unit 7400 may calculate a distance to an object outside the vehicle on the basis of the received information.
  • the outside-vehicle information detecting unit 7400 may perform image recognition processing of recognizing a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto.
  • the outside-vehicle information detecting unit 7400 may subject the received image data to processing such as distortion correction, alignment, or the like, and combine the image data imaged by a plurality of different imaging sections 7410 to generate a bird’s-eye image or a panoramic image.
  • the outside-vehicle information detecting unit 7400 may perform viewpoint conversion processing using the image data imaged by the imaging section 7410 including the different imaging parts.
  • the in-vehicle information detecting unit 7500 detects information about the inside of the vehicle.
  • the in-vehicle information detecting unit 7500 is, for example, connected with a driver state detecting section 7510 that detects the state of a driver.
  • the driver state detecting section 7510 may include a camera that images the driver, a biosensor that detects biological information of the driver, a microphone that collects sound within the interior of the vehicle, or the like.
  • the biosensor is, for example, disposed in a seat surface, the steering wheel, or the like, and detects biological information of an occupant sitting in a seat or the driver holding the steering wheel.
  • the in-vehicle information detecting unit 7500 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing.
  • the in-vehicle information detecting unit 7500 may subject an audio signal obtained by the collection of the sound to processing such as noise canceling processing or the like.
  • the integrated control unit 7600 controls general operation within the vehicle control system 7000 in accordance with various kinds of programs.
  • the integrated control unit 7600 is connected with an input section 7800.
  • the input section 7800 is implemented by a device capable of input operation by an occupant, such, for example, as a touch panel, a button, a microphone, a switch, a lever, or the like.
  • the integrated control unit 7600 may be supplied with data obtained by voice recognition of voice input through the microphone.
  • the input section 7800 may, for example, be a remote control device using infrared rays or other radio waves, or an external connecting device such as a mobile telephone, a personal digital assistant (PDA), or the like that supports operation of the vehicle control system 7000.
  • the input section 7800 may be, for example, a camera.
  • an occupant can input information by gesture.
  • data may be input which is obtained by detecting the movement of a wearable device that an occupant wears.
  • the input section 7800 may, for example, include an input control circuit or the like that generates an input signal on the basis of information input by an occupant or the like using the above-described input section 7800, and which outputs the generated input signal to the integrated control unit 7600.
  • An occupant or the like inputs various kinds of data or gives an instruction for processing operation to the vehicle control system 7000 by operating the input section 7800.
  • the storage section 7690 may include a read only memory (ROM) that stores various kinds of programs executed by the microcomputer and a random access memory (RAM) that stores various kinds of parameters, operation results, sensor values, or the like.
  • the storage section 7690 may be implemented by a magnetic storage device such as a hard disc drive (HDD) or the like, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the general-purpose communication I/F 7620 is a communication I/F used widely, which communication I/F mediates communication with various apparatuses present in an external environment 7750.
  • the general-purpose communication I/F 7620 may implement a cellular communication protocol such as global system for mobile communications (GSM (registered trademark)), worldwide interoperability for microwave access (WiMAX (registered trademark)), long term evolution (LTE (registered trademark)), LTE-advanced (LTE-A), or the like, or another wireless communication protocol such as wireless LAN (referred to also as wireless fidelity (Wi-Fi (registered trademark))), Bluetooth (registered trademark), or the like.
  • the general-purpose communication I/F 7620 may, for example, connect to an apparatus (for example, an application server or a control server) present on an external network (for example, the Internet, a cloud network, or a company-specific network) via a base station or an access point.
  • the general-purpose communication I/F 7620 may connect to a terminal present in the vicinity of the vehicle (which terminal is, for example, a terminal of the driver, a pedestrian, or a store, or a machine type communication (MTC) terminal) using a peer to peer (P2P) technology, for example.
  • the dedicated communication I/F 7630 is a communication I/F that supports a communication protocol developed for use in vehicles.
  • the dedicated communication I/F 7630 may implement a standard protocol such, for example, as wireless access in vehicle environment (WAVE), which is a combination of institute of electrical and electronic engineers (IEEE) 802.11p as a lower layer and IEEE 1609 as a higher layer, dedicated short range communications (DSRC), or a cellular communication protocol.
  • the dedicated communication I/F 7630 typically carries out V2X communication as a concept including one or more of communication between a vehicle and a vehicle (Vehicle to Vehicle), communication between a road and a vehicle (Vehicle to Infrastructure), communication between a vehicle and a home (Vehicle to Home), and communication between a pedestrian and a vehicle (Vehicle to Pedestrian).
  • the positioning section 7640 performs positioning by receiving a global navigation satellite system (GNSS) signal from a GNSS satellite (for example, a GPS signal from a global positioning system (GPS) satellite), and generates positional information including the latitude, longitude, and altitude of the vehicle.
  • the positioning section 7640 may identify a current position by exchanging signals with a wireless access point, or may obtain the positional information from a terminal such as a mobile telephone, a personal handyphone system (PHS), or a smart phone that has a positioning function.
  • the beacon receiving section 7650 receives a radio wave or an electromagnetic wave transmitted from a radio station installed on a road or the like, and thereby obtains information about the current position, congestion, a closed road, a necessary time, or the like.
  • the function of the beacon receiving section 7650 may be included in the dedicated communication I/F 7630 described above.
  • the in-vehicle device I/F 7660 is a communication interface that mediates connection between the microcomputer 7610 and various in-vehicle devices 7760 present within the vehicle.
  • the in-vehicle device I/F 7660 may establish wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), near field communication (NFC), or wireless universal serial bus (WUSB).
  • the in-vehicle device I/F 7660 may establish wired connection by universal serial bus (USB), high-definition multimedia interface (HDMI (registered trademark)), mobile high-definition link (MHL), or the like via a connection terminal (and a cable if necessary) not depicted in the figures.
  • the in-vehicle devices 7760 may, for example, include at least one of a mobile device and a wearable device possessed by an occupant and an information device carried into or attached to the vehicle.
  • the in-vehicle devices 7760 may also include a navigation device that searches for a path to an arbitrary destination.
  • the in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the vehicle-mounted network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the vehicle-mounted network I/F 7680 transmits and receives signals or the like in conformity with a predetermined protocol supported by the communication network 7010.
  • the microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various kinds of programs on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680.
  • the microcomputer 7610 may calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the obtained information about the inside and outside of the vehicle, and output a control command to the driving system control unit 7100.
  • the microcomputer 7610 may perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS) which functions include collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision of the vehicle, a warning of deviation of the vehicle from a lane, or the like.
  • the microcomputer 7610 may perform cooperative control intended for automatic driving, which makes the vehicle travel autonomously without depending on the operation of the driver, or the like, by controlling the driving force generating device, the steering mechanism, the braking device, or the like on the basis of the obtained information about the surroundings of the vehicle.
  • the microcomputer 7610 may generate three-dimensional distance information between the vehicle and an object such as a surrounding structure, a person, or the like, and generate local map information including information about the surroundings of the current position of the vehicle, on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680.
  • the microcomputer 7610 may predict danger such as collision of the vehicle, approaching of a pedestrian or the like, an entry to a closed road, or the like on the basis of the obtained information, and generate a warning signal.
  • the warning signal may, for example, be a signal for producing a warning sound or lighting a warning lamp.
  • the sound/image output section 7670 transmits an output signal of at least one of a sound and an image to an output device capable of visually or auditorily notifying information to an occupant of the vehicle or the outside of the vehicle.
  • an audio speaker 7710, a display section 7720, and an instrument panel 7730 are illustrated as the output device.
  • the display section 7720 may, for example, include at least one of an on-board display and a head-up display.
  • the display section 7720 may have an augmented reality (AR) display function.
  • the output device may be other than these devices, and may be another device such as headphones, a wearable device such as an eyeglass type display worn by an occupant or the like, a projector, a lamp, or the like.
  • in a case where the output device is a display device, the display device visually displays results obtained by various kinds of processing performed by the microcomputer 7610 or information received from another control unit in various forms such as text, an image, a table, a graph, or the like.
  • the audio output device converts an audio signal constituted of reproduced audio data or sound data or the like into an analog signal, and auditorily outputs the analog signal.
  • At least two control units connected to each other via the communication network 7010 in the example depicted in Fig. 5 may be integrated into one control unit.
  • each individual control unit may include a plurality of control units.
  • the vehicle control system 7000 may include another control unit not depicted in the figures.
  • part or the whole of the functions performed by one of the control units in the above description may be assigned to another control unit. That is, predetermined arithmetic processing may be performed by any of the control units as long as information is transmitted and received via the communication network 7010.
  • a sensor or a device connected to one of the control units may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
  • a computer program for realizing the functions of the circuitry 1 according to the present embodiment described with reference to Fig. 1 can be implemented in one of the control units or the like.
  • a computer readable recording medium storing such a computer program can also be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above-described computer program may be distributed via a network, for example, without the recording medium being used.
  • a circuitry for generating a synthetic aperture radar image wherein the circuitry is configured to: obtain, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; map the received power of the radar signal to a measurement probability; generate a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; map the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determine a target phase value based on the set of PC-SAR images.
  • mapping of the received power includes: determining a signal-to-noise ratio of the radar signal based on the received power; and determining the measurement probability based on the signal-to-noise ratio of the radar signal.
  • distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein at least one of a mean and a mode of the probability distribution corresponds to the received phase of the radar signal and has a probability value that corresponds to the received power of the radar signal.
  • determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest candidate measurement probability value among the set of PC-SAR images.
  • the circuitry of any one of (1) to (4) further configured to: obtain, based on at least one further radar signal detected by the radar sensor, further radar measurement data that indicate a received power and a received phase of the at least one further radar signal; perform the mapping of the received power, the generating of the set of data tuples and the mapping of the set of data tuples accordingly for the at least one further radar signal; update, based on the set of data tuples generated for the at least one further radar signal, the candidate measurement probability values associated with the respective PC-SAR images for the at least one further radar signal; and determine the target phase value based on the updated candidate measurement probability values.
  • mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the radar signal.
  • a method for generating a synthetic aperture radar image comprising: obtaining, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; mapping the received power of the radar signal to a measurement probability; generating a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; mapping the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determining a target phase value based on the set of PC-SAR images.
  • mapping of the received power includes: determining a signal-to-noise ratio of the radar signal based on the received power; and determining the measurement probability based on the signal-to-noise ratio of the radar signal.
  • the method of any one of (11) to (13), wherein the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest candidate measurement probability value among the set of PC-SAR images.
  • (21) A computer program comprising program code causing a computer to perform the method according to anyone of (11) to (20), when being carried out on a computer.
  • (22) A non-transitory computer-readable recording medium that stores therein a computer program product, which, when executed by a processor, causes the method according to anyone of (11) to (20) to be performed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The present disclosure pertains to a circuitry for generating a synthetic aperture radar image, wherein the circuitry is configured to obtain, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; map the received power of the radar signal to a measurement probability; generate a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; map the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determine a target phase value based on the set of PC-SAR images.

Description

CIRCUITRY AND METHOD
TECHNICAL FIELD
The present disclosure generally pertains to a circuitry and a method, and, more particularly, to a circuitry and a method for generating a synthetic-aperture radar image.
TECHNICAL BACKGROUND
It is generally known to generate an image of an environment based on radar. For example, synthetic-aperture radar (SAR) is based on emitting radar signals into the environment and receiving reflected radar signals from the environment while moving through the environment. Based on the received reflected radar signals and positions within the environment at which the reflected radar signals have been received, a SAR image can be generated that represents the environment.
Although there exist techniques for generating a SAR image, it is generally desirable to provide an improved circuitry and method for generating a SAR image.
SUMMARY
According to a first aspect, the disclosure provides a circuitry for generating a synthetic aperture radar image, wherein the circuitry is configured to obtain, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; map the received power of the radar signal to a measurement probability; generate a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; map the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determine a target phase value based on the set of PC-SAR images.
According to a second aspect, the disclosure provides a method for generating a synthetic aperture radar image, wherein the method includes obtaining, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; mapping the received power of the radar signal to a measurement probability; generating a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; mapping the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determining a target phase value based on the set of PC-SAR images. Further aspects are set forth in the dependent claims, the drawings and the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments are explained by way of example with respect to the accompanying drawings, in which:
Fig. 1 shows a block diagram of a circuitry for generating a SAR image according to an embodiment;
Fig. 2 shows a flow diagram of a method for generating a SAR image according to an embodiment;
Fig. 3 shows a diagram of a probability distribution of a received phase according to an embodiment;
Fig. 4 illustrates a mapping of a set of data tuples to a set of PC-SAR images according to an embodiment;
Fig. 5 is a block diagram depicting an example of schematic configuration of a vehicle control system; and
Fig. 6 is a diagram of assistance in explaining an example of installation positions of an outside-vehicle information detecting section and an imaging section.
DETAILED DESCRIPTION OF EMBODIMENTS
Before a detailed description of the embodiments under reference of Fig. 1 is given, general explanations are made.
As mentioned in the outset, it is generally known to generate an image of an environment based on radar. For example, synthetic-aperture radar (SAR) is based on emitting radar signals into the environment and receiving reflected radar signals from the environment while moving through the environment. Based on the received reflected radar signals and positions within the environment at which the reflected radar signals have been received, a SAR image can be generated that represents the environment.
SAR images can be generated in multiple different ways based on the received reflected radar signals. There are approaches in the frequency domain as well as approaches in the time domain. In some instances, these approaches generate SAR images based on an amplitude and phase of a time signal or of a frequency signal. In some instances, using various algorithms, for example a backprojection algorithm, the reception phases of each range cell in a grid map are corrected and then summed up. In some instances, as soon as a constructive superposition results from this summation and thus a lot of power is mapped into a cell, the cell represents a target, whereas a destructive superposition of all pointers leads to a low-power cell which does not represent a target. In some instances, this approach provides a simple and efficient way to generate SAR mappings of the environment.
However, in some instances, this approach causes the resulting map to be directly dependent on the received power of each target, which may not indicate the actual occupancy state of the cell.
As mentioned, there are many different ways to span a synthetic aperture for SAR and generate a SAR image of the environment from raw data that have been acquired by SAR. However, in some instances, these different ways differ in a domain in which they are calculated. On the one hand there are approaches in the frequency domain and on the other hand in the time domain. However, all these approaches are amplitude-based, whereby only two measured complex numbers are summed up. In this case, as described in Grebner, Timo et al.: “Radar-Based Mapping of the Environment: Occupancy Grid-Map Versus SAR”, IEEE Microwave and Wireless Components Letters, 2022, Digital Object Identifier (DOI): 10.1109/LMWC.2022.3145661, a currently existing map G contains the complex entries G(mi |z1:t,x1:t) for each cell mi of the grid map G depending on a given sensor position x and a sensor measurement z, according to equation (1):
G(m_i \mid z_{1:t}, x_{1:t}) = G(m_i \mid z_{1:t-1}, x_{1:t-1}) + \sum_{k=1}^{K} A(r, t_k) \cdot e^{\, j \frac{2\pi f_{start}}{c_0} \cdot 2\, d(m_i, x_{t_k})} \qquad (1)

Here, G(m_i | z_{1:t}, x_{1:t}) describes a complex occupancy amplitude of the cell m_i at a current time frame t, and A(r, t_k) describes a currently measured complex amplitude of a reflected signal that corresponds to a distance r and a ramp k that has a time stamp t_k among the K ramps emitted into the environment as radar signals in the frame t. The complex amplitude A(r, t_k) is phase-corrected by the factor e^{j (2\pi f_{start}/c_0) \cdot 2 d(m_i, x_{t_k})} with the use of the distance d(m_i, x_{t_k}) between the cell m_i and the sensor position x_{t_k} at the time stamp t_k, wherein f_{start} is the start frequency of the ramp and c_0 is the speed of light. This procedure may be repeated for all received ramps and cells of the resulting map G.
In some instances, for real targets, all phases overlap constructively after the correction, while targets which represent noise overlap destructively and thus the power contributions of all measurements cancel each other out.
Finally, there are many different approaches to correct the received phase, but all are based on the same principle: the distance-proportional phase correction. In some instances, such an approach has the disadvantage that only the amplitude information and the phase information of the complex measured values are taken into account. Thus, the dynamics of the resulting map may depend significantly on the strongest target and the weakest target. Targets which have a small radar cross section (RCS) but, due to their phase, lead to a constructive overlapping of the complex pointers after application of the SAR algorithm may therefore not be shown in maps although the measured phases may represent a target (object in the environment).
In addition, in some instances, measurement inaccuracies such as noise or position inaccuracies are not taken into account, although they may have a significant impact on the measurement results.
Consequently, some embodiments pertain to a circuitry for generating a synthetic aperture radar (SAR) image, wherein the circuitry is configured to obtain, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; map the received power of the radar signal to a measurement probability; generate a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; map the set of data tuples to a set of phase candidate synthetic aperture radar (PC-SAR) images; and determine a target phase value based on the set of PC-SAR images.
For example, the radar sensor may include a chirped-sequence radar sensor that may emit a radar signal into an environment, wherein a frequency of the emitted radar signal is modulated. For example, the frequency may vary according to a frequency ramp, starting at a predetermined start frequency, with a predetermined slope. The slope may be constant (linear chirp) or may vary for realizing a non-linear frequency variation (e.g., an exponential chirp). The radar sensor may emit radar signals repeatedly, e.g., with a predetermined repetition rate. The repeated radar signals may be equally or differently modulated.
The emitted radar signals may be reflected by objects in the environment. For example, the objects reflecting the radar signals may include cars, trucks, bicycles, other vehicles, pedestrians, animals, walls, road signs, traffic barriers, bridges, trees, stones, buildings, fences, or the like. Such objects may also be referred to as targets.
The radar sensor may receive radar signals reflected by an object in the environment and may generate the radar measurement data based on information extracted from the received reflected radar signals. For example, the radar sensor may determine a power of the received radar signal. The radar sensor may determine to which emitted radar signal, e.g. to which ramp, the received radar signal corresponds. This determination may be based on a timing of emitting radar signals into the environment, on a timing of receiving the reflected radar signal, on a start frequency, a slope and/ or a shape (e.g., linear chirp, exponential chirp, etc.) of the radar signal, or the like. The radar sensor may further determine, e.g. based on a delay between receiving the reflected radar signal and emitting the corresponding radar signal, a phase of the received radar signal.
Although a chirped-sequence radar sensor has been mentioned as an example of a radar sensor that generates the radar measurement data, the radar measurement data may be generated by any radar sensor that is capable of determining a received power and a received phase of a radar signal reflected from an environment. For example, in some embodiments, the radar measurement data may be generated by a time-division multiplexing (TDM) radar, a frequency-division multiplexing (FDM) radar, a code-division multiplexing (CDM) radar, or the like.
The circuitry may include any entity that is capable of performing information processing, such as generating a SAR image. For example, the circuitry may include a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a microprocessor or the like. For example, the circuitry may obtain the radar measurement data directly from the radar sensor or indirectly via another information processing apparatus or via a magnetic, optical or semiconductor-based storage medium.
The circuitry may further include a storage unit that may, for example, include a dynamic random-access memory (DRAM), a flash memory, an electrically erasable programmable read-only memory (EEPROM), a hard disk drive or the like. The circuitry may further include an interface for receiving the radar measurement data and for outputting the generated SAR image, e.g., via serial peripheral interface (SPI), peripheral component interconnect (PCI), controller area network (CAN), universal serial bus (USB), Ethernet, IEEE 802.11 (Wi-Fi), Bluetooth, or the like. For example, the circuitry may receive the radar measurement data via the interface from the radar sensor, from another information processing apparatus or from a storage medium. For example, the circuitry may output the generated SAR image via the interface to another information processing apparatus or to a storage medium.
The mapping of the received power of the radar signal to the measurement probability may, for example, map the received power to a value in an interval from zero to one, without limiting the disclosure to these values. The mapping may, for example, depend on environmental conditions and/ or on a configuration of the radar sensor. For example, the mapping may depend on a radar cross section (RCS) of an object reflecting the radar signal (e.g., on a material, a size, a shape and/or a distance of the object), on a power of a radar signal emitted by the radar sensor, on an expected received power of the radar signal, on a power of electromagnetic interference reducing a quality of the radar signal or the like.
The measurement probability may, for example, correspond to a probability that the radar signal is not a measurement noise or an artefact but is reflected by a real object.
The set of candidate phase values may, for example, be predetermined or may be generated based on the received phase of the radar signal. For example, the candidate phase values of the set of candidate phase values may be evenly distributed in the interval from 0° to 360° to cover a full period, without limiting the present disclosure to these values. For example, the candidate phase values of the set of candidate phase values may be generated in an interval around the received phase of the radar signal and may be distributed evenly in the interval or may be distributed with a higher density in a subinterval that includes the received phase of the radar signal than in a subinterval that does not include the received phase.
The set of candidate measurement probability values may be determined based on the set of candidate phase values. For example, for each candidate phase value of the set of candidate phase values, a candidate measurement probability value of the set of candidate measurement probability values may be determined. The candidate measurement probability values may, for example, be computed for each candidate phase value of the set of candidate phase values based on a probability density function or may be read from a predetermined table, wherein candidate measurement probability values that lie between entries of the table may be interpolated.
The distribution of the measurement probability may represent the phase uncertainty of the received phase. For example, a standard deviation of the distribution may be determined based on properties of the radar sensor and/ or based on a quality of the radar signal.
The set of data tuples may be generated by associating, for each data tuple of the set of data tuples, a candidate phase value of the set of candidate phase values with a candidate measurement probability value of the set of candidate measurement probability values.
The set of PC-SAR images may, for example, include an associated PC-SAR image for each candidate phase value of the set of candidate phase values. The mapping of the set of data tuples to the set of PC-SAR images may assign the respective candidate measurement probability values of the data tuples of the set of data tuples to the respective PC-SAR images of the set of PC-SAR images that are associated with the respective candidate phase values of the data tuples of the set of data tuples. The determining of the target phase value may, for example, select, as the target phase value, a candidate phase value of the set of candidate phase values based on a candidate measurement probability value that is assigned to the PC-SAR image to which the candidate phase value is mapped.
For example, the circuitry may then generate a SAR image based on the candidate probability value that has been assigned to the PC-SAR image associated with the target phase value. For example, the target phase value may be represented by an index of a PC-SAR image, of the set of PC-SAR images, that is associated with the target phase value.
In some embodiments, the mapping of the received power includes determining a signal-to-noise ratio of the radar signal based on the received power; and determining the measurement probability based on the signal-to-noise ratio of the radar signal.
For example, the signal-to-noise ratio of the radar signal may be determined as a quotient of the received power of the radar signal and a noise level of the radar sensor.
The noise level may be determined in any suitable way. For example, the noise level may be determined based on a calibration measurement of a known environment that may, for example, include no objects (targets) in a predetermined portion. For example, the noise level may be determined based on the radar measurement data, e.g., based on a mean value, a median or another suitable quantile of a received power of a plurality of radar signals, or based on clustering (such as hierarchical clustering or as density-based spatial clustering of applications with noise (DBSCAN)), wherein the noise level may be determined based on a largest or densest cluster of values of the received power, e.g., based on a maximum, a mean, a quantile or a density distribution of the cluster.
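As an illustration only, a noise level could be estimated from a batch of received powers with a simple quantile-based estimator; the following is a minimal sketch under the assumption that most detections are noise-like, and the function name, the quantile and the example powers are hypothetical rather than part of the embodiments:

```python
import numpy as np

def estimate_noise_level(received_powers, quantile=0.5):
    """Estimate the noise level as a quantile (here the median) of the
    received powers of a plurality of radar signals (linear scale)."""
    powers = np.asarray(received_powers, dtype=float)
    return float(np.quantile(powers, quantile))

# Example: most detections are noise-like, a few stem from strong targets.
powers = np.concatenate([np.random.exponential(1e-9, 1000),   # noise-like
                         np.random.exponential(1e-6, 20)])     # target-like
noise_level = estimate_noise_level(powers)
```

A clustering-based estimator (e.g., DBSCAN on the power values) could replace the quantile here; the quantile is used only to keep the sketch self-contained.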
The measurement probability may, for example, be determined based on an exponential function, wherein an exponent includes the signal-to-noise ratio. The exponent may further include a scaling factor multiplied with the signal-to-noise ratio for determining a limited growth of the measurement probability. The scaling factor may, for example, be empirically determined based on a configuration of the radar sensor and/ or on expected conditions of the environment.
The measurement probability may, for example, be chosen from an interval between a minimum probability and a maximum probability, based on the received power of the radar signal. For example, the minimum probability may be 0, 0.01, 0.1 or 0.5 and may be chosen if the received power equals or falls below the noise level. For example, the maximum probability may be 1, 0.99 or 0.9 and may be chosen if the received power equals an expected maximum received power, wherein the expected maximum received power may correspond to a power of an emitted radar signal reduced according to the inverse-square law with respect to a distance (range) determined based on the radar signal. The maximum probability may further be reduced according to a difference between the expected maximum received power and the noise level. The minimum probability may also be chosen if the received power is at least a certain, e.g. predetermined, amount lower than the noise level to account for an overlap between a distribution of a received power of noise signals and a distribution of a received power of radar signals reflected from an object. However, the minimum probability and the maximum probability are not limited to the values described herein, and the skilled person may find other values for the minimum probability and the maximum probability that are suitable for determining the measurement probability.
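The exact mapping from received power to measurement probability is not prescribed above; the following sketch assumes, for illustration, a saturating exponential law in which the exponent contains the signal-to-noise ratio multiplied by a scaling factor, bounded between a minimum and a maximum probability. The constants alpha, p_min and p_max are illustrative assumptions, not values from the embodiments:

```python
import numpy as np

def measurement_probability(received_power, noise_level,
                            alpha=0.05, p_min=0.5, p_max=0.99):
    """Map a received power to a measurement probability.

    The SNR is the quotient of received power and noise level; the
    probability grows with the SNR according to an exponential law with
    limited (saturating) growth and is bounded by p_min and p_max.
    """
    snr = received_power / noise_level          # signal-to-noise ratio
    if snr <= 1.0:                              # at or below the noise level
        return p_min
    # limited growth driven by the scaled SNR in the exponent
    return p_min + (p_max - p_min) * (1.0 - np.exp(-alpha * snr))
```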
In some embodiments, the distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein at least one of a mean and a mode of the probability distribution corresponds to the received phase of the radar signal and has a probability value that corresponds to the received power of the radar signal.
For example, the probability distribution may correspond to a Gaussian distribution (normal distribution). The standard deviation of the probability distribution may correspond to the phase uncertainty of the received phase, e.g., to a limited precision of the radar sensor for measuring the received phase. The candidate measurement probability values of the set of candidate measurement probability values may be based on evaluating a probability density function of the probability distribution for the candidate phase values of the set of candidate phase values. The candidate measurement probability values may further be scaled with respect to the probability density function corresponding to the measurement probability.
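A minimal sketch of generating the set of data tuples under the assumptions that the candidate phase values are spread evenly over one period and that the candidate measurement probability values follow a Gaussian centered at the received (corrected) phase, scaled so that its peak equals the measurement probability; the number of candidates and the standard deviation are illustrative only:

```python
import numpy as np

def generate_data_tuples(received_phase_deg, meas_prob,
                         phase_std_deg=10.0, n_candidates=36):
    """Generate (candidate phase, candidate measurement probability) tuples."""
    candidates = np.arange(n_candidates) * 360.0 / n_candidates
    # circular (wrapped) difference between candidate and received phase
    diff = (candidates - received_phase_deg + 180.0) % 360.0 - 180.0
    # Gaussian shape, scaled so that the value at the mean is meas_prob
    probs = meas_prob * np.exp(-0.5 * (diff / phase_std_deg) ** 2)
    return list(zip(candidates.tolist(), probs.tolist()))

tuples = generate_data_tuples(received_phase_deg=73.0, meas_prob=0.9)
```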
In some embodiments, the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest candidate measurement probability value among the set of PC-SAR images.
For example, the circuitry may generate the SAR image based on the PC-SAR image associated with the target phase value, e.g., based on the highest candidate measurement probability value, which is associated with the PC-SAR image.
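The selection of the target phase value can be sketched as a per-cell argmax over a stack of PC-SAR images, where the SAR image takes, for each cell, the probability value of the winning PC-SAR image; the array shapes and names below are assumptions made for illustration:

```python
import numpy as np

def select_target_phase(pc_sar_stack, candidate_phases):
    """Select, per cell, the candidate phase whose PC-SAR image holds the
    highest candidate measurement probability, and build the SAR image
    from those maxima.

    pc_sar_stack     : array (n_phases, rows, cols) of cell probabilities.
    candidate_phases : array (n_phases,) of associated phase values.
    """
    candidate_phases = np.asarray(candidate_phases)
    best_idx = np.argmax(pc_sar_stack, axis=0)          # winning image per cell
    target_phase = candidate_phases[best_idx]            # target phase per cell
    sar_image = np.take_along_axis(pc_sar_stack, best_idx[None, ...], axis=0)[0]
    return target_phase, sar_image
```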
In some embodiments, the circuitry is further configured to obtain, based on at least one further radar signal detected by the radar sensor, further radar measurement data that indicate a received power and a received phase of the at least one further radar signal; perform the mapping of the received power, the generating of the set of data tuples and the mapping of the set of data tuples accordingly for the at least one further radar signal; update, based on the set of data tuples generated for the at least one further radar signal, the candidate measurement probability values associated with the respective PC-SAR images for the at least one further radar signal; and determine the target phase value based on the updated candidate measurement probability values.
For example, the at least one further radar signal may include one further radar signal, 100 further radar signals or 10,000 further radar signals, without limiting the present disclosure to these numbers or to these orders of magnitude.
For example, the circuitry may map the received power of each further radar signal to a corresponding measurement probability, as described above. The circuitry may further generate a set of data tuples, as described above, by associating candidate phase values of the set of candidate phase values with candidate measurement probability values of a set of candidate measurement probability values, wherein the candidate measurement probability values may, for example, be generated by evaluating, at the candidate phase values of the set of candidate phase values, a distribution of the measurement probability according to a phase uncertainty of the received phase, as described above.
The circuitry may then map, for each further radar signal, the generated set of data tuples to the set of PC-SAR images. For example, each PC-SAR image of the set of PC-SAR images may be associated with a candidate phase value of the set of candidate phase values. The circuitry may map each data tuple of the set of data tuples generated for the further radar signals to the PC-SAR image, of the set of PC-SAR images, that is associated with the candidate phase value of the respective data tuple.
For example, the number of PC-SAR images in the set of PC-SAR images may correspond to a number of candidate phase values in the set of candidate phase values, and for the radar signal as well as for each further radar signal a candidate measurement probability value, for the set of candidate measurement probability values, may be generated for each candidate phase value of the set of candidate phase values. Thus, each PC-SAR image of the set of PC-SAR images may be associated with one data tuple, which may associate a candidate phase value and a candidate measurement probability value with each other, for the radar signal and with one data tuple per further radar signal.
However, the present disclosure is not limited to providing a same number of elements in the set of candidate phase values, in the set of candidate measurement probability values, in the set of data tuples and/ or in the set of PC-SAR images. For example, the circuitry may generate fewer candidate measurement probability values than there are candidate phase values in the set of candidate phase values, e.g., the circuitry may generate candidate measurement probability values only for candidate phase values at which the distribution of the measurement probability yields a value that exceeds a predetermined threshold, e.g., for candidate phase values that lie within a predetermined number of standard deviations around the received phase. For example, the circuitry may generate fewer data tuples than there are candidate measurement probability values in the set of candidate measurement probability values, e.g., the circuitry may generate data tuples only for candidate measurement probability values that exceed a predetermined threshold. For example, the circuitry may generate fewer data tuples, which associate a candidate phase value and a candidate measurement probability value, than there are PC-SAR images in the set of PC-SAR images, e.g., if the circuitry generates candidate measurement probability values or corresponding data tuples only for candidate measurement probability values that exceed a predetermined threshold. Thus, a PC-SAR image of the set of PC-SAR images may be associated with fewer data tuples than there are reflected radar signals received by the radar sensor.
The updating of the candidate measurement probability values associated with the respective PC- SAR images for the at least one further radar signal may include updating, for each data tuple generated for the at least one further radar signal, the candidate measurement probability value assigned to the respective PC-SAR image that is associated with the candidate phase value of the respective data tuple, wherein the circuitry updates the candidate measurement probability value assigned to the respective PC-SAR image such that an updated candidate measurement probability value, which the circuitry assigns to the respective PC-SAR image, depends on the candidate measurement probability value of the respective data tuple. For example, the circuitry may make the candidate measurement probability values associated with the respective PC-SAR images dependent on the candidate measurement probability values associated with the respective candidate phase values by any data tuple generated for the radar signal or for the at least one further radar signal.
The circuitry may, for example, determine, as the target phase value, a candidate phase value associated with a PC-SAR image, of the set of PC-SAR images, that is associated with a highest candidate measurement probability value. The circuitry may then set the highest candidate measurement probability value, which is assigned to the PC-SAR image associated with the target phase value, as a measurement probability value, or as a target probability indicating a probability of a presence of a target, in a SAR image which the circuitry generates.
In some embodiments, the updating is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the at least one further radar signal. For example, the updating of the candidate measurement probability values associated with the respective PC-SAR images may be based on an odds ratio and may, for example, be based on a candidate measurement probability value of a current measurement (e.g., a currently processed radar signal), on one or more candidate probability values (“previous candidate probability values”) from which the candidate PC-SAR image has already been made dependent, and on a priori information. For example, the circuitry may iteratively update the candidate measurement probability values associated with the respective PC-SAR images for each respective data tuple.
For example, the updating may be based on an update formula for occupancy grid maps (OGM), such as equation (2), which is based on Thrun, Sebastian: “Learning occupancy grids with forward models”, Proceedings 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2001, Digital Object Identifier (DOI): 10.1109/IROS.2001.977219:
p(m_i \mid z_{1:t}, x_{1:t}) = \left[ 1 + \underbrace{\frac{1 - p(m_i \mid z_t, x_t)}{p(m_i \mid z_t, x_t)}}_{\text{current measurement}} \cdot \underbrace{\frac{1 - p(m_i \mid z_{1:t-1}, x_{1:t-1})}{p(m_i \mid z_{1:t-1}, x_{1:t-1})}}_{\text{recursive term}} \cdot \underbrace{\frac{p(m_i)}{1 - p(m_i)}}_{\text{a priori information}} \right]^{-1} \qquad (2)
In equation (2), p(mi |z1:t,x1:t) represents a probability associated with a cell mi of a grid map, e.g. of a PC-SAR image, wherein the probability is based on sensor measurements z1:t and corresponding sensor positions x1:t of time frames 1 to t, i.e., of a currently processed time frame t and all previously processed time frames. Accordingly, p(mi|zt, xt) represents a probability associated with the cell mi based on a sensor measurement zt and a corresponding sensor position xt of the currently processed time frame t, and p(mi|z1:t-1, x1:t-1) represents a probability associated with the cell mi based on sensor measurements z1:t-1 and corresponding sensor positions x1:t-1 of time frames from 1 to t - 1, i.e., without the currently processed time frame t. For time frame 1, i.e., when a first probability is inserted into a cell, the recursive term may be omitted (or set to 1). Further, p(mi) represents a prior for an occupancy of the cell mi and may be initialized to, e.g., 0.5, without limiting the present disclosure thereto.

The term marked as “current measurement” in equation (2) may correspond to a currently processed measurement, e.g., to a candidate measurement probability of a data tuple based on which the circuitry updates a candidate measurement probability assigned to a PC-SAR image. The term marked as “recursive term” in equation (2) may correspond to previously processed measurements, e.g., to a candidate measurement probability value that has previously been assigned to the PC-SAR image. The term marked as “a priori information” in equation (2) may correspond to a predetermined a priori probability. Thus, the term p(mi |z1:t,x1:t) of equation (2) may be assigned to the PC-SAR image as an updated candidate measurement probability value. The circuitry may evaluate the product in the logarithmic domain, wherein the product of the first factor and the second factor may be represented as a sum of the logarithm of the first factor and the logarithm of the second factor.
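Evaluated in the logarithmic domain, equation (2) reduces to adding log-odds; the following sketch applies that update to a single cell value, with the prior assumed to be 0.5 (so that its log-odds term vanishes), as mentioned above:

```python
import numpy as np

def log_odds(p):
    return np.log(p / (1.0 - p))

def update_cell(p_prev, p_meas):
    """Recursively fuse a new candidate measurement probability p_meas into
    the probability p_prev already assigned to a PC-SAR cell, following
    equation (2) in the logarithmic domain with a prior of 0.5."""
    l = log_odds(p_prev) + log_odds(p_meas)
    return 1.0 / (1.0 + np.exp(-l))

# Example: two consistent detections of the same cell raise its probability.
p = 0.5                       # initialisation with the prior
for p_meas in (0.7, 0.8):
    p = update_cell(p, p_meas)   # ends up at roughly 0.9
```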
In some embodiments, the radar signal and the at least one further radar signal are detected by the radar sensor from different positions.
For example, the radar sensor may be provided on a mobile platform and perform radar measurements while the mobile platform (including the radar sensor) is moving through the environment. The performing of radar measurements may include emitting radar signals into the environment and receiving radar signals from the environment that have been reflected by objects in the environment. The mobile platform may, for example, include a car, a truck, a motorcycle, a bicycle, a tractor, an excavator, a train, a boat, a ship, a helicopter, an airplane, a drone or the like.
While moving through the environment, the radar sensor may emit radar signals from different positions into the environment and may receive radar signals from the environment from different positions. Thus, the radar signals received by the radar sensor may correspond to different views of the environment, wherein an object in the environment may be sensed from different directions and/ or from different distances.
Based on a position of the radar sensor in the environment at the time of receiving a radar signal reflected from an object in the environment, the circuitry may determine a position of the object in the environment. The circuitry may determine the position of the radar sensor in the environment, for example, based on the movement of the mobile platform, e.g., based on a speed sensor, an acceleration sensor, a rotation sensor, a global navigation satellite system (GNSS) such as Global Positioning System (GPS), Galileo, Michibiki, Beidou or GLONASS, or the like, and/ or based on simultaneous localization and mapping (SLAM).
Thus, the circuitry may determine a position of the object in the environment even if only a distance to the object but no angle can be determined based on reflected radar signals received from the object.
In some embodiments, the radar measurement data further indicate a range of the radar signal; wherein the mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the radar signal.
For example, the radar sensor and/ or the circuitry may determine the range of the radar signal based on the received phase of the radar signal. The received phase of the radar signal may indicate a delay between emitting a radar signal and receiving a corresponding reflected radar signal. For example, the range of the radar signal may be determined based on a Fourier transform (e.g., a Fast Fourier Transform (FFT)) of the radar signal. The range of the radar signal may correspond to a distance between the radar sensor and an object in the environment at which the radar signal is reflected.
For example, the circuitry may map the set of data tuples to portions of the respective PC-SAR images that correspond to the range of the radar signal, e.g., to portions that correspond to a distance from the radar sensor according to the range of the radar signal at the time of receiving the radar signal.
For example, the portions of a PC-SAR image that correspond to the range of the radar signal may be arranged on a circular arc, wherein a radius of the circular arc may correspond to the range of the radar signal and a center associated with the circular arc may correspond to the position of the radar signal in the environment at the time of receiving the radar signal. By updating the candidate measurement probability values assigned to the PC-SAR images for one or more further radar signals that have been received by the radar sensor at different positions, the respective candidate measurement probability values may be mapped to portions of the PC-SAR images that may be arranged in circular arcs shifted against each other. Thus, after updating the candidate measurement probability values associated with the set of PC-SAR images for a sufficient number of further radar signals received from different positions within the environment, the corresponding circular arcs of updated portions of the PC-SAR images may overlap at portions that correspond to positions of corresponding objects (targets) in the environment, such that the updated candidate measurement probability values of the respective portions may add up or accumulate (positively interfere) at portions that correspond to positions of corresponding objects (targets) in the environment and may cancel each other out or average each other out (negatively interfere) at portions that do not correspond to the portions of the corresponding objects (targets) in the environment.
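A minimal sketch of mapping one data tuple onto such a circular arc, assuming (as in the embodiments described further below) that the portions are quadratic cells of a regular grid map; the half-cell tolerance and the placeholder update at the end are assumptions made only for illustration, and a real implementation would fuse the probabilities with the recursive update shown above:

```python
import numpy as np

def cells_on_range_arc(sensor_xy, r, grid_shape, cell_size):
    """Return indices of all grid cells whose centre lies approximately at
    range r from the sensor position, i.e. on the circular arc to which a
    detection with that range is mapped when no angle is measured."""
    rows, cols = grid_shape
    ys = (np.arange(rows) + 0.5) * cell_size          # cell centre y coordinates
    xs = (np.arange(cols) + 0.5) * cell_size          # cell centre x coordinates
    xx, yy = np.meshgrid(xs, ys)
    dist = np.hypot(xx - sensor_xy[0], yy - sensor_xy[1])
    on_arc = np.abs(dist - r) <= cell_size / 2.0      # half a cell of tolerance
    return np.argwhere(on_arc)

# Example: touch the arc cells of the PC-SAR image associated with one
# candidate phase value (placeholder update rule, for illustration only).
pc_sar = np.full((200, 200), 0.5)                     # one PC-SAR grid map
for i, j in cells_on_range_arc((1.0, 0.5), r=1.2,
                               grid_shape=pc_sar.shape, cell_size=0.01):
    pc_sar[i, j] = max(pc_sar[i, j], 0.8)
```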
In some embodiments, the PC-SAR images of the set of PC-SAR images include grid maps, and the portions of the PC-SAR images correspond to cells of the grid maps.
For example, the cells of the grid maps may have a rectangular or a quadratic shape. However, the present disclosure is not limited thereto. For example, the cells may instead have a triangular or a hexagonal shape or any other suitable shape.
For example, each cell may have an assigned candidate measurement probability value that may be updated independently from candidate measurement probability values assigned to other cells of the grid maps.
Each cell of a grid map may correspond to a portion of the environment. A size of a cell (e.g., a side length of a rectangular or quadratic cell, a diameter of a hexagonal cell, an altitude of a triangular cell or a diameter of a circular cell) may, for example, correspond to 1 centimeter, 1 decimeter or 1 meter in the environment, without limiting the present disclosure to these values or to these orders of magnitude.
All cells of a grid map may have a same size and shape. However, the present disclosure is not limited thereto, and a grid map may include cells of different sizes and/ or of different shapes.
For example, all PC-SAR images of the set of PC-SAR images may include grid maps, wherein the grid maps of all PC-SAR images of the set of PC-SAR images may be equally configured, e.g., the grid maps of all PC-SAR images may have a same number of cells, a same number of rows and columns of cells, a same shape of cells, a same size of cells, and a same mapping of cells to the environment. Likewise, a grid map of the SAR image to be generated by the circuitry may be equally configured as the grid maps of the PC-SAR images.
However, the present disclosure is not limited to equally configured grid maps. For example, the grid maps of the set of PC-SAR images and/or the grid map of the SAR image to be generated may be configured differently from each other. However, in some embodiments, a grid map of each PC-SAR image of the set of PC-SAR images as well as a grid map of the SAR image to be generated by the circuitry may have a defined mapping from its cells to positions in the environment, and a grid map of each PC-SAR image of the set of PC-SAR images may have a defined mapping to a grid map of the SAR image to be generated by the circuitry.
For example, cells of a grid map of the SAR image to be generated by the circuitry may correspond to pixels of the SAR image. When displaying the SAR image, the pixels of the SAR image that correspond to the cells of the grid map of the SAR image may be displayed with a color that corresponds to a probability, e.g., a measurement probability or a target probability, that is assigned to the corresponding cell.
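For displaying such a SAR image, each cell probability could, for example, be mapped to an 8-bit grayscale pixel value; this is only one possible rendering choice and not part of the embodiments:

```python
import numpy as np

def probabilities_to_pixels(prob_map):
    """Convert a grid map of probabilities (0..1) into 8-bit grayscale
    pixel values for display, one pixel per cell."""
    return np.clip(np.round(prob_map * 255.0), 0, 255).astype(np.uint8)
```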
In some embodiments, the circuitry is further configured to correct the received phase of the radar signal based on the range of the radar signal; and perform the generating and the mapping of the set of data tuples based on the corrected phase.
For example, the range of the radar signal may represent a distance between the radar sensor and an object in the environment that has reflected the radar signal, and the circuitry may correct the received phase based on the range of the radar signal. The circuitry may, for example, obtain the distance based on the radar signal (e.g., based on a phase difference between the received phase of the radar signal and a phase of a previously emitted radar signal that corresponds to the radar signal received from the environment). For example, the correcting of the received phase may be based on multiplying, by a phase correction factor, a complex pointer that represents, with its absolute value, a received power (or a measurement probability based on the received power) and, with its argument, a received phase of the radar signal. For example, the phase correction factor may be based on a complex exponential function, as shown in equation (3):
e^{\, j \, 2\pi f_{start} \cdot \frac{2r}{c_0}} \qquad (3)
In equation (3), j is the imaginary unit, ƒstart is a start frequency of a frequency ramp of a radar signal emitted into the environment, c0 is the speed of light, and r is the determined distance between the radar sensor and an object that reflects the radar signal. The factor 2 is inserted for taking into account a round trip from the sensor to the reflecting object and back to the sensor.
Thus, a corrected phase may correspond to a phase at a position of the radar sensor. Correcting the received phase in such a way allows, in some embodiments, comparing received phases of radar signals received at different sensor positions in the environment, e.g., at different distances from the reflecting object.
The circuitry may correct the received phase before generating the set of data tuples and before mapping the set of data tuples to the set of PC-SAR images. The circuitry may correct the received phase before generating the set of candidate phase values. The circuitry may generate the set of candidate phase values based on the corrected phase, and may perform the generating of the set of data tuples and the mapping of the set of data tuples to the set of PC-SAR images based on the candidate phase values that are based on the corrected phase.
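A sketch of this range-proportional phase correction according to equation (3); the sign of the exponent depends on the convention used for the received phase and is assumed positive here, and the example values of the range and the start frequency are hypothetical:

```python
import numpy as np

C0 = 299_792_458.0          # speed of light [m/s]

def phase_correct(complex_pointer, r, f_start):
    """Correct the received phase for the round-trip propagation over the
    range r by multiplying the complex pointer with the distance-dependent
    phase factor of equation (3)."""
    correction = np.exp(1j * 2.0 * np.pi * f_start * (2.0 * r) / C0)
    return complex_pointer * correction

corrected = phase_correct(complex_pointer=0.8 * np.exp(1j * 1.3),
                          r=12.4, f_start=76e9)
```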
Some embodiments pertain to a method for generating a SAR image, wherein the method includes obtaining, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; mapping the received power of the radar signal to a measurement probability; generating a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; mapping the set of data tuples to a set of PC-SAR images; and determining a target phase value based on the set of PC- SAR images.
The method may be configured as described above with respect to the circuitry. Thus, all features of the circuitry may correspond to features of the method. The methods as described herein are also implemented in some embodiments as a computer program causing a computer and/or a processor to perform the method, when being carried out on the computer and/ or processor. In some embodiments, also a non-transitory computer-readable recording medium is provided that stores therein a computer program product, which, when executed by a processor, such as the processor described above, causes the methods described herein to be performed.
Returning to Fig. 1, a block diagram of a circuitry 1 for generating a SAR image according to an embodiment is shown. The circuitry 1 includes a processor 2, a storage unit 3 and a communication unit 4.
The processor 2 controls an operation of the circuitry 1 based on instructions stored in the storage unit 3. The storage unit 3 stores the instructions for the processor 2 and further data for generating a SAR image, including radar measurement data, temporary data (including PC-SAR images) and the generated SAR image. The communication unit 4 includes a universal serial bus (USB) interface, receives, via the USB interface, radar measurement data and outputs, via the USB interface, a SAR image generated based on the received radar measurement data.
The circuitry 1 further includes a measurement data obtaining unit 5, a phase correcting unit 6, a power mapping unit 7, a data tuple generating unit 8, a data tuple mapping unit 9, an updating unit 10 and a target phase determining unit 11.
The measurement data obtaining unit 5 obtains, based on radar signals detected by a radar sensor, radar measurement data that indicate a received power of the radar signals, a received phase of the radar signals and a range of the radar signals. The radar signals are detected by the radar sensor from different positions within an environment of the radar sensor. The radar signals correspond to radar signals emitted by the radar sensor into the environment and are reflected by objects (targets) in the environment. The radar sensor includes a chirped-sequence radar sensor that is capable of measuring a received power and a received phase of received radar signals. The range of the radar signals is determined based on the received phase of the radar signals. The measurement data obtaining unit 5 obtains the radar measurement data via the communication unit 4 and stores the radar measurement data in the storage unit 3.
The phase correcting unit 6 corrects the received phase of the radar signals based on a distance of the radar signals. The correcting of the received phase is based on equation (3). The phase correcting unit 6 obtains the radar measurement data from the storage unit 3 and stores the corrected phase in the storage unit 3. The power mapping unit 7 maps the received power of the radar signals to a measurement probability. The power mapping unit 7 receives the received power from the storage unit 3 and stores the measurement probability, to which the received power is mapped, in the storage unit 3.
The power mapping unit 7 includes a signal-to-noise ratio (SNR) determining unit 12 and a probability determining unit 13. The SNR determining unit 12 determines a SNR of the radar signals based on a quotient of the received power and a predetermined noise level. The probability determining unit 13 determines the measurement probability based on an exponential function, wherein the exponent of the exponential function includes a product of the SNR determined by the SNR determining unit 12 with a predetermined scaling factor. The power mapping unit 7 outputs, to the storage unit 3, the measurement probability determined by the probability determining unit 13.
The data tuple generating unit 8 generates, for each radar signal of the radar measurement data, a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase. The data tuple generating unit 8 receives the measurement probability and the received phase (represented by the corrected phase corrected by the phase correcting unit 6) from the storage unit 3 and stores the generated set of data tuples in the storage unit 3.
The distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein a mean and a mode of the probability distribution corresponds to the received phase of the radar signal and has a probability value that corresponds to the received power of the radar signal. In the embodiment of Fig. 1, the distribution of the measurement probability corresponds to a Gaussian distribution (normal distribution). A mean and a mode of the distribution are located at the corrected phase of the corresponding radar signal (which is corrected by the phase correcting unit 6 and which represents the received phase of the radar signal), and a probability value of the distribution at the mean and at the mode corresponds to the measurement probability determined by the power mapping unit 7 based on the received power of the corresponding radar signal. A standard deviation of the distribution corresponds to a phase uncertainty of the received phase.
The data tuple generating unit 8 evaluates a probability density function of the distribution of the measurement probability at the candidate phase values of the set of candidate phase values. The candidate phase values of the set of candidate phase values are predetermined and are equally distributed in an interval from 0° to 360°, i.e., the set of candidate phase values covers one period. The values obtained by evaluating the probability density function at the candidate phase values of the set of candidate phase values are the candidate measurement probability values of the set of candidate measurement probability values.
For generating the set of data tuples, the data tuple generating unit 8 associates each candidate phase value of the set of candidate phase values with the corresponding candidate measurement probability value of the set of candidate measurement probability values, wherein each pair of a candidate phase value and a candidate measurement probability value associated with each other is a data tuple of the set of data tuples.
The data tuple generating unit 8 generates a set of data tuples for each radar signal indicated by the radar measurement data and stores each generated set of data tuples in the storage unit 3.
The data tuple mapping unit 9 maps the sets of data tuples generated by the data tuple generating unit 8 to a set of PC-SAR images. The data tuple mapping unit 9 receives the sets of data tuples from the storage unit 3 and stores a generated mapping in the storage unit 3.
Each PC-SAR image of the set of PC-SAR images is associated with a candidate phase value of the set of candidate phase values. The data tuple mapping unit 9 maps each data tuple generated by the data tuple generating unit 8 to the PC-SAR image of the set of PC-SAR images that is associated with the candidate phase value of the data tuple.
Furthermore, each PC-SAR image of the set of PC-SAR images includes a grid map, wherein each cell of the grid map is quadratic and corresponds to a region of one centimeter by one centimeter in the environment. The grid maps of the PC-SAR images have a same number of rows of cells and have a same number of columns of cells, wherein cells at equal positions in the grid maps correspond to a same region in the environment for all grid maps. The mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the respective radar signal, wherein the portions of the respective PC-SAR images correspond to cells of the grid maps of the PC-SAR images.
The data tuple mapping unit 9 determines which cells correspond to the radar signal based on the range of the radar signal and based on a position of the radar sensor in the environment at a time of receiving the radar signal. Then, the data tuple mapping unit 9 maps each respective data tuple to the determined cells, of the PC-SAR image associated with the data tuple, that correspond to the range of the radar signal around the position of the radar sensor in the environment at the time of receiving the radar signal.
The data tuple mapping unit 9 performs the mapping for each set of data tuples generated by the data tuple generating unit 8 for each radar signal indicated by the radar measurement data. The updating unit 10 updates, based on the sets of data tuples generated for the radar signals, the candidate measurement probability values associated with the respective PC-SAR images for the radar signals. The updating unit 10 receives the mapping generated by the data tuple mapping unit 9 from the storage unit 3 and stores updated candidate measurement probability values in the storage unit 3.
The updating of the candidate measurement probability values is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the respective radar signals. More concretely, the updating is based on equation (2) and is performed iteratively for each data tuple.
The target phase determining unit 11 determines a target phase value based on the set of PC-SAR images, wherein the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest updated candidate measurement probability value among the set of PC-SAR images. The target phase determining unit 11 receives the set of PC-SAR images and the updated candidate measurement probability values determined by the updating unit 10 from the storage unit 3 and stores the determined target phase value in the storage unit 3.
More concretely, the target phase determining unit 11 determines for each cell of the grid maps of the set of PC-SAR images a target phase value that indicates a candidate phase value associated with the PC-SAR image from which the candidate measurement probability value of the corresponding cell should be used for the SAR image to be generated by the circuitry 1. The target phase determining unit 11 selects, as the target phase value, the candidate phase value associated with the PC-SAR image of the set of PC-SAR images in which a largest candidate measurement probability value is assigned to the corresponding cell.
The circuitry 1 further generates a SAR image that includes a grid map with a same number of columns of cells and with a same number of rows of cells as the grid maps of the set of PC-SAR images, and wherein the cells of the grid map of the SAR image correspond to a same respective region in the environment as the corresponding cells in the grid maps of the set of PC-SAR images. The circuitry 1 assigns to each cell of the grid map of the SAR image a value that corresponds to the candidate measurement probability value that is assigned to the corresponding cell of the PC-SAR image associated with the target phase value determined by the target phase determining unit 11. The circuitry 1 then outputs the generated SAR image via the communication unit 4.

Fig. 2 shows a flow diagram of a method 20 for generating a SAR image according to an embodiment. The method 20 is performed by the circuitry 1 of Fig. 1.
At S21, the method 20 obtains radar measurement data. The obtaining of radar measurement data at S21 is performed by the measurement data obtaining unit 5 of Fig. 1. The obtaining of measurement data at S21 obtains, based on radar signals detected by a radar sensor, radar measurement data that indicate a received power of the radar signals, a received phase of the radar signals and a range of the radar signals. The radar signals are detected by the radar sensor from different positions within an environment of the radar sensor. The radar signals correspond to radar signals emitted by the radar sensor into the environment and are reflected by objects (targets) in the environment. The radar sensor includes a chirped-sequence radar sensor that is capable of measuring a received power and a received phase of received radar signals. The range of the radar signals is determined based on the received phase of the radar signals. The obtaining of measurement data obtains the radar measurement data via the communication unit 4 of Fig. 1 and stores the radar measurement data in the storage unit 3 of Fig. 1.
At S22, the method 20 corrects the received phase of the radar signals. The correcting of the received phase at S22 is performed by the phase correcting unit 6 of Fig. 1. The correcting of the received phase at S22 corrects the received phase of the radar signals based on a distance of the radar signals. The correcting of the received phase is based on equation (3). The correcting of the received phase obtains the radar measurement data from the storage unit 3 of Fig. 1 and stores the corrected phase in the storage unit 3 of Fig. 1.
At S23, the method 20 maps the received power of the radar signals to a measurement probability. The mapping of the received power at S23 is performed by the power mapping unit 7 of Fig. 1. The mapping of the received power receives the received power from the storage unit 3 of Fig. 1 and stores the measurement probability, to which the received power is mapped, in the storage unit 3 of Fig. 1.
The mapping of the received power at S23 includes determining a signal-to-noise ratio (SNR) at S24 and determining a measurement probability at S25. The determining of the SNR at S24 is performed by the SNR determining unit 12 of Fig. 1 and determines a SNR of the radar signals based on a quotient of the received power and a predetermined noise level. The determining of the measurement probability at S25 is performed by the probability determining unit 13 of Fig. 1 and determines the measurement probability based on an exponential function, wherein the exponent of the exponential function includes a product of the SNR determined at S24 with a predetermined scaling factor. The mapping of the received power at S23 outputs, to the storage unit 3 of Fig. 1, the measurement probability determined at S25.
At S26, the method 20 generates a set of data tuples. The generating of the set of data tuples at S26 is performed by the data tuple generating unit 8 of Fig. 1 and generates, for each radar signal of the radar measurement data, a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase. The generating of the set of data tuples at S26 receives the measurement probability and the received phase (represented by the corrected phase corrected at S22) from the storage unit 3 of Fig. 1 and stores the generated set of data tuples in the storage unit 3 of Fig. 1.
The distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein a mean and a mode of the probability distribution corresponds to the received phase of the radar signal and has a probability value that corresponds to the received power of the radar signal. In the embodiment of Fig. 2, the distribution of the measurement probability corresponds to a Gaussian distribution (normal distribution). A mean and a mode of the distribution are located at the corrected phase of the corresponding radar signal (which is corrected at S22 and which represents the received phase of the radar signal), and a probability value of the distribution at the mean and at the mode corresponds to the measurement probability determined at S23 based on the received power of the corresponding radar signal. A standard deviation of the distribution corresponds to a phase uncertainty of the received phase.
The generating of the set of data tuples at S26 evaluates a probability density function of the distribution of the measurement probability at the candidate phase values of the set of candidate phase values. The candidate phase values of the set of candidate phase values are predetermined and are equally distributed in an interval from 0° to 360°, i.e., the set of candidate phase values covers one period. The values obtained by evaluating the probability density function at the candidate phase values of the set of candidate phase values are the candidate measurement probability values of the set of candidate measurement probability values.
For generating the set of data tuples, the generating of the set of data tuples at S26 associates each candidate phase value of the set of candidate phase values with the corresponding candidate measurement probability value of the set of candidate measurement probability values, wherein each pair of a candidate phase value and a candidate measurement probability value associated with each other is a data tuple of the set of data tuples. The generating of the set of data tuples at S26 generates a set of data tuples for each radar signal indicated by the radar measurement data and stores each generated set of data tuples in the storage unit 3 of Fig. 1.
At S27, the method 20 maps the sets of data tuples generated at S26 to a set of PC-SAR images. The mapping of the set of data tuples at S27 is performed by the data tuple mapping unit 9. The mapping of the sets of data tuples at S27 receives the sets of data tuples from the storage unit 3 of Fig. 1 and stores a generated mapping in the storage unit 3 of Fig. 1.
Each PC-SAR image of the set of PC-SAR images is associated with a candidate phase value of the set of candidate phase values. The mapping of the sets of data tuples at S27 maps each data tuple generated at S26 to the PC-SAR image of the set of PC-SAR images that is associated with the candidate phase value of the data tuple.
Furthermore, each PC-SAR image of the set of PC-SAR images includes a grid map, wherein each cell of the grid map is quadratic and corresponds to a region of one centimeter by one centimeter in the environment. The grid maps of the PC-SAR images have a same number of rows of cells and have a same number of columns of cells, wherein cells at equal positions in the grid maps correspond to a same region in the environment for all grid maps. The mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the respective radar signal, wherein the portions of the respective PC-SAR images correspond to cells of the grid maps of the PC-SAR images.
The mapping of the sets of data tuples at S27 determines which cells correspond to the radar signal based on the range of the radar signal and based on a position of the radar sensor in the environment at a time of receiving the radar signal. Then, the mapping of the sets of data tuples at S27 maps each respective data tuple to the determined cells, of the PC-SAR image associated with the data tuple, that correspond to the range of the radar signal around the position of the radar sensor in the environment at the time of receiving the radar signal.
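A minimal sketch of this cell selection is given below. It assumes, for illustration only, a grid whose origin coincides with the map origin, the 1 cm cell size mentioned above, and a simple circular criterion (cells whose centre lies at the measured range around the sensor position); the actual determination may differ.

```python
import numpy as np

def cells_for_radar_signal(signal_range_m, sensor_pos_m, n_rows, n_cols,
                           cell_size_m=0.01, tolerance_m=0.005):
    """Return (row, col) indices of the grid cells that correspond to the radar
    signal, i.e. cells at the measured range around the sensor position."""
    rows, cols = np.meshgrid(np.arange(n_rows), np.arange(n_cols), indexing="ij")
    # Cell centres in metres, assuming the grid origin lies at (0, 0).
    cy = (rows + 0.5) * cell_size_m
    cx = (cols + 0.5) * cell_size_m
    dist = np.hypot(cy - sensor_pos_m[0], cx - sensor_pos_m[1])
    return np.argwhere(np.abs(dist - signal_range_m) <= tolerance_m)

# Example: cells at 1.5 m range around a sensor located at (0.0 m, 0.0 m)
cells = cells_for_radar_signal(1.5, (0.0, 0.0), n_rows=500, n_cols=500)
```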
The mapping of the sets of data tuples at S27 performs the mapping for each set of data tuples generated at S26 for each radar signal indicated by the radar measurement data.
At S28, the method 20 updates candidate measurement probability values associated with PC-SAR images. The updating at S28 is performed by the updating unit 10 and updates, based on the sets of data tuples generated for the radar signals, the candidate measurement probability values associated with the respective PC-SAR images for the radar signals. The updating at S28 receives the mapping generated at S27 from the storage unit 3 of Fig. 1 and stores updated candidate measurement probability values in the storage unit 3 of Fig. 1. The updating of the candidate measurement probability values is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the respective radar signals. More concretely, the updating is based on equation (2) and is performed iteratively for each data tuple.
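Equation (2) is not reproduced in this excerpt; the sketch below therefore assumes the standard occupancy-grid-map update written as a product of odds, which matches the described product of a map-based first factor and a measurement-based second factor:

```python
def update_cell_probability(p_cell, p_meas, eps=1e-9):
    """Fuse the candidate measurement probability value stored in a PC-SAR cell
    (first factor) with the value of the newly mapped data tuple (second factor).
    Assumed form: binary Bayes / occupancy-grid update as a product of odds."""
    odds = (p_cell / max(1.0 - p_cell, eps)) * (p_meas / max(1.0 - p_meas, eps))
    return odds / (1.0 + odds)

# Example: a cell at 0.5 (unknown) updated with a measurement probability of 0.8
p_new = update_cell_probability(0.5, 0.8)  # -> 0.8
```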
At S29, the method 20 determines a target phase value. The determining of the target phase value at S29 is performed by the target phase determining unit 11 of Fig. 1 and determines a target phase value based on the set of PC-SAR images, wherein the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest updated candidate measurement probability value among the set of PC-SAR images. The determining of the target phase value at S29 receives the set of PC-SAR images and the updated candidate measurement probability values determined at S28 from the storage unit 3 of Fig. 1 and stores the determined target phase value in the storage unit 3 of Fig. 1.
More concretely, the determining of the target phase value at S29 determines for each cell of the grid maps of the set of PC-SAR images a target phase value that indicates a candidate phase value associated with the PC-SAR image from which the candidate measurement probability value of the corresponding cell should be used for the SAR image to be generated by the method 20. The determining of the target phase value at S29 selects, as the target phase value, the candidate phase value associated with the PC-SAR image of the set of PC-SAR images in which a largest candidate measurement probability value is assigned to the corresponding cell.
The method 20 further generates a SAR image that includes a grid map with a same number of columns of cells and a same number of rows of cells as the grid maps of the set of PC-SAR images, wherein the cells of the grid map of the SAR image correspond to a same respective region in the environment as the corresponding cells in the grid maps of the set of PC-SAR images. The method 20 assigns to each cell of the grid map of the SAR image a value that corresponds to the candidate measurement probability value that is assigned to the corresponding cell of the PC-SAR image associated with the target phase value determined at S29. The method 20 then outputs the generated SAR image via the communication unit 4 of Fig. 1.
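A compact sketch of this final assembly and of the maximum search at S29, assuming (for illustration only) that the set of PC-SAR images is stored as a single 3-D array with one layer per candidate phase value:

```python
import numpy as np

def assemble_sar_image(pc_sar_stack, candidate_phases):
    """pc_sar_stack: shape (n_phases, n_rows, n_cols) holding the updated candidate
    measurement probability values of all PC-SAR images.
    Returns the probabilistic SAR image and the per-cell target phase values."""
    best_idx = np.argmax(pc_sar_stack, axis=0)                       # per-cell maximum search
    sar_image = np.take_along_axis(pc_sar_stack, best_idx[None, ...], axis=0)[0]
    target_phase = np.asarray(candidate_phases)[best_idx]            # per-cell target phase value
    return sar_image, target_phase
```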
Accordingly, in some embodiments, the present disclosure enables a processing of a synthetic aperture radar (SAR) using probabilities. In contrast to conventional SAR processing, in some embodiments, not only the complex numbers (e.g., with an absolute value that corresponds to a received power of a radar signal and with an argument that corresponds to a received phase of the radar signal) are added, but probabilistic assumptions are made based on the radar measurement data and models are created which, in some embodiments, lead to a more robust SAR image of the environment.
As a result, in some embodiments, the resulting image is no longer power-dependent but probability-dependent. This may allow a direct state description of each cell of a grid map of a SAR image.
Some embodiments differ from a conventional SAR algorithm in the range evaluation and in the actual SAR processing, which is based on an ordinary backprojection algorithm.
In the following, an embodiment is described which processes a radar signal that represents a frequency ramp in two steps for generating a SAR image.
In a first step, for each received ramp k, a one-dimensional Fast Fourier Transform (FFT) is calculated to transform the time samples acquired by SAR into the frequency domain for determining range data. Since this is amplitude-dependent, it is calculated on the basis of the noise level according to equation (4):

FFT_Range,prob = (1 − exp(−λ · SNR)) · exp(j · ∠FFT_Range)     (4)

Here, FFT_Range,prob represents the Fourier transform of the radar signal after mapping the received power of the radar signal to a measurement probability, and ∠FFT_Range represents the complex argument of the Fourier transform of the radar signal and corresponds to the received phase of the radar signal. The measured angles remain unchanged, but the complex pointers have only a length in the interval between 0 and 1. λ marks a factor with which a limited growth of the measurement probability can be determined. Since this step is not significantly dependent on the noise level, but only oriented on it, the noise level is determined sufficiently exactly by a median calculation, i.e., the noise level corresponds to a median of the received power of the radar signals received from the environment. The signal-to-noise ratio SNR of the radar signal is determined as a quotient of the received power of the radar signal and the noise level.
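The first step can be sketched as follows. The concrete limited-growth mapping 1 − exp(−λ·SNR) and the use of the power spectrum for the median noise estimate are assumptions consistent with the description above, not a verbatim reproduction of equation (4):

```python
import numpy as np

def probabilistic_range_fft(time_samples, lam=1.0):
    """First step: range FFT of one received ramp, with the magnitude replaced by
    a measurement probability in [0, 1] and the phase kept unchanged."""
    fft_range = np.fft.fft(time_samples)              # complex range profile
    power = np.abs(fft_range) ** 2                    # received power per range bin
    noise_level = np.median(power)                    # noise level via median calculation
    snr = power / noise_level                         # signal-to-noise ratio per bin
    prob = 1.0 - np.exp(-lam * snr)                   # assumed limited-growth mapping
    return prob * np.exp(1j * np.angle(fft_range))    # probabilistic complex pointers
```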
In a second step, a backprojection is applied based on the probabilistic range FFT FFT_Range,prob. Since a backprojection does not exist for probabilistic complex pointers, a corresponding model is used, which is presented in the following.

A phase correction is applied to a measured complex pointer A_r according to equation (5):

A'_r = A_r · exp(jΔφ_r)     (5)

The phase correction is based on multiplying the phase correction factor exp(jΔφ_r) of equation (3). Therefore, for a description of the phase correction factor, reference is made to the description regarding equation (3) above. The phase-corrected complex pointer A'_r includes, like the measured complex pointer A_r, an absolute value that corresponds to the received power of the corresponding radar signal and an argument that corresponds to the received phase of the corresponding radar signal, wherein a value of the received phase indicated by A'_r is corrected with respect to a value indicated by A_r.
Since A'_r corresponds to a complex number, it is divided into the two components |A'_r| and ∠A'_r, wherein the absolute value |A'_r| corresponds to the received power of the corresponding radar signal and is represented by the measurement probability determined in the first step.
Due to measurement uncertainties, a distribution of the received phase of the radar signal is approximated based on the argument ∠A'_r with a Gaussian distribution according to equation (6):

p(φ) = |A'_r| · exp(−(φ − μ)² / (2σ²))     (6)

Here, the expected value μ corresponds to the (corrected) received phase of the radar signal, so that μ = ∠A'_r applies. In addition, the standard deviation σ corresponds to a phase uncertainty of the radar sensor and is chosen accordingly. According to the Gaussian distribution, different candidate phase values generated based on the (corrected) received phase ∠A'_r of the radar signal have different probabilities, as indicated in Fig. 3.
Fig. 3 shows a diagram 30 of a probability distribution of a received phase according to an embodiment. The diagram 30 indicates, in a horizontal direction, one period of a received phase of a radar signal and, in a vertical direction, a probability of a corresponding phase value. The diagram 30 shows a probability density function 31 of a Gaussian distribution.
The maximum value of the probability density function 31, which is indicated by an arrow 32, corresponds to both a mean and a mode of the probability density function and is located at a phase value (60° in Fig. 3) that corresponds to the (corrected) received phase ∠A'_r. The maximum value corresponds to the magnitude of FFT_Range,prob, i.e., to |A'_r|, which means that stronger targets have a higher probability. The probability distribution of Fig. 3 has a standard deviation σ that corresponds to a phase uncertainty of the radar sensor.
For obtaining a set of candidate measurement probability values, the probability density function 31 is evaluated at candidate phase values of a set of candidate phase values. In the embodiment of Fig. 3, the set of candidate phase values includes 18 candidate phase values that are evenly distributed over one period, such that the candidate phase values are 20° apart from each other (without limiting the disclosure to these values). The set of candidate measurement probability values includes the values obtained by evaluating the probability density function 31 at the candidate phase values.
A set of data tuples is generated, wherein the data tuples of the set of data tuples each associate a respective candidate phase value of the set of candidate phase values with a candidate measurement probability value of the set of candidate measurement probability values obtained by evaluating the probability density function 31 at the respective candidate phase value. For example, a data tuple, indicated by an arrow 33, of the set of data tuples associates a candidate phase value of 20° with a candidate measurement probability value obtained by evaluating the probability density function 31 at 20°. Likewise, a data tuple, indicated by an arrow 34, of the set of data tuples associates a candidate phase value of 40° with a candidate measurement probability value corresponding to 40°. A further data tuple, indicated by an arrow 35, of the set of data tuples associates a candidate phase value of 60° with a candidate measurement probability value corresponding to 60°. The arrows 33, 34 and 35 are only shown for illustrative purposes; as mentioned, the set of data tuples includes a data tuple for each candidate phase value of the set of candidate phase values.
The set of data tuples is mapped to a set of PC-SAR images. Each PC-SAR image of the set of PC-SAR images includes a grid map, and the set of data tuples generated for a radar signal is mapped to cells of the grid maps of the PC-SAR images that correspond to a region in an environment from which the radar signal is received.
A candidate measurement probability value of the respective cells of the grid maps is updated based on an update formula for occupancy grid maps (OGM) according to equation (2). Since the update formula for OGM is based on scalars, accordingly only scalar update steps can be performed.
Therefore, not only one PC-SAR image is processed but a set of PC-SAR images. Each PC-SAR image of the set of PC-SAR images is associated with a respective candidate phase value of the set of candidate phase values and represents a certain phase value of a resulting image (i.e., of a SAR image to be generated).
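For illustration, the set of PC-SAR images can be held as a single array with one layer per candidate phase value; the grid dimensions and the initial value of 0.5 for an unknown cell are assumptions borrowed from occupancy grid mapping, not requirements of the method:

```python
import numpy as np

n_phases, n_rows, n_cols = 18, 500, 500                   # 18 candidate phases, 5 m x 5 m at 1 cm cells
candidate_phases = np.arange(n_phases) * 20.0             # 0, 20, ..., 340 degrees
pc_sar_stack = np.full((n_phases, n_rows, n_cols), 0.5)   # one PC-SAR image per candidate phase value
```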
The set of data tuples is mapped to the set of PC-SAR images such that the candidate measurement probability value of each data tuple is assigned to a cell corresponding to the radar signal (i.e., corresponding to a region in the environment in which an object has reflected the radar signal) in a PC-SAR image that is associated with the candidate phase value of the respective data tuple. Fig. 4 illustrates a mapping of the set of data tuples to the set of PC-SAR images according to an embodiment. Fig. 4 shows how the three data tuples indicated by the arrows 33, 34 and 35 of Fig. 3 are mapped to the corresponding PC-SAR images.
A PC-SAR image 41 of the set of PC-SAR images is associated with a candidate phase value of 20°. Thus, the data tuple indicated by the arrow 33 of Fig. 3, which also corresponds to a candidate phase value of 20°, is mapped to a cell 42, of the PC-SAR image 41, that corresponds to the radar signal. This is illustrated by an arrow 43 in the cell 42.
Likewise, the PC-SAR image 44 of the set of PC-SAR images is associated with a candidate phase value of 40°, and the data tuple indicated by the arrow 34 of Fig. 3, which also corresponds to a candidate phase value of 40°, is mapped to a cell 45, of the PC-SAR image 44, that corresponds to the radar signal. This is illustrated by an arrow 46 in the cell 45.
Also, the PC-SAR image 47 of the set of PC-SAR images is associated with a candidate phase value of 60°, and the data tuple indicated by the arrow 35 of Fig. 3, which also corresponds to a candidate phase value of 60°, is mapped to a cell 48, of the PC-SAR image 47, that corresponds to the radar signal. This is illustrated by an arrow 49 in the cell 48.
The candidate measurement probability values associated with the respective cells 42, 45 and 48 are then updated in a probabilistic updating based on the update formula according to equation (2) for the candidate measurement probability values of the corresponding data tuples of the set of data tuples. The probabilistic updating in the PC-SAR images of the set of PC-SAR images makes it possible to compensate for the different candidate phase values, which cannot be directly accounted for by the update formula according to equation (2).
The first step and the second step are repeated for further radar signals that are received from different positions within the environment, wherein candidate measurement probability values of the same set of PC-SAR images are updated for all radar signals.
A final result is determined using a maximum search over all PC-SAR images of the set of PC-SAR images, generating a probabilistic SAR image of the environment. As the final result, the probabilistic SAR image is generated with a grid map. For each cell of the grid map of the probabilistic SAR image, a largest candidate measurement probability value is selected from the cells of the grid maps of the set of PC-SAR images that correspond to a same region in the environment as the cell in the probabilistic SAR image, and the selected candidate measurement probability value is assigned to the cell in the probabilistic SAR image. The first step and the second step may, for example, be performed by the circuitry 1 of Fig. 1 and/or by the method 20 of Fig. 2.
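Putting the above sketches together, the overall processing loop could look as follows. All helper functions are the illustrative ones sketched earlier in this description, and the range resolution and the phase correction factor of equation (3) are passed in as parameters because their concrete form is not reproduced here:

```python
import numpy as np

def probabilistic_sar(ramps, sensor_positions, range_resolution, phase_correction,
                      pc_sar_stack, candidate_phases, n_rows, n_cols):
    """End-to-end sketch: first step (probabilistic range FFT) and second step
    (probabilistic backprojection) for all ramps, then the maximum search."""
    for ramp, sensor_pos in zip(ramps, sensor_positions):
        profile = probabilistic_range_fft(ramp)                      # first step
        for bin_idx, pointer in enumerate(profile):                  # second step
            rng = bin_idx * range_resolution
            pointer = pointer * phase_correction(rng)                # equation (5)
            tuples = generate_data_tuples(np.degrees(np.angle(pointer)) % 360.0,
                                          np.abs(pointer))
            cells = cells_for_radar_signal(rng, sensor_pos, n_rows, n_cols)
            for phase_idx, (_, p_meas) in enumerate(tuples):
                for r, c in cells:
                    pc_sar_stack[phase_idx, r, c] = update_cell_probability(
                        pc_sar_stack[phase_idx, r, c], p_meas)
    return assemble_sar_image(pc_sar_stack, candidate_phases)        # maximum search
```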
In some embodiments, the present technology enables robust and nearly power-independent SAR processing. In some embodiments, the resulting map is no longer power-dependent but based on probabilities. Thus, strong and weak targets that lead to constructive overlap of the complex pointers may be represented as “1”, and targets whose overlap is destructive may be represented as “0”. This may allow an environment mapping of all targets and not only such targets which have a high radar cross-section (RCS).
As mentioned, in some embodiments, the present technology enables probabilistic SAR processing based on radar raw data.
Today, in some instances, the SAR processing is based on an amplitude-based summation of complex numbers. As a consequence, in some instances, the resulting environment representation is strongly amplitude-dependent, wherein weak targets can no longer be identified due to strong targets and the dynamics of the map.
In some embodiments of the present disclosure, this is no longer the case, since the present technology provides a probabilistic environmental representation. Thus, in some embodiments, each cell has an occupancy probability which can be determined as a function of the phases and amplitudes. In some embodiments, this results in a map in which weak and strong targets have similar occupancy probabilities as long as the corrected phase leads to a constructive superposition of all complex pointers. In some embodiments, this leads to a significant robustness gain due to the amplitude independence.
In some embodiments, no additional hardware is needed for implementing the present technology, only adapted software. Thus, in some embodiments, the algorithm may be used for any SAR processing and is not limited, e.g., to a certain hardware such as a certain sensor configuration.
In the following, examples of applications of the present technology are discussed.
The technology according to an embodiment of the present disclosure is applicable to various products. For example, the technology according to an embodiment of the present disclosure may be implemented as a device included in a mobile body that is any of various kinds of automobiles, electric vehicles, hybrid electric vehicles, motorcycles, bicycles, personal mobility vehicles, airplanes, drones, ships, robots, construction machinery, agricultural machinery (tractors), and the like.
Fig. 5 is a block diagram depicting an example of schematic configuration of a vehicle control system 7000 as an example of a mobile body control system to which the technology according to an embodiment of the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected to each other via a communication network 7010. In the example depicted in Fig. 5, the vehicle control system 7000 includes a driving system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside-vehicle information detecting unit 7400, an in-vehicle information detecting unit 7500, and an integrated control unit 7600. The communication network 7010 connecting the plurality of control units to each other may, for example, be a vehicle-mounted communication network compliant with an arbitrary standard such as controller area network (CAN), local interconnect network (LIN), local area network (LAN), FlexRay (registered trademark), or the like.
Each of the control units includes: a microcomputer that performs arithmetic processing according to various kinds of programs; a storage section that stores the programs executed by the microcomputer, parameters used for various kinds of operations, or the like; and a driving circuit that drives various kinds of control target devices. Each of the control units further includes: a network interface (I/F) for performing communication with other control units via the communication network 7010; and a communication I/F for performing communication with a device, a sensor, or the like within and without the vehicle by wire communication or radio communication. A functional configuration of the integrated control unit 7600 illustrated in Fig. 5 includes a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning section 7640, a beacon receiving section 7650, an in-vehicle device I/F 7660, a sound/image output section 7670, a vehicle-mounted network I/F 7680, and a storage section 7690. The other control units similarly include a microcomputer, a communication I/F, a storage section, and the like.
The driving system control unit 7100 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs. For example, the driving system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like. The driving system control unit 7100 may have a function as a control device of an antilock brake system (ABS), electronic stability control (ESC), or the like.
The driving system control unit 7100 is connected with a vehicle state detecting section 7110. The vehicle state detecting section 7110, for example, includes at least one of a gyro sensor that detects the angular velocity of axial rotational movement of a vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting an amount of operation of an accelerator pedal, an amount of operation of a brake pedal, the steering angle of a steering wheel, an engine speed or the rotational speed of wheels, and the like. The driving system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detecting section 7110, and controls the internal combustion engine, the driving motor, an electric power steering device, the brake device, and the like.
The body system control unit 7200 controls the operation of various kinds of devices provided to the vehicle body in accordance with various kinds of programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various kinds of lamps such as a headlamp, a backup lamp, a brake lamp, a turn signal, a fog lamp, or the like. In this case, radio waves transmitted from a mobile device as an alternative to a key or signals of various kinds of switches can be input to the body system control unit 7200. The body system control unit 7200 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like of the vehicle.
The battery control unit 7300 controls a secondary battery 7310, which is a power supply source for the driving motor, in accordance with various kinds of programs. For example, the battery control unit 7300 is supplied with information about a battery temperature, a battery output voltage, an amount of charge remaining in the battery, or the like from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and performs control for regulating the temperature of the secondary battery 7310 or controls a cooling device provided to the battery device or the like.
The outside-vehicle information detecting unit 7400 detects information about the outside of the vehicle including the vehicle control system 7000. For example, the outside-vehicle information detecting unit 7400 is connected with at least one of an imaging section 7410 and an outside-vehicle information detecting section 7420. The imaging section 7410 includes at least one of a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The outside-vehicle information detecting section 7420, for example, includes at least one of an environmental sensor for detecting current atmospheric conditions or weather conditions and a peripheral information detecting sensor for detecting another vehicle, an obstacle, a pedestrian, or the like on the periphery of the vehicle including the vehicle control system 7000.
The environmental sensor, for example, may be at least one of a rain drop sensor detecting rain, a fog sensor detecting a fog, a sunshine sensor detecting a degree of sunshine, and a snow sensor detecting a snowfall. The peripheral information detecting sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR device (Light detection and Ranging device, or Laser imaging detection and ranging device). Each of the imaging section 7410 and the outside-vehicle information detecting section 7420 may be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
Fig. 6 depicts an example of installation positions of the imaging section 7410 and the outside- vehicle information detecting section 7420. Imaging sections 7910, 7912, 7914, 7916, and 7918 are, for example, disposed at at least one of positions on a front nose, sideview mirrors, a rear bumper, and a back door of the vehicle 7900 and a position on an upper portion of a windshield within the interior of the vehicle. The imaging section 7910 provided to the front nose and the imaging section 7918 provided to the upper portion of the windshield within the interior of the vehicle obtain mainly an image of the front of the vehicle 7900. The imaging sections 7912 and 7914 provided to the sideview mirrors obtain mainly an image of the sides of the vehicle 7900. The imaging section 7916 provided to the rear bumper or the back door obtains mainly an image of the rear of the vehicle 7900. The imaging section 7918 provided to the upper portion of the windshield within the interior of the vehicle is used mainly to detect a preceding vehicle, a pedestrian, an obstacle, a signal, a traffic sign, a lane, or the like.
Incidentally, Fig. 6 depicts an example of photographing ranges of the respective imaging sections 7910, 7912, 7914, and 7916. An imaging range a represents the imaging range of the imaging section 7910 provided to the front nose. Imaging ranges b and c respectively represent the imaging ranges of the imaging sections 7912 and 7914 provided to the sideview mirrors. An imaging range d represents the imaging range of the imaging section 7916 provided to the rear bumper or the back door. A bird’s-eye image of the vehicle 7900 as viewed from above can be obtained by superimposing image data imaged by the imaging sections 7910, 7912, 7914, and 7916, for example.
Outside-vehicle information detecting sections 7920, 7922, 7924, 7926, 7928, and 7930 provided to the front, rear, sides, and corners of the vehicle 7900 and the upper portion of the windshield within the interior of the vehicle may be, for example, an ultrasonic sensor or a radar device. The outside-vehicle information detecting sections 7920, 7926, and 7930 provided to the front nose of the vehicle 7900, the rear bumper, the back door of the vehicle 7900, and the upper portion of the windshield within the interior of the vehicle may be a LIDAR device, for example. These outside-vehicle information detecting sections 7920 to 7930 are used mainly to detect a preceding vehicle, a pedestrian, an obstacle, or the like.
Returning to Fig. 5, the description will be continued. The outside-vehicle information detecting unit 7400 makes the imaging section 7410 image an image of the outside of the vehicle, and receives imaged image data. In addition, the outside- vehicle information detecting unit 7400 receives detection information from the outside-vehicle information detecting section 7420 connected to the outside-vehicle information detecting unit 7400. In a case where the outside-vehicle information detecting section 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the outside-vehicle information detecting unit 7400 transmits an ultrasonic wave, an electromagnetic wave, or the like, and receives information of a received reflected wave. On the basis of the received information, the outside-vehicle information detecting unit 7400 may perform processing of detecting an object such as a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto. The outside-vehicle information detecting unit 7400 may perform environment recognition processing of recognizing a rainfall, a fog, road surface conditions, or the like on the basis of the received information. The outside-vehicle information detecting unit 7400 may calculate a distance to an object outside the vehicle on the basis of the received information.
In addition, on the basis of the received image data, the outside-vehicle information detecting unit 7400 may perform image recognition processing of recognizing a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto. The outside-vehicle information detecting unit 7400 may subject the received image data to processing such as distortion correction, alignment, or the like, and combine the image data imaged by a plurality of different imaging sections 7410 to generate a bird’s-eye image or a panoramic image. The outside- vehicle information detecting unit 7400 may perform viewpoint conversion processing using the image data imaged by the imaging section 7410 including the different imaging parts.
The in-vehicle information detecting unit 7500 detects information about the inside of the vehicle. The in-vehicle information detecting unit 7500 is, for example, connected with a driver state detecting section 7510 that detects the state of a driver. The driver state detecting section 7510 may include a camera that images the driver, a biosensor that detects biological information of the driver, a microphone that collects sound within the interior of the vehicle, or the like. The biosensor is, for example, disposed in a seat surface, the steering wheel, or the like, and detects biological information of an occupant sitting in a seat or the driver holding the steering wheel. On the basis of detection information input from the driver state detecting section 7510, the in-vehicle information detecting unit 7500 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing. The in-vehicle information detecting unit 7500 may subject an audio signal obtained by the collection of the sound to processing such as noise canceling processing or the like.
The integrated control unit 7600 controls general operation within the vehicle control system 7000 in accordance with various kinds of programs. The integrated control unit 7600 is connected with an input section 7800. The input section 7800 is implemented by a device capable of input operation by an occupant, such, for example, as a touch panel, a button, a microphone, a switch, a lever, or the like. The integrated control unit 7600 may be supplied with data obtained by voice recognition of voice input through the microphone. The input section 7800 may, for example, be a remote control device using infrared rays or other radio waves, or an external connecting device such as a mobile telephone, a personal digital assistant (PDA), or the like that supports operation of the vehicle control system 7000. The input section 7800 may be, for example, a camera. In that case, an occupant can input information by gesture. Alternatively, data may be input which is obtained by detecting the movement of a wearable device that an occupant wears. Further, the input section 7800 may, for example, include an input control circuit or the like that generates an input signal on the basis of information input by an occupant or the like using the above-described input section 7800, and which outputs the generated input signal to the integrated control unit 7600. An occupant or the like inputs various kinds of data or gives an instruction for processing operation to the vehicle control system 7000 by operating the input section 7800.
The storage section 7690 may include a read only memory (ROM) that stores various kinds of programs executed by the microcomputer and a random access memory (RAM) that stores various kinds of parameters, operation results, sensor values, or the like. In addition, the storage section 7690 may be implemented by a magnetic storage device such as a hard disc drive (HDD) or the like, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
The general-purpose communication I/F 7620 is a communication I/F used widely, which communication I/F mediates communication with various apparatuses present in an external environment 7750. The general-purpose communication I/F 7620 may implement a cellular communication protocol such as global system for mobile communications (GSM (registered trademark)), worldwide interoperability for microwave access (WiMAX (registered trademark)), long term evolution (LTE (registered trademark)), LTE-advanced (LTE-A), or the like, or another wireless communication protocol such as wireless LAN (referred to also as wireless fidelity (Wi-Fi (registered trademark)), Bluetooth (registered trademark), or the like. The general-purpose communication I/F 7620 may, for example, connect to an apparatus (for example, an application server or a control server) present on an external network (for example, the Internet, a cloud network, or a company-specific network) via a base station or an access point. In addition, the general-purpose communication I/F 7620 may connect to a terminal present in the vicinity of the vehicle (which terminal is, for example, a terminal of the driver, a pedestrian, or a store, or a machine type communication (MTC) terminal) using a peer to peer (P2P) technology, for example.
The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol developed for use in vehicles. The dedicated communication I/F 7630 may implement a standard protocol such, for example, as wireless access in vehicle environment (WAVE), which is a combination of institute of electrical and electronic engineers (IEEE) 802.11p as a lower layer and IEEE 1609 as a higher layer, dedicated short range communications (DSRC), or a cellular communication protocol. The dedicated communication I/F 7630 typically carries out V2X communication as a concept including one or more of communication between a vehicle and a vehicle (Vehicle to Vehicle), communication between a road and a vehicle (Vehicle to Infrastructure), communication between a vehicle and a home (Vehicle to Home), and communication between a pedestrian and a vehicle (Vehicle to Pedestrian).
The positioning section 7640, for example, performs positioning by receiving a global navigation satellite system (GNSS) signal from a GNSS satellite (for example, a GPS signal from a global positioning system (GPS) satellite), and generates positional information including the latitude, longitude, and altitude of the vehicle. Incidentally, the positioning section 7640 may identify a current position by exchanging signals with a wireless access point, or may obtain the positional information from a terminal such as a mobile telephone, a personal handyphone system (PHS), or a smart phone that has a positioning function.
The beacon receiving section 7650, for example, receives a radio wave or an electromagnetic wave transmitted from a radio station installed on a road or the like, and thereby obtains information about the current position, congestion, a closed road, a necessary time, or the like. Incidentally, the function of the beacon receiving section 7650 may be included in the dedicated communication I/F 7630 described above.
The in-vehicle device I/F 7660 is a communication interface that mediates connection between the microcomputer 7610 and various in-vehicle devices 7760 present within the vehicle. The in-vehicle device I/F 7660 may establish wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), near field communication (NFC), or wireless universal serial bus (WUSB). In addition, the in-vehicle device I/F 7660 may establish wired connection by universal serial bus (USB), high-definition multimedia interface (HDMI (registered trademark)), mobile high-definition link (MHL), or the like via a connection terminal (and a cable if necessary) not depicted in the figures. The in-vehicle devices 7760 may, for example, include at least one of a mobile device and a wearable device possessed by an occupant and an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a path to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760. The vehicle-mounted network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The vehicle-mounted network I/F 7680 transmits and receives signals or the like in conformity with a predetermined protocol supported by the communication network 7010.
The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various kinds of programs on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680. For example, the microcomputer 7610 may calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the obtained information about the inside and outside of the vehicle, and output a control command to the driving system control unit 7100. For example, the microcomputer 7610 may perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS) which functions include collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision of the vehicle, a warning of deviation of the vehicle from a lane, or the like. In addition, the microcomputer 7610 may perform cooperative control intended for automatic driving, which makes the vehicle travel autonomously without depending on the operation of the driver, or the like, by controlling the driving force generating device, the steering mechanism, the braking device, or the like on the basis of the obtained information about the surroundings of the vehicle.
The microcomputer 7610 may generate three-dimensional distance information between the vehicle and an object such as a surrounding structure, a person, or the like, and generate local map information including information about the surroundings of the current position of the vehicle, on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680. In addition, the microcomputer 7610 may predict danger such as collision of the vehicle, approaching of a pedestrian or the like, an entry to a closed road, or the like on the basis of the obtained information, and generate a warning signal. The warning signal may, for example, be a signal for producing a warning sound or lighting a warning lamp.
The sound/image output section 7670 transmits an output signal of at least one of a sound and an image to an output device capable of visually or auditorily notifying information to an occupant of the vehicle or the outside of the vehicle. In the example of Fig. 5, an audio speaker 7710, a display section 7720, and an instrument panel 7730 are illustrated as the output device. The display section 7720 may, for example, include at least one of an on-board display and a head-up display. The display section 7720 may have an augmented reality (AR) display function. The output device may be other than these devices, and may be another device such as headphones, a wearable device such as an eyeglass type display worn by an occupant or the like, a projector, a lamp, or the like. In a case where the output device is a display device, the display device visually displays results obtained by various kinds of processing performed by the microcomputer 7610 or information received from another control unit in various forms such as text, an image, a table, a graph, or the like. In addition, in a case where the output device is an audio output device, the audio output device converts an audio signal constituted of reproduced audio data or sound data or the like into an analog signal, and auditorily outputs the analog signal.
Incidentally, at least two control units connected to each other via the communication network 7010 in the example depicted in Fig. 5 may be integrated into one control unit. Alternatively, each individual control unit may include a plurality of control units. Further, the vehicle control system 7000 may include another control unit not depicted in the figures. In addition, part or the whole of the functions performed by one of the control units in the above description may be assigned to another control unit. That is, predetermined arithmetic processing may be performed by any of the control units as long as information is transmitted and received via the communication network 7010. Similarly, a sensor or a device connected to one of the control units may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
Incidentally, a computer program for realizing the functions of the information processing device 100 according to the present embodiment described with reference to Fig. 5 can be implemented in one of the control units or the like. In addition, a computer readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. In addition, the above-described computer program may be distributed via a network, for example, without the recording medium being used.
It should be recognized that the embodiments describe methods with an exemplary ordering of method steps. The specific ordering of method steps is however given for illustrative purposes only and should not be construed as binding. For example, the ordering of S22 and S23 in the embodiment of Fig. 2 may be exchanged. Other changes of the ordering of method steps may be apparent to the skilled person. Please note that the division of the circuitry 1 into units 2 to 11 and the division of the unit 7 into the units 12 and 13 is only made for illustration purposes and that the present disclosure is not limited to any specific division of functions in specific units. It should also be noted that the division of the control or circuitry 7600 of Fig. 5 into units 7610 to 7690 is only made for illustration purposes and that the present disclosure is not limited to any specific division of functions in specific units. For instance, at least parts of the circuitry 1 or 7600 could be implemented by a respective programmed processor, field programmable gate array (FPGA), dedicated circuits, and the like.
All units and entities described in this specification and claimed in the appended claims can, if not stated otherwise, be implemented as integrated circuit logic, for example on a chip, and functionality provided by such units and entities can, if not stated otherwise, be implemented by software.
In so far as the embodiments of the disclosure described above are implemented, at least in part, using software-controlled data processing apparatus, it will be appreciated that a computer program providing such software control and a transmission, storage or other medium by which such a computer program is provided are envisaged as aspects of the present disclosure.
Note that the present technology can also be configured as described below.
(1) A circuitry for generating a synthetic aperture radar image, wherein the circuitry is configured to: obtain, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; map the received power of the radar signal to a measurement probability; generate a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; map the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determine a target phase value based on the set of PC-SAR images.
(2) The circuitry of (1), wherein the mapping of the received power includes: determining a signal-to-noise ratio of the radar signal based on the received power; and determining the measurement probability based on the signal-to-noise ratio of the radar signal.

(3) The circuitry of (1) or (2), wherein the distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein at least one of a mean and a mode of the probability distribution corresponds to the received phase of the radar signal and has a probability value that corresponds to the received power of the radar signal.
(4) The circuitry of any one of (1) to (3), wherein the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest candidate measurement probability value among the set of PC-SAR images.
(5) The circuitry of any one of (1) to (4), further configured to: obtain, based on at least one further radar signal detected by the radar sensor, further radar measurement data that indicate a received power and a received phase of the at least one further radar signal; perform the mapping of the received power, the generating of the set of data tuples and the mapping of the set of data tuples accordingly for the at least one further radar signal; update, based on the set of data tuples generated for the at least one further radar signal, the candidate measurement probability values associated with the respective PC-SAR images for the at least one further radar signal; and determine the target phase value based on the updated candidate measurement probability values.
(6) The circuitry of (5), wherein the updating is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the at least one further radar signal.
(7) The circuitry of (5) or (6), wherein the radar signal and the at least one further radar signal are detected by the radar sensor from different positions.
(8) The circuitry of any one of (1) to (7), wherein the radar measurement data further indicate a range of the radar signal; wherein the mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the radar signal.
(9) The circuitry of (8), wherein the PC-SAR images of the set of PC-SAR images include grid maps, and the portions of the PC-SAR images correspond to cells of the grid maps.

(10) The circuitry of (8) or (9), further configured to: correct the received phase of the radar signal based on the range of the radar signal; and perform the generating and the mapping of the set of data tuples based on the corrected phase.
(11) A method for generating a synthetic aperture radar image, the method comprising: obtaining, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; mapping the received power of the radar signal to a measurement probability; generating a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; mapping the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determining a target phase value based on the set of PC-SAR images.
(12) The method of (11), wherein the mapping of the received power includes: determining a signal-to-noise ratio of the radar signal based on the received power; and determining the measurement probability based on the signal-to-noise ratio of the radar signal.
(13) The method of (11) or (12), wherein the distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein at least one of a mean and a mode of the probability distribution corresponds to the received phase of the radar signal and has a probability value that corresponds to the received power of the radar signal.
(14) The method of any one of (11) to (13), wherein the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest candidate measurement probability value among the set of PC-SAR images.
(15) The method of any one of (11) to (14), further comprising: obtaining, based on at least one further radar signal detected by the radar sensor, further radar measurement data that indicate a received power and a received phase of the at least one further radar signal; performing the mapping of the received power, the generating of the set of data tuples and the mapping of the set of data tuples accordingly for the at least one further radar signal; updating, based on the set of data tuples generated for the at least one further radar signal, the candidate measurement probability values associated with the respective PC-SAR images for the at least one further radar signal; and determining the target phase value based on the updated candidate measurement probability values.
(16) The method of (15), wherein the updating is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the at least one further radar signal.
(17) The method of (15) or (16), wherein the radar signal and the at least one further radar signal are detected by the radar sensor from different positions.
(18) The method of any one of (11) to (17), wherein the radar measurement data further indicate a range of the radar signal; wherein the mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the radar signal.
(19) The method of (18), wherein the PC-SAR images of the set of PC-SAR images include grid maps, and the portions of the PC-SAR images correspond to cells of the grid maps.
(20) The method of (18) or (19), further comprising: correcting the received phase of the radar signal based on the range of the radar signal; and performing the generating and the mapping of the set of data tuples based on the corrected phase.
(21) A computer program comprising program code causing a computer to perform the method according to anyone of (11) to (20), when being carried out on a computer.
(22) A non-transitory computer-readable recording medium that stores therein a computer program product, which, when executed by a processor, causes the method according to anyone of (11) to (20) to be performed.

Claims

1. A circuitry for generating a synthetic aperture radar image, wherein the circuitry is configured to: obtain, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; map the received power of the radar signal to a measurement probability; generate a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; map the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determine a target phase value based on the set of PC-SAR images.
2. The circuitry of claim 1, wherein the mapping of the received power includes: determining a signal-to-noise ratio of the radar signal based on the received power; and determining the measurement probability based on the signal-to-noise ratio of the radar signal.
3. The circuitry of claim 1, wherein the distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein at least one of a mean and a mode of the probability distribution corresponds to the received phase of the radar signal and has a probability value that corresponds to the received power of the radar signal.
4. The circuitry of claim 1, wherein the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest candidate measurement probability value among the set of PC-SAR images.
5. The circuitry of claim 1, further configured to: obtain, based on at least one further radar signal detected by the radar sensor, further radar measurement data that indicate a received power and a received phase of the at least one further radar signal; perform the mapping of the received power, the generating of the set of data tuples and the mapping of the set of data tuples accordingly for the at least one further radar signal; update, based on the set of data tuples generated for the at least one further radar signal, the candidate measurement probability values associated with the respective PC-SAR images for the at least one further radar signal; and determine the target phase value based on the updated candidate measurement probability values.
6. The circuitry of claim 5, wherein the updating is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the at least one further radar signal.
7. The circuitry of claim 5, wherein the radar signal and the at least one further radar signal are detected by the radar sensor from different positions.
8. The circuitry of claim 1, wherein the radar measurement data further indicate a range of the radar signal; wherein the mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the radar signal.
9. The circuitry of claim 8, wherein the PC-SAR images of the set of PC-SAR images include grid maps, and the portions of the PC-SAR images correspond to cells of the grid maps.
10. The circuitry of claim 8, further configured to: correct the received phase of the radar signal based on the range of the radar signal; and perform the generating and the mapping of the set of data tuples based on the corrected phase.
11. A method for generating a synthetic aperture radar image, the method comprising: obtaining, based on a radar signal detected by a radar sensor, radar measurement data that indicate a received power and a received phase of the radar signal; mapping the received power of the radar signal to a measurement probability; generating a set of data tuples by associating each candidate phase value of a set of candidate phase values with a candidate measurement probability value of a set of candidate measurement probability values that correspond to a distribution of the measurement probability according to a phase uncertainty of the received phase; mapping the set of data tuples to a set of phase candidate synthetic aperture radar, PC-SAR, images; and determining a target phase value based on the set of PC-SAR images.
12. The method of claim 11, wherein the mapping of the received power includes: determining a signal-to-noise ratio of the radar signal based on the received power; and determining the measurement probability based on the signal-to-noise ratio of the radar signal.
13. The method of claim 11, wherein the distribution of the candidate measurement probability values of the set of data tuples corresponds to a probability distribution, wherein at least one of a mean and a mode of the probability distribution corresponds to the received phase of the radar signal and has a probability value that corresponds to the received power of the radar signal.
14. The method of claim 11, wherein the determining of the target phase value includes selecting, as the target phase value, a candidate phase value that is mapped to a PC-SAR image associated with a highest candidate measurement probability value among the set of PC-SAR images.
15. The method of claim 11, further comprising:
obtaining, based on at least one further radar signal detected by the radar sensor, further radar measurement data that indicate a received power and a received phase of the at least one further radar signal;
performing the mapping of the received power, the generating of the set of data tuples and the mapping of the set of data tuples accordingly for the at least one further radar signal;
updating, based on the set of data tuples generated for the at least one further radar signal, the candidate measurement probability values associated with the respective PC-SAR images for the at least one further radar signal; and
determining the target phase value based on the updated candidate measurement probability values.
16. The method of claim 15, wherein the updating is based on a product of a first factor and a second factor, wherein the first factor is based on the candidate measurement probability values associated with the respective PC-SAR images and the second factor is based on the respective candidate measurement probability values of the at least one further radar signal.
17. The method of claim 15, wherein the radar signal and the at least one further radar signal are detected by the radar sensor from different positions.
18. The method of claim 11, wherein the radar measurement data further indicate a range of the radar signal; wherein the mapping of the set of data tuples includes mapping the set of data tuples to portions, of the respective PC-SAR images, that correspond to the radar signal.
19. The method of claim 18, wherein the PC-SAR images of the set of PC-SAR images include grid maps, and the portions of the PC-SAR images correspond to cells of the grid maps.
20. The method of claim 18, further comprising: correcting the received phase of the radar signal based on the range of the radar signal; and performing the generating and the mapping of the set of data tuples based on the corrected phase.
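Read together, the method claims describe a per-detection pipeline. The following wiring is purely illustrative: it reuses the hypothetical helper functions sketched after claims 4, 9, 10, 12 and 13 above, and every numeric value (powers, wavelength, grid size, cell size) is an assumption, not taken from the application.

    import numpy as np

    # One detection, with assumed values: received power, received phase [rad],
    # range [m], azimuth [rad]; a 77 GHz wavelength and a noise floor are assumed too.
    power, phase, range_m, azimuth = 2.5e-9, 1.2, 14.7, 0.3
    wavelength_m, noise_power, sensor_xy = 0.0039, 1.0e-10, (0.0, 0.0)

    p_meas = measurement_probability(power, noise_power)          # cf. claim 12
    phi = range_corrected_phase(phase, range_m, wavelength_m)     # cf. claim 20
    tuples = candidate_tuples(phi, p_meas)                        # cf. claims 11 and 13
    candidate_phases = np.array([p for p, _ in tuples])

    # One PC-SAR image (here a 64 x 64 grid map) per candidate phase, uniform prior.
    images = [np.full((64, 64), 1.0 / len(tuples)) for _ in tuples]
    images = map_tuples_to_cells(images, tuples, sensor_xy, azimuth,
                                 range_m, cell_size=0.25)         # cf. claims 18 and 19
    target_phase_map = select_target_phase(candidate_phases, images)  # cf. claim 14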
PCT/EP2023/074416 2022-09-09 2023-09-06 Circuitry and method WO2024052392A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP22194867 2022-09-09
EP22194867.2 2022-09-09

Publications (1)

Publication Number Publication Date
WO2024052392A1 (en) 2024-03-14

Family

ID=83271065

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/074416 WO2024052392A1 (en) 2022-09-09 2023-09-06 Circuitry and method

Country Status (1)

Country Link
WO (1) WO2024052392A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170269201A1 (en) * 2016-03-16 2017-09-21 Denso It Laboratory, Inc. Surrounding Environment Estimation Device and Surrounding Environment Estimating Method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
GREBNER TIMO ET AL: "Radar-Based Mapping of the Environment: Occupancy Grid-Map Versus SAR", IEEE MICROWAVE AND WIRELESS COMPONENTS LETTERS, IEEE SERVICE CENTER, NEW YORK, NY, US, vol. 32, no. 3, 3 February 2022 (2022-02-03), pages 253 - 256, XP011902462, ISSN: 1531-1309, [retrieved on 20220309], DOI: 10.1109/LMWC.2022.3145661 *
LIHONG KANG ET AL: "A novel method for dual channel POLSAR raw data compression", 2012 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), IEEE, 22 July 2012 (2012-07-22), pages 4561 - 4564, XP032468409, ISBN: 978-1-4673-1160-1, DOI: 10.1109/IGARSS.2012.6350455 *
THRUN, SEBASTIAN: "Learning occupancy grids with forward models", PROCEEDINGS 2001 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, 2001
GREBNER, TIMO ET AL.: "Radar-Based Mapping of the Environment: Occupancy Grid-Map Versus SAR", IEEE MICROWAVE AND WIRELESS COMPONENTS LETTERS, 2022

Similar Documents

Publication Publication Date Title
US10970877B2 (en) Image processing apparatus, image processing method, and program
WO2017057044A1 (en) Information processing device and information processing method
CN108028883B (en) Image processing apparatus, image processing method, and program
US11915452B2 (en) Information processing device and information processing method
JP7294148B2 (en) CALIBRATION DEVICE, CALIBRATION METHOD AND PROGRAM
US11255959B2 (en) Apparatus, method and computer program for computer vision
US20220390557A9 (en) Calibration apparatus, calibration method, program, and calibration system and calibration target
US11585898B2 (en) Signal processing device, signal processing method, and program
US20230219495A1 (en) Signal processing device, light adjusting control method, signal processing program, and light adjusting system
US20220012552A1 (en) Information processing device and information processing method
US11436706B2 (en) Image processing apparatus and image processing method for improving quality of images by removing weather elements
US20220308200A1 (en) Radar data determination circuitry and radar data determination method
WO2024052392A1 (en) Circuitry and method
JP2023122597A (en) Information processor, information processing method and program
WO2021065510A1 (en) Information processing device, information processing method, information processing system, and program
US20230316546A1 (en) Camera-radar fusion using correspondences
US20230119187A1 (en) Circuitry and method
WO2022059489A1 (en) Information processing device, information processing method, and program
WO2022196316A1 (en) Information processing device, information processing method, and program
US20240127042A1 (en) Information processing device, information processing system, information processing method, and recording medium
US11989901B2 (en) Information processing device and information processing method
US20220148283A1 (en) Information processing apparatus, information processing method, and program
US20230161026A1 (en) Circuitry and method
US20240004075A1 (en) Time-of-flight object detection circuitry and time-of-flight object detection method
CN117741575A (en) Generating 3D mesh map and point cloud using data fusion of mesh radar sensors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23762545

Country of ref document: EP

Kind code of ref document: A1