CN114708378B - Automobile camera simulation method, electronic device and storage medium

Automobile camera simulation method, electronic device and storage medium

Info

Publication number
CN114708378B
CN114708378B (application CN202210637513.8A)
Authority
CN
China
Prior art keywords
image
camera
pixel
color
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210637513.8A
Other languages
Chinese (zh)
Other versions
CN114708378A (en)
Inventor
王强
王寅东
孟佳旭
陈旭亮
杨永翌
孙琪佳
王剑飞
侯全杉
赵帅
朱向雷
赵鹏超
冷炘伦
孙博华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongqi Zhilian Technology Co ltd
Original Assignee
Automotive Data of China Tianjin Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Automotive Data of China Tianjin Co Ltd filed Critical Automotive Data of China Tianjin Co Ltd
Priority to CN202210637513.8A priority Critical patent/CN114708378B/en
Publication of CN114708378A publication Critical patent/CN114708378A/en
Application granted granted Critical
Publication of CN114708378B publication Critical patent/CN114708378B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 — 3D [Three Dimensional] image rendering
    • G06T 15/50 — Lighting effects
    • G06T 15/506 — Illumination models
    • G06T 15/80 — Shading
    • G06T 7/00 — Image analysis
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 — Subject of image; Context of image processing
    • G06T 2207/30244 — Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the invention discloses an automobile camera simulation method, an electronic device and a storage medium. The method comprises the following steps: acquiring a light source model and a reflector model in the traffic space where the automobile is located; determining the color of each position on the surface of the reflector model after the light emitted by the light source model, before any diffuse reflection, undergoes its first diffuse reflection on that surface; simulating, according to the color, the undistorted first image data of the diffusely reflected light projected onto the camera light-sensing plate; and simulating, according to the distortion parameters of the camera, the second image data of the first image after camera distortion. The embodiment simulates the action mechanism and the errors of the camera sensor and improves the simulation accuracy.

Description

Automobile camera simulation method, electronic device and storage medium
Technical Field
The embodiment of the invention relates to the field of automobile simulation, in particular to an automobile camera simulation method, electronic equipment and a storage medium.
Background
The operation of an intelligent automobile can be roughly summarized as three processes: perception, decision and control. Simulation of the decision and control processes has received ample attention, while simulation of the perception link still requires further research.
The main purpose of simulating the perception link of an intelligent automobile is to provide the automobile with realistic sensing results. In the prior art, intelligent-automobile simulation software adopts a perfect-sensor scheme for the perception link: an environment ground truth is extracted from the environment data of the simulation software and output directly as the measurement result of the sensor, so the various real characteristics of an actual sensor cannot be simulated.
Disclosure of Invention
The embodiment of the invention provides an automobile camera simulation method, an electronic device and a storage medium, which simulate the action mechanism and the errors of the camera sensor and improve the simulation accuracy.
In a first aspect, an embodiment of the present invention provides an automobile camera simulation method, including:
acquiring a light source model and a reflector model in a traffic space where an automobile is located;
determining the color of each position on the surface of the reflector model after the light emitted by the light source model, before any diffuse reflection, undergoes its first diffuse reflection on the surface of the reflector model;
simulating, according to the color, undistorted first image data of the diffusely reflected light projected onto the camera light-sensing plate;
simulating, according to the distortion parameters of the camera, second image data of the first image after camera distortion;
wherein the simulation process of the first image data comprises: constructing a blank first image, and performing the following operations on each pixel of the first image: determining, according to the internal parameters and the external parameters of the camera, the exit position of the diffusely reflected light projected at the pixel, and taking the color of the exit position as the color of the pixel.
In a second aspect, an embodiment of the present invention provides an electronic device, including: memory, processor and computer program stored on the memory and executable by the processor, characterized in that the processor implements the above method when executing the computer program.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium, which stores computer instructions for causing the computer to execute the above method.
According to the embodiment of the invention, the process from the emission of light to its reception and imaging by the camera is divided, according to the propagation characteristics of light, into two stages: before the first diffuse reflection and after the first diffuse reflection. In the first stage, forward simulation is adopted, and the color of the light emitted after its first diffuse reflection on the reflector surface is calculated. In the second stage, reverse simulation is adopted: the exit position of the corresponding diffusely reflected ray is deduced backwards from each pixel of the light-sensing plate, and the color at that exit position is taken as the color of the pixel. In this way the actual physical process by which the camera senses light and forms an image is simulated, the simulation accuracy is guaranteed, and the calculation process is simplified. Meanwhile, the simulation specifically considers the camera distortion process, so sensor errors can be effectively reflected, which solves the problem of poor simulation accuracy caused by insufficient consideration of the action mechanism and errors of the camera sensor in existing methods.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a flowchart of an automobile camera simulation method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a first image according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a coordinate system conversion process of a camera imaging system according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a calibration picture according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of determining a real exit position of a light ray after diffuse reflection according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of determining four pixels around an original position according to an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should also be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
The automobile camera simulation method provided by the embodiment of the invention simulates the process from the sensing of light in the traffic environment by the camera sensor to the output of image data. The overall physical process from the emission of light to the output of image data by the camera is first described as follows: the light source emits light that irradiates the reflector surface, where the different surface materials absorb, reflect and refract it; after a series of absorptions, reflections and refractions, the light finally enters the camera lens and is projected onto the light-sensing plate; the light information received by the light-sensing plate constitutes the corresponding image data, which is then output.
Based on the above overall physical process, fig. 1 is a flowchart of the automobile camera simulation method provided in an embodiment of the present invention. The method is suitable for simulating the environment-perception function of an automobile camera and is executed by an electronic device. As shown in fig. 1, the method specifically includes:
and S110, acquiring a light source model and a reflector model in a traffic space where the automobile is located.
The light source model and the reflector model simulate the real light sources and reflectors in the traffic space where the automobile is located. Light sources generally include the sun, street lamps, etc., and the light source model needs to define physical quantities such as the color (wavelength), direction and intensity of the emitted light. Reflectors include objects in the traffic environment such as buildings and plants, and the reflector model needs to define the optical properties of each object, such as the material reflectivity and absorptivity.
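To make the two models concrete, the following minimal sketch (in Python, not part of the patent text) shows one way the physical quantities named above could be organized; all class and field names are illustrative assumptions.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class LightSource:
    """Illustrative light source model (sun, street lamp, ...)."""
    position: np.ndarray       # (3,) emitter position in world coordinates
    direction: np.ndarray      # (3,) dominant emission direction, unit vector
    intensity_rgb: np.ndarray  # (3,) intensities of the red/green/blue components


@dataclass
class ReflectorSurface:
    """Illustrative reflector model: a triangulated surface with optical properties."""
    vertices: np.ndarray          # (N, 3) vertex positions in world coordinates
    triangles: np.ndarray         # (M, 3) vertex indices, one row per triangle
    reflectivity_rgb: np.ndarray  # (3,) per-wavelength diffuse reflectivity in [0, 1]
    absorptivity_rgb: np.ndarray  # (3,) per-wavelength absorptivity in [0, 1]
```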
S120, determining the color of each position on the surface of the reflector model after the light emitted by the light source model, before any diffuse reflection, undergoes its first diffuse reflection on the surface of the reflector model.
According to the principle of light propagation, light undergoes multiple diffuse reflections after being emitted, but the light after the second diffuse reflection is very weak, and its influence on the color of the reflector is far smaller than that of the first diffuse reflection. Meanwhile, because the direction after the first diffuse reflection is not unique, continuing the simulation would consume far more computing resources than simulating the first reflection. For these reasons, the present embodiment only simulates the process from the emission of light by the light source to the first diffuse reflection and omits the subsequent multiple reflections. The process from the light source to the reception and imaging by the camera can therefore be divided into two stages: before and after the first diffuse reflection.
In this step, forward simulation is performed for the first stage: the light emitted by the light source model before diffuse reflection is simulated. Specifically, light sources encountered in an actual traffic environment, such as the sun and street lamps, are generally composite emitters whose light is a mixture of various colored components. The embodiment represents this composite light with the RGB color space model and simulates the light before diffuse reflection using visible light of 3 wavelengths (red, green and blue) with different intensities.
Next, the light emitted from any position on the reflector surface after the pre-diffuse-reflection light reaches that position and undergoes its first diffuse reflection is simulated according to the propagation principle of light. Specifically, for each wavelength, the intensity of the light that travels in a straight line from the light source, reaches the position on the reflector surface after attenuation, is partially absorbed and transmitted, and is then emitted by diffuse reflection, is calculated.
Finally, the color at the position is determined from the diffusely reflected light. Specifically, the light of the three wavelengths is combined according to the intensity of each wavelength emitted by diffuse reflection, giving the color of the composite light. By selecting the 3 primary-color wavelengths (red, green and blue) from among the many wavelengths of visible light, the embodiment can represent the color information completely while simplifying the calculation and improving the real-time performance of the simulation. It should be noted that when several light sources exist in the traffic environment, the intensity of each wavelength emitted after diffuse reflection on the reflector model surface is calculated separately for each light source, and the intensities of each wavelength corresponding to the several sources are then linearly superimposed to obtain the total intensity of each wavelength and hence the color at the corresponding position.
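As an illustration of this step, the following sketch computes the color at one surface position under the superposition rule just described. It assumes a Lambertian diffuse term and inverse-square attenuation, which the description implies (straight-line propagation with attenuation) but does not spell out; `LightSource` and `reflectivity_rgb` follow the illustrative model sketch above.

```python
import numpy as np


def surface_color(point, normal, sources, reflectivity_rgb):
    """Color at one reflector-surface position after the first diffuse reflection.

    Sketch only: assumes a Lambertian diffuse term and inverse-square
    attenuation; the contributions of several light sources are linearly
    superimposed per wavelength, as the description requires.
    """
    total = np.zeros(3)
    for src in sources:                                # one LightSource per emitter
        to_src = src.position - point
        dist = np.linalg.norm(to_src)
        to_src = to_src / dist
        cos_theta = max(float(np.dot(normal, to_src)), 0.0)  # no light from behind
        attenuation = 1.0 / (dist * dist)              # straight-line propagation loss
        total += src.intensity_rgb * reflectivity_rgb * cos_theta * attenuation
    return np.clip(total, 0.0, 1.0)                    # composite RGB color
```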
S130, simulating, according to the color, the undistorted first image data of the diffusely reflected light projected onto the camera light-sensing plate. The simulation process of the first image data comprises: constructing a blank first image, and performing the following operation on each pixel of the first image: determining, according to the intrinsic and extrinsic parameters of the camera, the exit position of the diffusely reflected light projected at the pixel, and taking the color of the exit position as the color of the pixel.
This embodiment performs the reverse simulation of the second stage. First, a blank first image is constructed at the focal point of the camera according to the camera FOV (Field Of View), and the pixel distribution in the first image is determined according to the camera resolution. As shown in fig. 2, F denotes the focal point and a blank first image is constructed at F, with x and y respectively denoting the pixel coordinates in the first image under the pixel coordinate system. The higher the camera resolution, the more pixels the first image contains.
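A minimal sketch of this construction step: the blank image is an array of unfilled RGB slots, and the pixel-unit focal length is derived from the horizontal FOV by the standard pinhole relation (an assumption, since the patent only says the image is built at F from the FOV and the resolution).

```python
import numpy as np


def build_blank_first_image(h_fov_deg, width, height):
    """Blank first image at the focal point F: one empty RGB slot per pixel."""
    f_px = (width / 2.0) / np.tan(np.radians(h_fov_deg) / 2.0)
    image = np.zeros((height, width, 3), dtype=np.float64)  # colors filled in later
    cx, cy = width / 2.0, height / 2.0                      # plate center (error-free case)
    return image, f_px, (cx, cy)
```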
After the blank image is constructed, starting from any pixel of the first image, the exit position of the diffusely reflected ray projected at that pixel is deduced backwards. Specifically, the intrinsic and extrinsic parameters of the camera form an intrinsic matrix and an extrinsic matrix; from the first coordinate of the pixel in the pixel coordinate system of the first image, the second coordinate in the world coordinate system that the intrinsic and extrinsic matrices transform into the first coordinate is calculated; and the exit position of the diffusely reflected ray projected at the pixel is determined from the second coordinate. More details are given in the following embodiments.
After the exit position is determined, the color of the light at the exit position is read and used directly as the color of the pixel in the first image. Performing the above operation on every pixel fills the blank first image, yielding a set of distortion-free simulated image data.
S140, simulating, according to the distortion parameters of the camera, the second image data of the first image after camera distortion.
Camera distortion is the degree of deformation of the image formed by the optical system relative to the object itself. It is an inherent characteristic of an optical lens, caused directly by the difference in magnification between the edge and the center of the lens. The distortion-free simulated image data already contains all the useful information of the camera, but in a simulation model with high accuracy requirements the influence of camera distortion cannot be ignored. In this step, the distortion produced when light passes through the lens and other elements is simulated according to the distortion parameters, so that the obtained second image data is fully consistent with the data output by a real camera.
According to the propagation characteristics of light, the process from the emission of light to its reception and imaging by the camera is divided into two stages: before the first diffuse reflection and after the first diffuse reflection. In the first stage, forward simulation is adopted, and the color of the light emitted after its first diffuse reflection on the reflector surface is calculated. In the second stage, reverse simulation is adopted: the exit position of the corresponding diffusely reflected ray is deduced backwards from each pixel of the light-sensing plate, and the color at that exit position is taken as the color of the pixel. In this way the actual physical process by which the camera senses light and forms an image is simulated, the simulation accuracy is guaranteed, and the calculation process is simplified. Meanwhile, the simulation specifically considers the camera distortion process, so sensor errors can be effectively reflected, which solves the problem of poor simulation accuracy caused by insufficient consideration of the action mechanism and errors of the camera sensor in existing methods. The method provided by the embodiment is suitable for various types of cameras, including ordinary cameras, wide-angle cameras and fisheye cameras, can be applied to various traffic environments and traffic conditions, and has extremely high feasibility and universality.
On the basis of the above and following embodiments, the present embodiment refines the simulation process of the first image. To realize the reverse simulation of the second stage using the intrinsic and extrinsic parameters of the camera, the role of these parameters in the forward camera-imaging process is first introduced. In image measurement and machine vision applications, in order to relate the three-dimensional geometric position of a point on an object surface in space to the corresponding point in the image, a geometric model of camera imaging must be established; the parameters of this geometric model are the camera parameters. In the forward direction of camera imaging, the coordinate-system conversion process of the camera imaging system is shown in fig. 3, and the coordinate conversion formula is as follows:
$$Z\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}\frac{f}{dX} & -\frac{f}{dX\tan\theta} & u_0\\ 0 & \frac{f}{dY\sin\theta} & v_0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}R & T\\ 0 & 1\end{bmatrix}\begin{bmatrix}X_w\\ Y_w\\ Z_w\\ 1\end{bmatrix} \qquad (1)$$

wherein $Z$ represents a scale factor, $(X_w, Y_w, Z_w)$ represents the second coordinate of the spatial point in the world coordinate system, and $(u, v)$ represents the coordinate, in the pixel coordinate system, of the position at which the diffusely reflected ray emitted from that point is projected in the first image after entering the camera imaging system. The first matrix on the right is defined as the intrinsic matrix of the camera and records the two processes of affine transformation and perspective transformation of the camera, wherein $f$ represents the image distance, $dX$ and $dY$ represent the physical length of one pixel on the camera light-sensing plate in the $x$ and $y$ directions respectively, $(u_0, v_0)$ represents the coordinates of the center of the light-sensing plate (the focal point F in the error-free case) in the pixel coordinate system of the first image, and $\theta$ represents the angle between the lateral and longitudinal edges of the plate, which is 90° in the error-free case. The second matrix, $\begin{bmatrix}R & T\\ 0 & 1\end{bmatrix}$, is defined as the extrinsic matrix of the camera, which depends on the relative positions of the camera coordinate system and the world coordinate system, where $R$ denotes the rotation matrix and $T$ the translation vector.
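For concreteness, the sketch below builds the intrinsic matrix of formula (1) and applies the forward projection, together with the inverse operation used later in the reverse simulation. The skew entries follow one common textbook convention consistent with the parameters listed above; the original equation image did not survive extraction, so the exact form should be read as an assumption.

```python
import numpy as np


def intrinsic_matrix(f, dX, dY, u0, v0, theta=np.pi / 2):
    """Intrinsic matrix of formula (1); theta is 90 degrees in the error-free case."""
    return np.array([
        [f / dX, -f / (dX * np.tan(theta)), u0],
        [0.0,     f / (dY * np.sin(theta)), v0],
        [0.0,     0.0,                      1.0],
    ])


def project(K, R, T, Xw):
    """Forward imaging of formula (1): world point (3,) -> pixel (u, v)."""
    Xc = R @ Xw + T           # extrinsic part: world -> camera coordinates
    uvw = K @ Xc              # intrinsic part, still scaled by the factor Z
    return uvw[:2] / uvw[2]   # divide out Z to obtain (u, v)


def back_project(K, R, T, uv, Z):
    """Inverse operation used by the reverse simulation: pixel -> world point.

    A scale factor Z (depth along the optical axis) must be assumed, because a
    pixel fixes only a viewing ray -- which is exactly why the description then
    casts a ray to find the true exit position.
    """
    uvw = Z * np.array([uv[0], uv[1], 1.0])
    Xc = np.linalg.inv(K) @ uvw
    return R.T @ (Xc - T)     # camera -> world coordinates
```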
Based on the above principle, optionally, after the blank first image is constructed and the pixel distribution is determined, the second coordinate in the world coordinate system that the intrinsic and extrinsic matrices transform into the first coordinate is first calculated from the first coordinate of each pixel of the first image in the pixel coordinate system. Specifically, this step knows the first coordinate $(u, v)$ of each pixel in the pixel coordinate system and solves, through the inverse operation of the intrinsic and extrinsic matrices, the second coordinate $(X_w, Y_w, Z_w)$ of the spatial position corresponding to the pixel in the world coordinate system. The intrinsic and extrinsic matrices can be obtained through camera calibration. The general procedure of a camera calibration experiment can be summarized as follows: make a calibration picture (as shown in fig. 4); extract the corner-point information in the picture; solve the intrinsic and extrinsic matrices from the corner-point information; and solve the distortion parameters, perform de-distortion and inverse error projection. The specific process is prior art and is not described in detail in this application.
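Since the description points to standard calibration and mentions the OpenCV library later on, the following hedged sketch shows a typical OpenCV calibration pass over checkerboard pictures like fig. 4. The checkerboard size, square pitch and folder path are illustrative assumptions; note that OpenCV's default model returns (k1, k2, p1, p2, k3) and omits the thin-prism terms s1, s2 unless the CALIB_THIN_PRISM_MODEL flag is set.

```python
import glob

import cv2
import numpy as np

# Checkerboard with 9x6 inner corners and 25 mm squares -- illustrative values.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 25.0

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):      # hypothetical folder of calibration shots
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# K is the intrinsic matrix; dist holds (k1, k2, p1, p2, k3), i.e. the radial and
# tangential terms of formulas (4)-(7); rvecs/tvecs give R and T for each view.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
```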
After the second coordinate corresponding to each pixel is obtained, a ray is drawn from the camera focal point to the spatial position represented by the second coordinate, the exit position of the diffusely reflected ray projected at the pixel is determined from the ray, and the color at the exit position is taken as the color of the pixel. Specifically, through the calculation of the above embodiment, the color data of the reflector model surface is known, while the color data of all other spatial positions is empty. However, because of changes in the traffic environment and the existence of calculation errors, the spatial position obtained by back-calculation through the intrinsic and extrinsic matrices is not necessarily located on the reflector model surface; the ray is therefore drawn to approximately simulate the diffusely reflected ray from that direction and to find its true exit position.
Further, depending on the positional relationship between the ray and the reflector model, the following embodiments apply.
In a first embodiment, if the ray has no intersection point with the reflector model surface, such as the ray l1 from the focal point F to the spatial position A1 in fig. 5, the exit position of the diffusely reflected ray projected at the pixel is determined as: infinity. That is, the diffusely reflected light comes from the sky. In this case the preset sky color is read as the color at infinity and taken as the color of the pixel corresponding to A1.
In a second embodiment, if the ray has at least one intersection point with the reflector model surface, the exit position of the diffusely reflected ray projected at the pixel is determined as: the intersection point nearest to the focal point. The color of that intersection point is then read as the color of the pixel. As shown in fig. 5, for the ray l2 from the focal point F to the spatial position A2, the corresponding exit position is B2, and the color of B2 is read as the color of the pixel corresponding to A2; for the ray l3 from the focal point F to the spatial position A3, the corresponding exit position is B3, and the color of B3 is read as the color of the pixel corresponding to A3.
After the color of each pixel in the first image is determined, the first image simulation process is completed.
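The two embodiments can be summarized in code: cast the ray, keep the nearest intersection as the exit position, and fall back to the preset sky color when there is none. The sketch below uses the Möller–Trumbore ray/triangle test over the triangulated reflector model from the earlier sketch; the `color_at` accessor (returning the S120 surface color at the hit point) and the sky color value are assumptions.

```python
import numpy as np

SKY_COLOR = np.array([0.53, 0.81, 0.92])   # preset sky color -- illustrative value


def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle test; returns the hit distance t or None."""
    e1, e2 = v1 - v0, v2 - v0
    h = np.cross(direction, e2)
    a = np.dot(e1, h)
    if abs(a) < eps:                       # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = origin - v0
    u = f * np.dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = f * np.dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * np.dot(e2, q)
    return t if t > eps else None


def pixel_color(focal_point, direction, surfaces):
    """Nearest intersection gives the exit position (second embodiment);
    no intersection at all means the light comes from the sky (first)."""
    best_t, best_color = np.inf, SKY_COLOR
    for surf in surfaces:
        for tri in surf.triangles:
            t = ray_triangle(focal_point, direction, *surf.vertices[tri])
            if t is not None and t < best_t:
                best_t = t
                hit = focal_point + t * direction
                best_color = surf.color_at(hit)  # assumed accessor for the S120 color
    return best_color
```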
In this embodiment, the backward deduction from each pixel of the first image to a spatial position in the traffic environment is completed using the intrinsic and extrinsic parameters of the camera, and the true exit position of the diffusely reflected light projected at the pixel is found from the positional relationship between that spatial position and the reflector surface, thereby realizing the simulation of distortion-free image data.
On the basis of the above and following embodiments, the present embodiment refines the simulation process of the second image. This stage simulates camera distortion, so the principle of camera distortion is first described. Camera distortion generally consists of three parts: radial distortion, tangential distortion and thin-prism distortion. The sum of the three forms the total distortion of the camera and can be represented by distortion parameters. After camera distortion, the color of any pixel of the first image is displayed at another position in the second image, different from that pixel; the mapping between the pixel and the other position is expressed as:
$(x_0, y_0) = m(x, y) \qquad (2)$

wherein $(x, y)$ represents the coordinates of any pixel of the first image in the pixel coordinate system, and $(x_0, y_0)$ represents the coordinates of the other position in the second image in the pixel coordinate system. The mapping $m$ is expressed by the distortion formula as follows:
$x_0 = x + \delta x_r + \delta x_t + \delta x_p,\qquad y_0 = y + \delta y_r + \delta y_t + \delta y_p \qquad (3)$

wherein $\delta x_r$, $\delta x_t$ and $\delta x_p$ respectively represent the radial, tangential and thin-prism distortion in the $x$ direction, and $\delta y_r$, $\delta y_t$ and $\delta y_p$ respectively represent the radial, tangential and thin-prism distortion in the $y$ direction. The distortion parameters are likewise obtained by camera calibration. Taking the third-order distortion formula as an example, the radial distortion formula can be expressed as:
$\delta x_r = x\,(k_1 r^2 + k_2 r^4 + k_3 r^6) \qquad (4)$

$\delta y_r = y\,(k_1 r^2 + k_2 r^4 + k_3 r^6) \qquad (5)$
the tangential distortion equation can be expressed as:
$\delta x_t = 2p_1 xy + p_2\,(r^2 + 2x^2) \qquad (6)$

$\delta y_t = p_1\,(r^2 + 2y^2) + 2p_2 xy \qquad (7)$
the thin prism distortion formula can be expressed as:
$\delta x_p = s_1 r^2 \qquad (8)$

$\delta y_p = s_2 r^2 \qquad (9)$
wherein $r$ represents the distance from the pixel of the first image to the image center, i.e. $r^2 = x^2 + y^2$, and $k_1, k_2, k_3$, $p_1, p_2$ and $s_1, s_2$ respectively denote the required radial, tangential and thin-prism distortion parameters. Optionally, each distortion parameter is solved by a regression-curve method during camera calibration, specifically: according to the corner-point information of the calibration picture and the intrinsic and extrinsic matrices of the camera, the coordinates of the corner-point pixels in the first image (the undistorted picture) are solved; the difference between these coordinates and the coordinates of the corner-point pixels in the second image (the actually distorted picture) gives the total distortion value of each corner-point pixel; and the distortion values are substituted into the above distortion formulas for curve fitting, yielding the value of each distortion parameter.
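Formulas (2)-(9) combine into a single forward mapping m, sketched below. The coordinates are taken relative to the image center so that r² = x² + y² holds as in the text; this centering convention is an assumption.

```python
def distort(x, y, k, p, s):
    """Forward mapping m of formulas (2)-(9): undistorted (x, y) -> distorted (x0, y0).

    k = (k1, k2, k3), p = (p1, p2) and s = (s1, s2) come from calibration; x and y
    are measured from the image center so that r^2 = x^2 + y^2 holds as in the text.
    """
    r2 = x * x + y * y
    radial = k[0] * r2 + k[1] * r2 ** 2 + k[2] * r2 ** 3
    dx_r, dy_r = x * radial, y * radial                       # formulas (4)-(5)
    dx_t = 2 * p[0] * x * y + p[1] * (r2 + 2 * x * x)         # formula (6)
    dy_t = p[0] * (r2 + 2 * y * y) + 2 * p[1] * x * y         # formula (7)
    dx_p, dy_p = s[0] * r2, s[1] * r2                         # formulas (8)-(9)
    return x + dx_r + dx_t + dx_p, y + dy_r + dy_t + dy_p     # formula (3)
```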
After the distortion parameters are determined, optionally, simulating the second image data of the first image after camera distortion according to the distortion parameters of the camera specifically includes the following steps:
step one, constructing a blank second image, wherein the pixel distribution of the second image is the same as that of the first image. The color of each pixel in the second image is unknown, and the color of each pixel is determined one by one.
Step two, perform the following operation on each pixel of the second image: determine, according to the distortion parameters of the camera, the original position in the first image of the pixel of the second image before camera distortion occurs; and obtain the color of the pixel from the colors of the four pixels adjacent to that original position in the first image. This step performs the reverse simulation of camera distortion, i.e. the inverse process of deducing the color of each pixel of the second image from the first image.
In a specific embodiment, first the mapping relationship between each pixel of the first image and the corresponding position in the second image, namely the mapping shown in formula (3), is determined, and the inverse of this mapping is solved. In formula (3), the coordinates $(x, y)$ of a pixel of the first image are integers, whereas the position $(x_0, y_0)$ in the second image obtained after the mapping $m$ is obviously fractional. This step therefore solves the inverse of $m$ to obtain the mapping from each pixel of the second image to its original position in the first image. Through this inverse mapping, the pixel coordinates actually output by the camera are guaranteed to be integers.
Then, the coordinates of each pixel of the second image in the pixel coordinate system are substituted into the inverse mapping to obtain the coordinates, in the pixel coordinate system, of the original position in the first image corresponding to that pixel. It should be noted that, since distortion is an inherent characteristic of the camera, the distortion coefficients of a given camera do not change over time. Once constructed, the inverse mapping can therefore be reused across multiple simulations.
Finally, the color of the pixel is obtained from the colors of the four pixels adjacent to the original position in the first image. Although the pixel coordinates of the original position obtained through the inverse mapping are fractional, a spatial interpolation method can be adopted to calculate the R, G and B values at the original position from its surroundings (the 4 nearest pixels in the upper-left, lower-left, upper-right and lower-right directions); the way these surrounding pixels are selected is shown in fig. 6.
Optionally, the spatial interpolation may be implemented by inverse distance weighting, the trend-surface method, the spline-function method, etc. Interpolation by inverse distance weighting specifically includes the following steps:
step one, calculating the distance from 4 adjacent pixels at any original position to the original positiond 1 ~d 4 As shown in fig. 6.
Step two, calculating the weight of each pixel according to the distance, wherein the formula is as follows:
$w_i = \dfrac{1/d_i}{\sum_{j=1}^{4} 1/d_j}, \quad i = 1, \dots, 4 \qquad (10)$
step three, according to the weight, R, G, B weighted value of the original position is calculated, and the formula is as follows:
Figure 545196DEST_PATH_IMAGE020
(11)
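Putting steps one to three together with the inverse mapping of the previous paragraphs, a sketch of the whole second-image pass might look as follows. `inverse_map` stands for the precomputed inverse of m (a hypothetical interface), and image bounds checks are omitted for brevity.

```python
import numpy as np


def idw_color(first_image, xf, yf):
    """Formulas (10)-(11): inverse-distance-weighted RGB at the fractional
    original position (xf, yf) from its 4 surrounding pixels in the first image."""
    x0, y0 = int(np.floor(xf)), int(np.floor(yf))
    neighbors = [(x0, y0), (x0 + 1, y0),        # upper-left, upper-right
                 (x0, y0 + 1), (x0 + 1, y0 + 1)]  # lower-left, lower-right
    dists = np.array([np.hypot(xf - x, yf - y) for x, y in neighbors])
    if np.any(dists < 1e-12):                   # landed exactly on a pixel
        x, y = neighbors[int(np.argmin(dists))]
        return first_image[y, x]
    w = (1.0 / dists) / np.sum(1.0 / dists)     # formula (10)
    colors = np.array([first_image[y, x] for x, y in neighbors])
    return w @ colors                           # formula (11)


def simulate_second_image(first_image, inverse_map):
    """Fill the blank second image pixel by pixel through the inverse mapping;
    since distortion is fixed for a given camera, inverse_map can be
    precomputed once and reused across simulations."""
    h, w, _ = first_image.shape
    second = np.zeros_like(first_image)
    for y in range(h):
        for x in range(w):
            xf, yf = inverse_map(x, y)          # fractional original position
            second[y, x] = idw_color(first_image, xf, yf)
    return second
```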
since the magnification of the edge portion and the central portion of the lens are different, and since the four corner regions in the second image are blank after distortion, the color of the outermost pixel is used for filling in the prior art, and the displayed color is not the real color in the environment. The embodiment starts from the pixel after the distortion, and reversely deduces the spatial position of the pixel in the second image in the world coordinate, so that no blank pixel exists in the second image after the distortion, and the pixel colors are all from the real color of the environment, but not the color of the outermost pixel. The linear interpolation of the pixels in the first image, rather than the second image, is due to: the first image before distortion is scaled relative to the environment image, and the result obtained by linear interpolation basically conforms to the real environment; in the distorted second image, the environmental image is compressed and distorted in an arc shape from the center to the periphery, and the accuracy can be maintained only by utilizing the interpolation of the arc-shaped compression rule, which is difficult to realize.
On the basis of the above and following embodiments, the present embodiment refines the image-data post-processing procedure. In general, the image data obtained after light sensing and analog-to-digital conversion in the camera is raw RGB information, and the camera needs to perform image signal processing on it to generate a file of a unified standard, so that the information sensed by the sensor can be used by the subsequent links of the intelligent automobile such as planning, decision and control. Optionally, after the second image data is obtained, the following image-signal-processing process is further included.
First, the second image data is visualized. The output of most camera sensors takes the form of a picture or a video stream, whereas the second image data is only pixel-color data containing the image information, so it needs to be visualized. Specifically, a visualization program that generates pictures from RGB information, such as the OpenCV function library, can process the data directly into a picture format (*.png, *.jpg, etc.) or a video format (*.mp4, *.avi, etc.) according to the requirements of the subsequent output.
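A minimal sketch of this visualization step using the OpenCV functions the description alludes to; the value range of the simulated data ([0, 1] RGB) and the file names are assumptions.

```python
import cv2
import numpy as np


def export_picture(second_image, path="frame.png"):
    """Write the raw pixel-color array as a standard picture file via OpenCV;
    the data is assumed to be RGB in [0, 1], while OpenCV expects 8-bit BGR."""
    bgr = cv2.cvtColor((second_image * 255).astype(np.uint8), cv2.COLOR_RGB2BGR)
    cv2.imwrite(path, bgr)


def export_video(frames, path="run.mp4", fps=30):
    """Write a sequence of simulated frames as a video stream."""
    h, w, _ = frames[0].shape
    writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for frame in frames:
        writer.write(cv2.cvtColor((frame * 255).astype(np.uint8), cv2.COLOR_RGB2BGR))
    writer.release()
```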
Then, personalized image processing is performed on the visualization result according to the camera configuration. For most intelligent-vehicle camera sensors this step is empty, but some cameras are configured with data post-processing such as an anti-shake function and intelligent AI noise reduction, and this step simulates exactly such personalized image-processing functions. The specific processing program does not need to be developed in-house: the visualization result can be input directly into the personalized image-processing program carried by the camera.
Finally, the processing result is output in an adapted data form through the standardized input/output interface. In this step, different interfaces are set according to the communication protocol of the IO interface, etc. Typically, the camera communicates with the other hardware of the vehicle through a network bus or a CAN bus. Correspondingly, after the personalized image-processing result is obtained, it needs to be converted into the adapted data form according to the IO interface protocol, so that when the camera model is co-simulated with the simulation models of the subsequent links, the communication protocol is unified and the communication content is correct.
The present embodiment simulates the post-processing of the second image data. It should be noted that the personalized image processing is performed selectively according to the camera configuration: if the camera is not configured with a personalized image-processing function, that step is skipped and the process enters the input/output interface adaptation directly.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, as shown in fig. 7, the electronic device includes a processor 60, a memory 61, an input device 62, and an output device 63; the number of processors 60 in the device may be one or more, and one processor 60 is taken as an example in fig. 7; the processor 60, the memory 61, the input device 62 and the output device 63 in the apparatus may be connected by a bus or other means, which is exemplified in fig. 7.
The memory 61 is a computer-readable storage medium for storing software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the simulation method of the car camera in the embodiment of the present invention. The processor 60 executes various functional applications of the device and data processing by running software programs, instructions and modules stored in the memory 61, that is, implements the above-described automobile camera simulation method.
The memory 61 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 61 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 61 may further include memory located remotely from the processor 60, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 62 may be used to receive entered numeric or character information and to generate key signal inputs relating to user settings and function controls of the apparatus. The output device 63 may include a display device such as a display screen.
The embodiment of the invention also provides a computer readable storage medium, wherein a computer program is stored on the computer readable storage medium, and when the computer program is executed by a processor, the computer readable storage medium realizes the automobile camera simulation method of any embodiment.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions deviate from the technical solutions of the embodiments of the present invention.

Claims (10)

1. An automobile camera simulation method is characterized by comprising the following steps:
acquiring a light source model and a reflector model in a traffic space where an automobile is located;
determining the color of each position on the surface of the reflector model after the light emitted by the light source model, before any diffuse reflection, undergoes its first diffuse reflection on the surface of the reflector model;
simulating, according to the color, undistorted first image data of the diffusely reflected light projected onto the camera light-sensing plate;
simulating, according to the distortion parameters of the camera, second image data of the first image after camera distortion;
wherein the simulation process of the first image data comprises: constructing a blank first image, and performing the following operations on each pixel of the first image: determining, according to the internal parameters and the external parameters of the camera, the exit position of the diffusely reflected light projected at the pixel, and taking the color of the exit position as the color of the pixel.
2. The method of claim 1, wherein determining the color of each position on the surface of the reflector model after the light emitted by the light source model, before any diffuse reflection, undergoes its first diffuse reflection on the surface of the reflector model comprises:
simulating light rays emitted by the light source model and before diffuse reflection;
simulating, according to the propagation principle of light, the light emitted from any position on the surface of the reflector after the light before diffuse reflection reaches that position and undergoes its first diffuse reflection;
and determining the color of any position according to the light rays after diffuse reflection.
3. The method of claim 1, wherein constructing the blank first image comprises:
constructing a blank first image at a focal point of the camera according to the field angle of the camera;
determining a distribution of pixels in the first image according to a resolution of the camera.
4. The method of claim 1, wherein determining the exit position of the diffusely reflected light projected at any pixel of the first image according to the internal and external parameters of the camera comprises:
forming an internal parameter matrix and an external parameter matrix by the internal parameters and the external parameters of the camera;
according to a first coordinate of any pixel in the first image in a pixel coordinate system, calculating a second coordinate in a world coordinate system, which is transformed into the first coordinate after passing through the internal reference matrix and the external reference matrix;
and determining the emergent position of the diffused and reflected light rays projected at the pixel according to the second coordinate.
5. The method of claim 4, wherein determining the exit position of the diffusely reflected light projected at the pixel from the second coordinate comprises:
drawing a ray from the focal point of the camera to the spatial location represented by the second coordinate;
if the intersection point does not exist between the ray and the reflector model surface, determining the emergent position of the diffusely reflected ray projected at the pixel as follows: infinity;
if at least one intersection point exists between the ray and the reflector model surface, determining the emergent position of the diffusely reflected ray projected at the pixel as follows: an intersection point nearest to the focal point.
6. The method according to claim 1, wherein the regarding the color of the exit position as the color of any pixel comprises:
and if the emergent position is infinity, taking a preset sky color as the color of any pixel.
7. The method of claim 1, wherein simulating camera-distorted second image data of the first image according to the distortion parameters of the camera comprises:
constructing a blank second image, wherein the pixel distribution of the second image is the same as the first image;
performing the following for each pixel of the second image: determining an original position of any pixel in the second image in the first image before the distortion of the camera occurs according to the distortion parameter of the camera; and obtaining the color of any pixel according to the colors of four pixels adjacent to the original position in the first image.
8. The method of claim 7, wherein determining an original position of any pixel in the second image in the first image before camera distortion occurs according to the distortion parameter of the camera comprises:
determining a mapping relation between each pixel in the first image and a corresponding position in a second image according to the distortion parameter of the camera, wherein the color of each pixel in the first image is displayed at the corresponding position in the second image after being distorted by the camera;
and determining the original position of any pixel in the second image in the first image before the camera distortion occurs according to the inverse mapping of the mapping relation.
9. An electronic device, comprising: memory, processor and computer program stored on the memory and executable by the processor, characterized in that the processor implements the method according to any of claims 1 to 8 when executing the computer program.
10. A computer-readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 8.
CN202210637513.8A 2022-06-08 2022-06-08 Automobile camera simulation method, electronic device and storage medium Active CN114708378B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210637513.8A CN114708378B (en) 2022-06-08 2022-06-08 Automobile camera simulation method, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210637513.8A CN114708378B (en) 2022-06-08 2022-06-08 Automobile camera simulation method, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN114708378A (en) 2022-07-05
CN114708378B (en) 2022-08-16

Family

ID=82177653

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210637513.8A Active CN114708378B (en) 2022-06-08 2022-06-08 Automobile camera simulation method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN114708378B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200509A (en) * 2014-08-19 2014-12-10 山东大学 Photon mapping accelerating method based on point cache
CN108288301A (en) * 2018-01-26 2018-07-17 南京乐飞航空技术有限公司 A kind of binocular night vision Imaging Simulation method and system based on OpenGL
CN111830810A (en) * 2020-06-12 2020-10-27 北京邮电大学 Method and device for generating computer hologram representing real illumination on voxel
CN113298924A (en) * 2020-08-28 2021-08-24 阿里巴巴集团控股有限公司 Scene rendering method, computing device and storage medium
CN114322842A (en) * 2021-12-09 2022-04-12 中国石油大学(华东) High-reflectivity part measuring method and system based on improved Phong model

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007066064A (en) * 2005-08-31 2007-03-15 Sega Corp Image generating device and image generating program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200509A (en) * 2014-08-19 2014-12-10 山东大学 Photon mapping accelerating method based on point cache
CN108288301A (en) * 2018-01-26 2018-07-17 南京乐飞航空技术有限公司 A kind of binocular night vision Imaging Simulation method and system based on OpenGL
CN111830810A (en) * 2020-06-12 2020-10-27 北京邮电大学 Method and device for generating computer hologram representing real illumination on voxel
CN113298924A (en) * 2020-08-28 2021-08-24 阿里巴巴集团控股有限公司 Scene rendering method, computing device and storage medium
CN114322842A (en) * 2021-12-09 2022-04-12 中国石油大学(华东) High-reflectivity part measuring method and system based on improved Phong model

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A Framework for Realistic Image Synthesis; Donald P. Greenberg et al.; SIGGRAPH '97: Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques; August 1997; pp. 1-18 *
Analysis of Real-Time Rendering Technology Based on Ray Tracing (基于光线追踪的实时渲染技术分析); Zhao Liang (赵亮); Digital Technology & Application (数字技术与应用); April 2019; vol. 37, no. 4, pp. 42-43 *

Also Published As

Publication number Publication date
CN114708378A (en) 2022-07-05

Similar Documents

Publication Publication Date Title
US11257272B2 (en) Generating synthetic image data for machine learning
CN106023070B (en) Real time panoramic joining method and device
CN110336987A (en) A kind of projector distortion correction method, device and projector
US7321839B2 (en) Method and apparatus for calibration of camera system, and method of manufacturing camera system
CN106981050A Method and apparatus for rectifying images shot by a fisheye lens
US10990836B2 (en) Method and apparatus for recognizing object, device, vehicle and medium
CN109003297B (en) Monocular depth estimation method, device, terminal and storage medium
CN113256778B (en) Method, device, medium and server for generating vehicle appearance part identification sample
JP2004187298A (en) Plotting and encoding processing of panoramic image and omnidirection image
CN110335330B (en) Image simulation generation method and system, deep learning algorithm training method and electronic equipment
CN111968216A (en) Volume cloud shadow rendering method and device, electronic equipment and storage medium
EP2600314A1 (en) Simulation of three-dimensional (3d) cameras
US20240037856A1 (en) Walkthrough view generation method, apparatus and device, and storage medium
Song et al. Deep sea robotic imaging simulator
CN112053440A (en) Method for determining individualized model and communication device
CN115908716A (en) Virtual scene light rendering method and device, storage medium and electronic equipment
CA3120722C (en) Method and apparatus for planning sample points for surveying and mapping, control terminal and storage medium
JP7432793B1 (en) Mapping methods, devices, chips and module devices based on three-dimensional point clouds
CN114708378B (en) Automobile camera simulation method, electronic device and storage medium
US20230401837A1 (en) Method for training neural network model and method for generating image
CN116258756B (en) Self-supervision monocular depth estimation method and system
CN115861145A (en) Image processing method based on machine vision
CN116524101A (en) Global illumination rendering method and device based on auxiliary buffer information and direct illumination
JP5413502B2 (en) Halation simulation method, apparatus, and program
CN113052884A (en) Information processing method, information processing apparatus, storage medium, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230324

Address after: 300392 1st Floor, Building B, New City Center, No. 3, Wanhui Road, Zhongbei Town, Xiqing District, Tianjin

Patentee after: Zhongqi Zhilian Technology Co.,Ltd.

Address before: Room 12-17, block B1, new city center, No.3 Wanhui Road, Zhongbei Town, Xiqing District, Tianjin, 300385

Patentee before: Sinotruk data (Tianjin) Co.,Ltd.
