CN111767862A - Vehicle labeling method and device, computer equipment and readable storage medium - Google Patents

Vehicle labeling method and device, computer equipment and readable storage medium

Info

Publication number
CN111767862A
CN111767862A
Authority
CN
China
Prior art keywords
vehicle
display area
tail lamp
area
processed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010615093.4A
Other languages
Chinese (zh)
Inventor
彭进华
孙鹏
黄佳健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Weride Technology Co Ltd
Original Assignee
Guangzhou Weride Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Weride Technology Co Ltd filed Critical Guangzhou Weride Technology Co Ltd
Priority to CN202010615093.4A priority Critical patent/CN111767862A/en
Publication of CN111767862A publication Critical patent/CN111767862A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08 Detecting or categorising vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a vehicle labeling method, a vehicle labeling device, computer equipment and a computer-readable storage medium. The method comprises the following steps: determining a vehicle display area of at least one vehicle and a tail lamp display area of each vehicle tail lamp, and labeling each vehicle display area and each tail lamp display area with a two-dimensional labeling frame; judging whether the vehicle display area displays at least two outer surfaces of the vehicle; if so, determining a separation line between the adjacent outer surfaces according to the tail lamp display area; and generating a three-dimensional labeling frame in the vehicle display area according to the separation line and the two-dimensional labeling frame. The vehicle labeling method provided by the invention thus realizes three-dimensional labeling of the vehicle.

Description

Vehicle labeling method and device, computer equipment and readable storage medium
Technical Field
The present invention relates to the field of pattern recognition, and in particular, to a vehicle labeling method and apparatus, a computer device, and a computer-readable storage medium.
Background
In the prior art, the environment and road driving state of a vehicle often need to be analyzed. A camera mounted on the vehicle is usually used to collect images around the vehicle in real time, and image recognition is then performed to obtain information such as the environment and road driving state of the vehicle. Part of this image recognition requires labeling moving or stationary vehicles, so as to facilitate observation by users or further fine-grained image recognition. However, existing vehicle labels are two-dimensional frame-selection labels and cannot convey three-dimensional information identifying the vehicle.
Disclosure of Invention
The invention mainly aims to provide a vehicle labeling method, a vehicle labeling device, computer equipment and a computer readable storage medium, and aims to solve the technical problem that three-dimensional information cannot be labeled in the prior art.
To achieve the above object, the present invention provides a vehicle labeling method, comprising the steps of:
determining a vehicle display area of at least one vehicle and a tail lamp display area of a vehicle tail lamp, and labeling a two-dimensional labeling frame for each vehicle display area and each tail lamp display area;
judging whether the vehicle display area displays at least two outer surfaces of the vehicle;
if yes, determining a separation line between the adjacent outer surfaces according to the tail lamp display area;
and generating a three-dimensional labeling frame in the vehicle display area according to the separation line and the two-dimensional labeling frame.
Preferably, the step of determining a separation line between adjacent outer surfaces based on the tail light display area includes:
judging whether the number of corresponding tail lamp display areas in the vehicle display area is greater than 1;
if the number of the corresponding tail lamp display areas in the vehicle display area is one, determining a separation line according to the positions of the tail lamp display areas;
and if the number of the corresponding tail lamp display areas in the vehicle display area is more than 1, calculating the distance between each tail lamp display area and the midpoint of the vehicle display area, and determining a separation line according to the position of the tail lamp display area corresponding to the minimum distance.
Preferably, the step of determining the separation line according to the position of the tail light display area comprises:
acquiring the ratio of the longitudinal edge to the transverse edge of the two-dimensional labeling frame corresponding to the tail lamp display area, and judging whether the ratio is greater than 1;
if the ratio is less than or equal to 1, determining a separation line according to the midpoint of the tail lamp display area;
and if the ratio is larger than 1, determining a separation line along any longitudinal edge of the tail lamp display area.
Preferably, the step of determining a vehicle display area of at least one vehicle and a tail lamp display area of the vehicle tail lamp, and labeling each vehicle display area and each tail lamp display area with a two-dimensional labeling frame, includes:
performing real-time framing, identifying the framing picture, and determining a to-be-processed area of a vehicle in the framing picture and a tail lamp display area in the to-be-processed area;
judging whether the to-be-processed area displays a plurality of mutually occluded vehicles;
if the to-be-processed area does not display a plurality of mutually occluded vehicles, setting the to-be-processed area as the vehicle display area corresponding to a single vehicle, and labeling the vehicle display area and the tail lamp display area with two-dimensional labeling frames;
and if the to-be-processed area displays a plurality of mutually occluded vehicles, determining the vehicle display area corresponding to each vehicle in the to-be-processed area, and labeling each vehicle display area and the tail lamp display area with two-dimensional labeling frames.
Preferably, the step of judging whether the to-be-processed area displays a plurality of mutually occluded vehicles includes:
judging whether the outer contour of the to-be-processed area matches a preset vehicle contour;
if the outer contour of the to-be-processed area matches the preset vehicle contour, determining that the to-be-processed area does not display a plurality of mutually occluded vehicles;
and if the outer contour of the to-be-processed area does not match the preset vehicle contour, determining that the to-be-processed area displays a plurality of mutually occluded vehicles.
Preferably, the step of judging whether the to-be-processed area displays a plurality of mutually occluded vehicles includes:
determining the number of tail lamp display areas in the to-be-processed area;
when the number of tail lamp display areas in the to-be-processed area is 1, determining that the to-be-processed area does not display a plurality of mutually occluded vehicles;
when the number of tail lamp display areas in the to-be-processed area is 2, judging whether the two tail lamp display areas are symmetrically displayed tail lamp display areas;
if the two tail lamp display areas are symmetrically displayed, determining that the to-be-processed area does not display a plurality of mutually occluded vehicles;
if the two tail lamp display areas are asymmetrically displayed, determining that the to-be-processed area displays a plurality of mutually occluded vehicles;
and when the number of tail lamp display areas in the to-be-processed area is greater than 2, determining that the to-be-processed area displays a plurality of mutually occluded vehicles.
Preferably, the step of determining the vehicle display area corresponding to each vehicle in the to-be-processed area and labeling each vehicle display area with a two-dimensional labeling frame includes:
determining the vehicle display area corresponding to each vehicle in the to-be-processed area according to the color information of each color block in the vehicle display area, and labeling each vehicle display area with a two-dimensional labeling frame.
Preferably, the step of determining the vehicle display area corresponding to each vehicle in the to-be-processed area and labeling each vehicle display area with a two-dimensional labeling frame includes:
acquiring the symmetrically displayed tail lamp display areas among the tail lamp display areas, and calculating the vehicle display area of the vehicle corresponding to those tail lamp display areas according to a preset mapping table, the distance between the symmetrically displayed tail lamp display areas, and the height of the to-be-processed area;
and determining the vehicle display area of each occluded vehicle according to the vehicle display area of the vehicle corresponding to the symmetrically displayed tail lamp display areas and the to-be-processed area, and labeling each vehicle display area with a two-dimensional labeling frame.
In order to achieve the above object, the present invention further provides a labeling apparatus, comprising:
the identification module is used for determining a vehicle display area of at least one vehicle and a tail lamp display area of a vehicle tail lamp and labeling a two-dimensional labeling frame for each vehicle display area and each tail lamp display area;
the judging module is used for judging whether the vehicle display area displays at least two outer surfaces of the vehicle or not;
the marking module is used for determining a separation line between the adjacent outer surfaces according to the tail lamp display area if the vehicle display area displays at least two outer surfaces of the vehicle; and generating a three-dimensional labeling frame in the vehicle display area according to the separation line and the two-dimensional labeling frame.
To achieve the above object, the present invention further provides a computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the vehicle labeling method as described above.
To achieve the above object, the present invention also provides a computer readable storage medium having stored thereon a computer program which, when being executed by a processor, realizes the steps of the vehicle labeling method as described above.
According to the vehicle labeling method, the vehicle labeling device, the computer equipment and the computer-readable storage medium of the invention, whether a three-dimensional labeling frame can be established for a vehicle display area is determined by judging whether the vehicle display area displays at least two outer surfaces of a vehicle; a separation line between the adjacent outer surfaces is determined according to the position of the tail lamp display area, so that a three-dimensional labeling frame can be generated in the vehicle display area according to the separation line and the two-dimensional labeling frame. Workers can then perform further statistics, recognition and analysis on the three-dimensional labeling frame and its corresponding display area, and users can conveniently observe real-time changes in the environment where the vehicle is located.
Drawings
FIG. 1 is a schematic flow chart diagram of a first embodiment of a vehicle labeling method of the present invention;
FIG. 2 is a detailed flowchart of step S300 of the vehicle labeling method according to the second embodiment of the present invention;
FIG. 3 is a detailed flowchart of step S320 of the vehicle labeling method according to the third embodiment of the present invention;
FIG. 4 is a detailed flowchart of step S100 of the vehicle labeling method according to the fourth embodiment of the present invention;
FIG. 5 is a block diagram of a computer device according to the present invention;
FIG. 6 is a schematic block diagram of a labeling apparatus according to the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention provides a vehicle labeling method, which is applied to a computer device. Referring to fig. 1, fig. 1 is a schematic flow chart of a first embodiment of the vehicle labeling method of the present invention, the method comprising the steps of:
step S100, determining a vehicle display area of at least one vehicle and a tail lamp display area of a vehicle tail lamp, and labeling a two-dimensional labeling frame for each vehicle display area and each tail lamp display area;
the method comprises the steps that real-time framing of preset interval time can be carried out on an environment where a vehicle is located through a camera arranged on the vehicle, real-time video recording can also be carried out on the environment where the vehicle is located through the camera arranged on the vehicle, each picture in the video recording is used as real-time framing, the vehicle and a tail lamp in a framing picture are identified through a trained image identification model, and therefore a vehicle display area of at least one vehicle in the framing picture and a tail lamp display area of a vehicle tail lamp are determined. The recognition model can also be trained by adopting a plurality of pictures containing different vehicle patterns to obtain the trained image recognition model, and a person skilled in the art can select different recognition models to train according to actual needs.
Step S200, judging whether the vehicle display area displays at least two outer surfaces of the vehicle;
if yes, executing: step S300;
If not, the vehicle display area displays only one surface of the vehicle; the two-dimensional labeling frame and the three-dimensional labeling frame would coincide, so no further processing is performed.
Step S300, determining a separation line between adjacent outer surfaces according to the tail lamp display area;
and the division parting lines between different outer surfaces can be determined according to the color, the brightness, the vehicle type, the vehicle lamp position, the wheel position and the like of the vehicle display area only when the vehicle display area displays at least two outer surfaces of the vehicle.
And S400, generating a three-dimensional labeling frame in the vehicle display area according to the separation line and the two-dimensional labeling frame.
The intersection points of the two ends of the separation line with the two-dimensional labeling frame are determined respectively; these intersection points serve as vertices of the three-dimensional labeling frame, and the separation line and the edges of the two-dimensional labeling frame serve as edges of the three-dimensional labeling frame.
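The construction described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes axis-aligned pixel coordinates, a vertical separation line, and that the 2D frame splits into a rear face and an adjacent side face; the function name and face ordering are hypothetical.

```python
def make_3d_box(box, split_x):
    """Split a 2D labeling frame (x1, y1, x2, y2) at the vertical
    separation line x = split_x into the two visible faces of a
    pseudo-3D labeling frame."""
    x1, y1, x2, y2 = box
    if not x1 < split_x < x2:
        raise ValueError("separation line must fall inside the 2D frame")
    # Intersections of the separation line with the top and bottom edges
    top, bottom = (split_x, y1), (split_x, y2)
    rear_face = [(x1, y1), top, bottom, (x1, y2)]   # e.g. the vehicle rear
    side_face = [top, (x2, y1), (x2, y2), bottom]   # the adjacent side surface
    return rear_face, side_face
```

The two returned quadrilaterals share the separation line as a common edge, matching the text's description of the intersection points acting as shared vertices.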
In the invention, whether a three-dimensional labeling frame can be established for a vehicle display area is determined by judging whether the vehicle display area displays at least two outer surfaces of a vehicle; a separation line between the adjacent outer surfaces is determined according to the position of the tail lamp display area, so that a three-dimensional labeling frame can be generated in the vehicle display area according to the separation line and the two-dimensional labeling frame. Workers can then perform further statistics, recognition and analysis on the three-dimensional labeling frame and its corresponding display area, and users can conveniently observe real-time changes in the environment where the vehicle is located.
Referring to fig. 2, fig. 2 is a detailed flowchart of step S300 of the vehicle labeling method according to the second embodiment of the present invention, and step S300 includes:
step S310, judging whether the number of corresponding tail lamp display areas in the vehicle display area is more than 1;
if the number of the corresponding tail lamp display areas in the vehicle display area is 1, executing:
step S320, determining a separation line according to the position of the tail lamp display area;
When only one tail lamp appears in the vehicle display area, the camera has captured only part of the vehicle corresponding to that display area. Since tail lamps are arranged on both sides of a vehicle, the separation line between the rear of the vehicle and its left or right side can be determined directly from the position of the tail lamp display area.
If the number of the corresponding tail lamp display areas in the vehicle display area is greater than 1, executing:
and step S330, calculating the distance between each tail lamp display area and the midpoint of the vehicle display area, and determining a separation line according to the position of the tail lamp display area corresponding to the minimum distance.
Because the tail lamps of a vehicle are arranged opposite each other, a vehicle display area in the framing picture captured by the camera may show one complete tail lamp and only part of the other, so that the partial tail lamp lies at the edge of the vehicle display area. A tail lamp at the edge coincides with the outer edge of the three-dimensional labeling frame and therefore does not affect the placement of the separation line.
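Step S330's nearest-to-midpoint selection can be sketched as below. This is an illustrative helper under assumed box conventions (`(x1, y1, x2, y2)` pixel tuples); the function names are hypothetical.

```python
def pick_split_taillight(vehicle_box, taillight_boxes):
    """Step S330 sketch: among several visible tail lamp display areas,
    choose the one whose center is closest to the midpoint of the
    vehicle display area; its position then defines the separation line."""
    vx1, vy1, vx2, vy2 = vehicle_box
    mid = ((vx1 + vx2) / 2, (vy1 + vy2) / 2)

    def center(b):
        x1, y1, x2, y2 = b
        return ((x1 + x2) / 2, (y1 + y2) / 2)

    def dist2(p, q):
        # Squared Euclidean distance; ordering is unaffected by the sqrt
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

    return min(taillight_boxes, key=lambda b: dist2(center(b), mid))
```

An edge lamp is farther from the midpoint than the fully visible lamp, so the partial lamp at the frame edge is naturally ignored, consistent with the paragraph above.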
Referring to fig. 3, fig. 3 is a detailed flowchart of step S320 of the vehicle labeling method according to the third embodiment of the present invention, and step S320 includes:
step S321, obtaining the ratio of the longitudinal edge and the transverse edge of the two-dimensional marking frame corresponding to the tail lamp display area, and judging whether the ratio is greater than 1;
the longitudinal side is a side in the vertical direction, namely, the direction corresponding to the gravity of the vehicle, and the transverse side is a side perpendicular to the longitudinal side.
If the ratio is less than or equal to 1, executing step S322, determining a separation line according to the midpoint of the tail light display area;
if the ratio is greater than 1, step S323 is executed to determine a separation line along any longitudinal side of the tail light display area.
When the ratio is less than or equal to 1, the tail lamp corresponding to the tail lamp display area is arranged transversely; such a lamp may extend a long way in the transverse direction, which would bias the placement of the separation line, so the separation line is determined through the midpoint of the tail lamp display area to reduce the deviation caused by an overlong transverse lamp. When the ratio is greater than 1, the tail lamp is arranged longitudinally, i.e. it extends along the direction of gravity, and the separation line can be determined from either the left or the right longitudinal edge of the vertically arranged lamp.
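The ratio rule of steps S321 to S323 reduces to a few lines. A minimal sketch, assuming the lamp's 2D frame is an axis-aligned `(x1, y1, x2, y2)` tuple and that "either longitudinal edge" is resolved to the left one for determinism; the function name is hypothetical.

```python
def separation_x(taillight_box):
    """Steps S321-S323 sketch: derive the separation line's x-coordinate
    from a tail lamp's 2D labeling frame. A wide (transverse) lamp uses
    the frame midpoint; a tall (longitudinal) lamp uses a longitudinal edge."""
    x1, y1, x2, y2 = taillight_box
    ratio = (y2 - y1) / (x2 - x1)  # longitudinal edge / transverse edge
    if ratio <= 1:                 # transversely arranged lamp
        return (x1 + x2) / 2       # midpoint limits the bias of a long lamp
    return x1                      # longitudinally arranged lamp: left edge
```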
Referring to fig. 4, fig. 4 is a detailed flowchart of step S100 of the vehicle labeling method according to the fourth embodiment of the present invention, and step S100 includes:
step S110, performing real-time framing, identifying a framing picture, and determining a to-be-processed area of a vehicle in the framing picture and a tail lamp display area in the to-be-processed area;
the area to be processed may be a vehicle display area corresponding to one vehicle, or may be a plurality of vehicle display areas corresponding to a plurality of vehicles that are mutually blocked.
Step S120, judging whether the area to be processed displays a plurality of mutually shielded vehicles;
because different vehicles have different colors, the average color value of each block can be compared with a preset threshold value through an average block method to determine whether a plurality of mutually shielded vehicles are displayed; whether a plurality of mutually shielded vehicles are displayed or not can be judged by identifying the number of the vehicle tires; the outline of the area to be processed can be compared with a preset outline, and whether a plurality of vehicles which are mutually shielded are displayed or not is judged.
If the to-be-processed area does not display a plurality of mutually occluded vehicles, executing:
Step S130, setting the to-be-processed area as the vehicle display area corresponding to a single vehicle, and labeling the vehicle display area and the tail lamp display area with two-dimensional labeling frames;
If the to-be-processed area displays a plurality of mutually occluded vehicles, executing:
Step S140, determining the vehicle display area corresponding to each vehicle in the to-be-processed area, and labeling each vehicle display area and the tail lamp display area with two-dimensional labeling frames.
When the to-be-processed area contains the display areas of a plurality of vehicles, the vehicle display area corresponding to each vehicle must first be determined; each vehicle display area and the tail lamp display area are then labeled with two-dimensional labeling frames, so that a three-dimensional labeling frame can further be generated from each two-dimensional labeling frame.
In another embodiment, the step S140 includes:
step S141, determining the vehicle display area corresponding to each vehicle in the area to be processed according to the color information of each color block in the vehicle display area, and labeling a two-dimensional labeling frame on each vehicle display area.
Specifically, the area is divided into a plurality of color blocks of equal size, and the vehicle display areas corresponding to different vehicles are determined according to the color tolerance between the blocks. The vehicle display area corresponding to a single vehicle then needs no further identification, which improves the vehicle labeling speed.
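One way to read the color-tolerance split of step S141 is as a 1D scan over per-column block colors. The sketch below assumes grey-value column means and an arbitrary tolerance of 40; it is an illustration of the grouping idea, not the patent's algorithm.

```python
def split_by_color(column_means, tol=40):
    """Step S141 sketch: walk the per-column mean colors of the
    to-be-processed area and start a new vehicle span wherever the
    color jumps by more than the tolerance `tol`."""
    spans, start = [], 0
    for i in range(1, len(column_means)):
        if abs(column_means[i] - column_means[i - 1]) > tol:
            spans.append((start, i))  # close the span for the previous vehicle
            start = i
    spans.append((start, len(column_means)))
    return spans  # half-open (start, end) column ranges, one per vehicle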
In another embodiment, the step S140 includes:
step S142, obtaining symmetrically displayed tail lamp display areas in each tail lamp display area, and calculating to obtain the vehicle display area of the vehicle corresponding to the tail lamp display area according to a preset mapping table, the distance between the symmetrically displayed tail lamp display areas and the height of the area to be processed;
the preset mapping table stores the vehicle lengths and the vehicle widths corresponding to the vehicles with different heights. When symmetrically displayed tail lamp display areas exist in each tail lamp display area, the width of the vehicle displayed on the vehicle display area can be determined according to the distance between the symmetrically displayed tail lamp display areas, and the height of the vehicle displayed on the vehicle display area can be determined according to the height of the area to be processed, so that the length corresponding to the ratio of the displayed width to the displayed height can be obtained according to a preset mapping table; and then the corresponding display length of the vehicle on the vehicle display area can be obtained through conversion according to the preset coefficient. The preset coefficient is obtained by the user through calculation according to the lens distortion coefficient of the camera, the position of the camera on the vehicle and the perspective coefficient of the camera in advance.
Step S143, determining the vehicle display area of each occluded vehicle according to the vehicle display area of the vehicle corresponding to the symmetrically displayed tail lamp display areas and the to-be-processed area, and labeling each vehicle display area with a two-dimensional labeling frame.
Once the vehicle display area corresponding to the symmetrically displayed tail lamp display areas is determined, the remaining part of the to-be-processed area is the vehicle display area belonging to the remaining tail lamp display areas.
By judging whether the to-be-processed area displays a plurality of mutually occluded vehicles, the cases of a single vehicle and of several mutually occluding vehicles are handled separately.
In one embodiment, the step S120 includes:
Step S121, judging whether the outer contour of the to-be-processed area matches a preset vehicle contour;
if the outer contour of the to-be-processed area matches the preset vehicle contour, it is determined that the to-be-processed area does not display a plurality of mutually occluded vehicles;
and if the outer contour of the to-be-processed area does not match the preset vehicle contour, it is determined that the to-be-processed area displays a plurality of mutually occluded vehicles.
Those skilled in the art can preset the outer contours of a number of individual vehicles at different angles as the preset vehicle contours. Matching the outer contour of the to-be-processed area against the preset vehicle contours quickly determines whether the area displays a plurality of mutually occluded vehicles.
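A toy version of the contour match in step S121 follows. It assumes contours are equal-length point lists normalised to a common scale and uses a mean point-to-point distance with an arbitrary threshold; real systems would use rotation/scale-invariant measures (e.g. shape moments), which this sketch deliberately omits.

```python
def matches_preset(contour, presets, tol=0.1):
    """Step S121 sketch: compare the region's outer contour with preset
    single-vehicle contours; any match means no mutual occlusion.
    `contour` and each preset are equal-length lists of (x, y) points."""
    def mean_dist(a, b):
        return sum(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
                   for (ax, ay), (bx, by) in zip(a, b)) / len(a)
    return any(mean_dist(contour, p) <= tol for p in presets)
```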
In another embodiment, the step S120 includes:
step S122, determining the number of the tail lamp display areas in the area to be processed;
When the number of tail lamp display areas in the to-be-processed area is 1, it is determined that the to-be-processed area does not display a plurality of mutually occluded vehicles;
when the number of tail lamp display areas in the to-be-processed area is 2, step S123 is executed to judge whether the two tail lamp display areas are symmetrically displayed tail lamp display areas;
Specifically, the two tail lamp display areas can be scaled to the same size and then compared as mirror images.
If the two tail lamp display areas are symmetrically displayed, it is determined that the to-be-processed area does not display a plurality of mutually occluded vehicles;
if the two tail lamp display areas are asymmetrically displayed, it is determined that the to-be-processed area displays a plurality of mutually occluded vehicles;
and when the number of tail lamp display areas in the to-be-processed area is greater than 2, it is determined that the to-be-processed area displays a plurality of mutually occluded vehicles.
When the number of tail lamp display areas is 1, the to-be-processed area contains only one tail lamp; even if vehicles occlude each other, the exposed part of the occluded vehicle is too small for three-dimensional labeling to be meaningful. When the number of tail lamp display areas is 2, the to-be-processed area may contain either the two tail lamps of one vehicle or one tail lamp from each of two vehicles, and these two cases are distinguished by judging whether the tail lamps are displayed symmetrically. This embodiment thus distinguishes whether the to-be-processed area displays a plurality of mutually occluded vehicles by the number of tail lamp display areas.
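The counting rule of steps S122 and S123 reduces to a small decision function. The symmetry predicate below is a deliberately crude stand-in (it only compares frame sizes, a rough proxy for the mirror comparison the text describes); both function names are hypothetical.

```python
def size_symmetric(a, b, tol=2):
    """Crude mirror-pair proxy: two lamp frames of near-equal size.
    A faithful version would scale both areas equally and mirror-compare pixels."""
    (ax1, ay1, ax2, ay2), (bx1, by1, bx2, by2) = a, b
    return (abs((ax2 - ax1) - (bx2 - bx1)) <= tol
            and abs((ay2 - ay1) - (by2 - by1)) <= tol)

def is_occluded(taillight_boxes, symmetric=size_symmetric):
    """Steps S122-S123 sketch: 1 lamp -> treat as a single vehicle;
    2 lamps -> occluded only if they are not a mirror pair;
    more than 2 lamps -> mutually occluded vehicles."""
    n = len(taillight_boxes)
    if n <= 1:
        return False
    if n == 2:
        return not symmetric(*taillight_boxes)
    return True
```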
Referring to fig. 6, the present invention also provides a labeling apparatus, including:
the vehicle-mounted display system comprises an identification module 1, a display module and a display module, wherein the identification module is used for determining a vehicle display area of at least one vehicle and a tail lamp display area of a vehicle tail lamp and marking a two-dimensional marking frame on each vehicle display area and each tail lamp display area;
the judging module 2 is used for judging whether the vehicle display area displays at least two outer surfaces of the vehicle or not;
the marking module 3 is used for determining a separation line between the adjacent outer surfaces according to the tail lamp display area if the vehicle display area displays at least two outer surfaces of the vehicle; and generating a three-dimensional labeling frame in the vehicle display area according to the separation line and the two-dimensional labeling frame.
Since the labeling apparatus adopts all the technical solutions of all the above embodiments, it achieves at least all the beneficial effects brought by those technical solutions, which are not repeated here. Each module in the labeling apparatus comprises one or more functional units implementing the steps of the above method embodiments.
Referring to fig. 5, in terms of hardware configuration, the computer device comprises a communication module 10, a memory 20, a processor 30, and the like. The processor 30 is connected to the memory 20 and the communication module 10, respectively; the memory 20 stores a computer program that, when executed by the processor 30, implements the steps of the above method embodiments.
The communication module 10 may be connected to an external communication device through a network. The communication module 10 may receive requests from the external communication device, and may also send requests, instructions, and information to it; the external communication device may be another intelligent terminal such as a mobile phone, an infrared remote controller, a Bluetooth remote controller, or the like.
The memory 20 may be used to store software programs as well as various data. The memory 20 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as an ambient temperature detection program or a compressor control program), and the like; the data storage area may include a database and may store data or information created according to the use of the system. Further, the memory 20 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 30 is the control center of the computer device: it connects the various parts of the entire device through various interfaces and lines, and performs the device's functions and processes its data by running or executing the software programs and/or modules stored in the memory 20 and calling the data stored in the memory 20, thereby monitoring the computer device as a whole. The processor 30 may include one or more processing units; optionally, the processor 30 may integrate an application processor, which mainly handles the operating system, user interface, applications, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor need not be integrated into the processor 30.
Although not shown in fig. 5, the computer device may further include a circuit control module for connecting to a power supply to ensure the normal operation of the other components. Those skilled in the art will appreciate that the computer device configuration illustrated in fig. 5 does not constitute a limitation of the computer device, which may include more or fewer components than those illustrated, combine some components, or arrange the components differently.
The invention also proposes a computer-readable storage medium on which a computer program is stored. The computer-readable storage medium may be the memory 20 of the computer device in fig. 5, or may be at least one of a ROM (Read-Only Memory)/RAM (Random Access Memory), a magnetic disk, and an optical disk; the computer-readable storage medium includes instructions for causing a terminal device having a processor (which may be a television, an automobile, a mobile phone, a computer, a server, a terminal, or a network device) to execute the method according to the embodiments of the present invention.
In the present invention, the terms "first", "second", "third", "fourth", and "fifth" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance; those skilled in the art can understand the specific meanings of these terms in the present invention according to the specific situation.
In the description herein, reference to the terms "one embodiment," "some embodiments," "an example," "a specific example," "some examples," and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic uses of these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples, and those skilled in the art can combine the various embodiments or examples, and the features of different embodiments or examples, described in this specification, provided they do not contradict each other.
Although embodiments of the present invention have been shown and described, the scope of the present invention is not limited thereto. It should be understood that the above embodiments are illustrative and are not to be construed as limiting the present invention; those skilled in the art can make changes, modifications, and substitutions to the above embodiments within the scope of the present invention, and such changes, modifications, and substitutions shall be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (11)

1. A vehicle labeling method, characterized in that the method comprises:
determining a vehicle display area of at least one vehicle and a tail lamp display area of a vehicle tail lamp, and labeling a two-dimensional labeling frame for each vehicle display area and each tail lamp display area;
judging whether the vehicle display area displays at least two outer surfaces of the vehicle;
if yes, determining a separation line between the adjacent outer surfaces according to the tail lamp display area;
and generating a three-dimensional labeling frame in the vehicle display area according to the separation line and the two-dimensional labeling frame.
2. The vehicle labeling method of claim 1, wherein the step of determining a separation line between the adjacent outer surfaces according to the tail lamp display area comprises:
judging whether the number of corresponding tail lamp display areas in the vehicle display area is greater than 1;
if the number of the corresponding tail lamp display areas in the vehicle display area is 1, determining a separation line according to the position of the tail lamp display area;
and if the number of the corresponding tail lamp display areas in the vehicle display area is more than 1, calculating the distance between each tail lamp display area and the midpoint of the vehicle display area, and determining a separation line according to the position of the tail lamp display area corresponding to the minimum distance.
3. The vehicle labeling method of claim 2, wherein the step of determining a separation line according to the position of the tail lamp display area comprises:
acquiring the ratio of the longitudinal edge and the transverse edge of the two-dimensional marking frame corresponding to the tail lamp display area, and judging whether the ratio is greater than 1;
if the ratio is less than or equal to 1, determining a separation line according to the midpoint of the tail lamp display area;
and if the ratio is larger than 1, determining a separation line along any longitudinal edge of the tail lamp display area.
4. The vehicle labeling method of claim 1, wherein the step of determining a vehicle display area of at least one vehicle and a tail lamp display area of a vehicle tail lamp, and labeling a two-dimensional labeling frame for each vehicle display area and each tail lamp display area comprises:
the method comprises the steps of performing real-time framing, identifying a framing picture, and determining a to-be-processed area of a vehicle in the framing picture and a tail lamp display area in the to-be-processed area;
judging whether the area to be processed displays a plurality of mutually shielded vehicles or not;
if the area to be processed does not display a plurality of mutually shielded vehicles, setting the area to be processed as a vehicle display area corresponding to a single vehicle, and labeling a two-dimensional labeling frame on the vehicle display area and the tail lamp display area;
and if the area to be processed displays a plurality of mutually shielded vehicles, determining the vehicle display area corresponding to each vehicle in the area to be processed, and respectively labeling the two-dimensional labeling frame on each vehicle display area and the tail lamp display area.
5. The vehicle labeling method of claim 4, wherein the step of judging whether the area to be processed displays a plurality of mutually shielded vehicles comprises:
judging whether the outer contour of the area to be processed is matched with a preset vehicle contour or not;
if the outline of the area to be processed is matched with the outline of a preset vehicle, determining that the area to be processed does not display a plurality of mutually shielded vehicles;
and if the outline of the area to be processed is not matched with the preset vehicle outline, determining that the area to be processed displays a plurality of mutually shielded vehicles.
6. The vehicle labeling method of claim 4, wherein the step of judging whether the area to be processed displays a plurality of mutually shielded vehicles comprises:
determining the number of the tail lamp display areas in the area to be processed;
when the number of the tail lamp display areas in the area to be processed is 1, determining that a plurality of mutually shielded vehicles are not displayed in the area to be processed;
when the number of the tail lamp display areas in the area to be processed is 2, judging whether the two tail lamp display areas are symmetrically displayed tail lamp display areas or not;
if the two tail lamp display areas are symmetrically displayed tail lamp display areas, determining that the area to be processed does not display a plurality of mutually shielded vehicles;
if the two tail lamp display areas are asymmetrically displayed tail lamp display areas, determining that the area to be processed displays a plurality of mutually shielded vehicles;
and when the number of the tail lamp display areas in the area to be processed is larger than 2, determining that the area to be processed displays a plurality of mutually shielded vehicles.
7. The vehicle labeling method of claim 4, wherein the step of determining the vehicle display area corresponding to each vehicle in the area to be processed and labeling each vehicle display area with a two-dimensional labeling frame comprises:
and determining the vehicle display area corresponding to each vehicle in the area to be processed according to the color information of each color block in the vehicle display area, and labeling a two-dimensional labeling frame on each vehicle display area.
8. The vehicle labeling method of claim 4, wherein the step of determining the vehicle display area corresponding to each vehicle in the area to be processed and labeling each vehicle display area with a two-dimensional labeling frame comprises:
acquiring symmetrically displayed tail lamp display areas in each tail lamp display area, and calculating to obtain the vehicle display area of the vehicle corresponding to the tail lamp display area according to a preset mapping table, the distance between the symmetrically displayed tail lamp display areas and the height of the area to be processed;
and determining the vehicle display areas of the shielded vehicles according to the vehicle display areas of the vehicles corresponding to the tail lamp display areas and the to-be-processed areas which are symmetrically displayed, and labeling two-dimensional labeling frames for each vehicle display area.
9. A marking device, comprising:
the identification module is used for determining a vehicle display area of at least one vehicle and a tail lamp display area of a vehicle tail lamp and labeling a two-dimensional labeling frame for each vehicle display area and each tail lamp display area;
the judging module is used for judging whether the vehicle display area displays at least two outer surfaces of the vehicle or not;
the marking module is used for determining a separation line between the adjacent outer surfaces according to the tail lamp display area if the vehicle display area displays at least two outer surfaces of the vehicle; and generating a three-dimensional labeling frame in the vehicle display area according to the separation line and the two-dimensional labeling frame.
10. A computer device, characterized in that the computer device comprises a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the vehicle labeling method according to any one of claims 1 to 8.
11. A computer-readable storage medium, characterized in that a computer program is stored thereon, wherein the computer program, when executed by a processor, implements the steps of the vehicle labeling method according to any one of claims 1 to 8.
CN202010615093.4A 2020-06-30 2020-06-30 Vehicle labeling method and device, computer equipment and readable storage medium Pending CN111767862A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010615093.4A CN111767862A (en) 2020-06-30 2020-06-30 Vehicle labeling method and device, computer equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN111767862A true CN111767862A (en) 2020-10-13

Family

ID=72723414

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010615093.4A Pending CN111767862A (en) 2020-06-30 2020-06-30 Vehicle labeling method and device, computer equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111767862A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110163904A (en) * 2018-09-11 2019-08-23 腾讯大地通途(北京)科技有限公司 Object marking method, control method for movement, device, equipment and storage medium
CN111046743A (en) * 2019-11-21 2020-04-21 新奇点企业管理集团有限公司 Obstacle information labeling method and device, electronic equipment and storage medium
CN111062255A (en) * 2019-11-18 2020-04-24 苏州智加科技有限公司 Three-dimensional point cloud labeling method, device, equipment and storage medium
WO2020088076A1 (en) * 2018-10-31 2020-05-07 阿里巴巴集团控股有限公司 Image labeling method, device, and system

Similar Documents

Publication Publication Date Title
US7936903B2 (en) Method and a system for detecting a road at night
CN109389046B (en) All-weather object identification and lane line detection method for automatic driving
CN109801282A (en) Pavement behavior detection method, processing method, apparatus and system
US8432447B2 (en) Stripe pattern detection system, stripe pattern detection method, and program for stripe pattern detection
EP3398158B1 (en) System and method for identifying target objects
JP2003189293A (en) Device for displaying state of surroundings of vehicle and image-providing system
CN110619674B (en) Three-dimensional augmented reality equipment and method for accident and alarm scene restoration
CN110796104A (en) Target detection method and device, storage medium and unmanned aerial vehicle
JP5423764B2 (en) Moving object detection apparatus, computer program, and moving object detection method
CN108805184B (en) Image recognition method and system for fixed space and vehicle
Muller et al. 3-D reconstruction of a dynamic environment with a fully calibrated background for traffic scenes
CN112529335B (en) Model detection method, device, equipment and storage medium
CN117455762A (en) Method and system for improving resolution of recorded picture based on panoramic automobile data recorder
CN111767862A (en) Vehicle labeling method and device, computer equipment and readable storage medium
CN112580489A (en) Traffic light detection method and device, electronic equipment and storage medium
CN117041484A (en) People stream dense area monitoring method and system based on Internet of things
CN111491103A (en) Image brightness adjusting method, monitoring equipment and storage medium
CN112115737A (en) Vehicle orientation determining method and device and vehicle-mounted terminal
CN115565155A (en) Training method of neural network model, generation method of vehicle view and vehicle
CN112364693B (en) Binocular vision-based obstacle recognition method, device, equipment and storage medium
CN113422915A (en) Monitoring video fusion display method and system
CN113099176A (en) Vehicle-mounted video monitoring equipment and rail transit vehicle
JP2021128532A (en) Image transmission system, image processing system, and image transmission program
JP2003085535A (en) Position recognition method for road guide sign
CN117392634B (en) Lane line acquisition method and device, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination