CN117830412A - Target identification method, device, equipment and storage medium


Info

Publication number
CN117830412A
Authority
CN
China
Prior art keywords
information
target
actual
determining
size
Prior art date
Legal status
Pending
Application number
CN202311869730.0A
Other languages
Chinese (zh)
Inventor
戴源远
展志昊
Current Assignee
Tuogong Nanjing Robot Co ltd
Original Assignee
Tuogong Nanjing Robot Co ltd
Priority date: 2023-12-29
Filing date: 2023-12-29
Publication date: 2024-04-05
Application filed by Tuogong Nanjing Robot Co ltd
Priority to CN202311869730.0A
Publication of CN117830412A


Classifications

    • G06T7/73 — Image analysis; determining position or orientation of objects or cameras using feature-based methods
    • G01S13/867 — Combinations of radar systems with non-radar systems; combination of radar systems with cameras
    • G01S13/933 — Radar or analogous systems specially adapted for anti-collision purposes of aircraft or spacecraft
    • G06T7/60 — Image analysis; analysis of geometric attributes
    • G06T2207/10032 — Image acquisition modality; satellite or aerial image; remote sensing
    • G06T2207/10044 — Image acquisition modality; radar image
    • G06T2207/20084 — Special algorithmic details; artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Geometry (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

An embodiment of the present disclosure discloses a target identification method, device, equipment and storage medium. The method includes: acquiring observation information of a target, where the observation information includes first information collected by a vision sensor and second information collected by a radar; inputting the first information into a neural network model and determining perception information of the target from the output of the neural network model; and, if the second information matches the perception information, determining the actual position and actual size of the target from the second information. In this technical scheme, the information collected by the vision sensor is processed by the neural network to obtain the perception information, and the actual position and size of the target are determined from the perception information and the second information collected by the radar. This solves the problem that a millimeter-wave radar cannot measure the size of an object, can assist the operational decisions of an unmanned aerial vehicle, and provides a reliable basis for path planning.

Description

Target identification method, device, equipment and storage medium
Technical Field
Embodiments of the present disclosure relate to the technical field of target detection, and in particular to a target identification method, device, equipment and storage medium.
Background
Unmanned aerial vehicles (UAVs) often encounter obstacles such as telegraph poles, wires and trees; a collision can damage the airframe. The current mainstream scheme is to equip the UAV with a millimeter-wave radar to detect obstacle targets in its surroundings. The millimeter-wave radar detects a target ahead using electromagnetic waves, which are effectively reflected by obstacles such as telegraph poles; the distance, speed and azimuth of the target can then be calculated using frequency-modulated continuous wave (Frequency Modulated Continuous Wave, FMCW) ranging, multiple-input multiple-output (Multiple-Input Multiple-Output, MIMO) techniques and the Doppler effect.
In practical applications, after encountering an obstacle the UAV needs to plan its future flight path according to the shape of the obstacle target. The horizontal detection angle of a millimeter-wave radar is relatively wide, generally between 90° and 120°, but its vertical detection angle is narrow, generally about 10°, so the UAV cannot perceive an overall view of the surrounding obstacles.
Disclosure of Invention
The embodiments of the disclosure provide a target identification method, device, equipment and storage medium, which solve the problem that a millimeter-wave radar cannot measure the size of an object.
In a first aspect, a method for identifying a target is provided, including:
acquiring observation information of a target, wherein the observation information comprises first information acquired by a vision sensor and second information acquired by a radar;
inputting the first information into a neural network model, and determining perception information of the target according to the output of the neural network model;
and if the second information is matched with the perception information, determining the actual position and the actual size of the target according to the second information.
In a second aspect, there is provided an object recognition apparatus comprising:
the information acquisition module is used for acquiring the observation information of the target, wherein the observation information comprises first information acquired by a vision sensor and second information acquired by a radar;
the information determining module is used for inputting the first information into a neural network model and determining the perception information of the target according to the output of the neural network model;
and the information matching module is used for determining the actual position and the actual size of the target according to the second information if the second information is matched with the perception information.
In a third aspect, an electronic device is provided, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the object recognition method provided in the first aspect above.
In a fourth aspect, a computer-readable storage medium is provided, where the computer-readable storage medium stores computer instructions which, when executed by a controller, cause the controller to implement the target identification method provided in the first aspect of the embodiments of the present disclosure.
The embodiments of the present disclosure provide a target identification method, device, equipment and storage medium. The method includes: acquiring observation information of a target, where the observation information includes first information collected by a vision sensor and second information collected by a radar; inputting the first information into a neural network model and determining perception information of the target from the output of the neural network model; and, if the second information matches the perception information, determining the actual position and actual size of the target from the second information. In this technical scheme, the information collected by the vision sensor is processed by the neural network to obtain the perception information, and the actual position and actual size of the target are determined from the perception information and the second information collected by the radar. Compared with the prior art, this solves the problem that a millimeter-wave radar cannot measure the size of an object, enables the unmanned aerial vehicle system to perceive the width and height of the target object by fusing image features, can assist the operational decisions of the unmanned aerial vehicle, and provides a reliable basis for path planning.
It should be understood that the description in this section is not intended to identify key or critical features of the disclosed embodiments, nor is it intended to be used to limit the scope of the disclosed embodiments. Other features of the embodiments of the present disclosure will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and other drawings may be obtained according to these drawings without inventive effort for a person of ordinary skill in the art.
FIG. 1 is a flow chart of a method for identifying an object according to a first embodiment of the present disclosure;
fig. 2 is a schematic diagram of steps performed by a target recognition method applied to a drone according to an embodiment of the present disclosure;
FIG. 3 is a flowchart of another object recognition method according to a second embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a target recognition device according to a third embodiment of the present disclosure;
fig. 5 presents a schematic view of the structure of an electronic device used to implement an embodiment of the present disclosure.
Detailed Description
In order that those skilled in the art will better understand the aspects of the embodiments of the present disclosure, a technical solution of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments, not all embodiments of the present disclosure. All other embodiments, which may be made by one of ordinary skill in the art without undue burden from the disclosed embodiments, are intended to be within the scope of the disclosed embodiments.
It should be noted that the terms "first," "second," and the like in the description and claims of the embodiments of the present disclosure and the above-described figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the disclosed embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a target recognition method according to a first embodiment of the present disclosure, where the method may be performed by a target recognition device, the target recognition device may be implemented in hardware and/or software, and the target recognition device may be configured in an electronic apparatus. As shown in fig. 1, the method includes:
s110, acquiring observation information of a target, wherein the observation information comprises first information acquired by a vision sensor and second information acquired by a radar.
In this embodiment, the vision sensor and the radar carried by the unmanned aerial vehicle may collect the observation information of the target, where the target may be an object within a range in front of the unmanned aerial vehicle, for example a telegraph pole, an electric wire and/or a tree.
Specifically, the observation information may be information observed by an observation device carried by the unmanned aerial vehicle, where the observation information may include first information and second information, the first information may be information collected by a vision sensor, and the first information may include an image of a target. The second information may be information acquired by a radar, and the second information may include an azimuth and a distance.
S120, inputting the first information into the neural network model, and determining the perception information of the target according to the output of the neural network model.
It should be explained that, after the first information is obtained through the vision sensor, it can be input into the neural network model. The neural network model may be a pre-trained model for identifying perception information. A neural network is an algorithmic mathematical model that simulates the behavioral characteristics of the human nervous system; it is formed by interconnecting an input layer, hidden layers and an output layer, and processes information by adjusting the interconnections among a large number of internal nodes. Such models are currently widely applied in fields such as image recognition, segmentation and reconstruction.
In view of the foregoing, the output of the neural network model may be obtained after it processes the first information. The output includes the reference position coordinates, reference size and category of the target, where the reference size includes a reference width and a reference height; the reference position coordinates may be the position of the target relative to the upper-left corner of the image, and the reference size may be the size of the target within the image. Once the output of the neural network is determined, the perception information of the target can be determined from it, and the perception information may include an azimuth range of the target.
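For concreteness, the model output described above can be represented as a small record. The following is a minimal sketch in which the type name Detection and all field names are illustrative assumptions rather than part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Per-target output of the neural network model (hypothetical field names).

    u_o, v_o: reference position of the target in pixels, relative to the
              upper-left corner of the image.
    w_o, h_o: reference width and height of the target in pixels.
    category: predicted class label, e.g. "pole", "wire", "tree".
    """
    u_o: float
    v_o: float
    w_o: float
    h_o: float
    category: str
```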
And S130, if the second information is matched with the perception information, determining the actual position and the actual size of the target according to the second information.
It can be seen that, after the perception information is calculated by the neural network model, the second information can be matched against it. When the second information matches the perception information, the actual position and actual size of the target may be determined from the second information, where the actual position may be the real-world coordinates of the target relative to the unmanned aerial vehicle, and the actual size may be the real-world size of the target.
The embodiment provides a target identification method, which comprises the following steps: acquiring observation information of a target, wherein the observation information comprises first information acquired by a vision sensor and second information acquired by a radar; inputting the first information into a neural network model, and determining perception information of the target according to the output of the neural network model; and if the second information is matched with the perception information, determining the actual position and the actual size of the target according to the second information. The technical scheme solves the problem that the millimeter wave radar cannot measure the object size, can assist the operation decision of the unmanned aerial vehicle, and provides a reliable basis for path planning.
As an optional embodiment of the present embodiment, the second information includes an azimuth angle, and the target identifying method provided in the present embodiment may further include:
and if the azimuth angle is in the azimuth angle range, determining that the second information is matched with the perception information.
It should be explained that the azimuth range of the target can be obtained through the neural network model calculation; when the azimuth angle acquired by the radar falls within the azimuth range obtained through the neural network calculation, it can be determined that the second information matches the perception information.
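Under the same illustrative naming, the matching test described above reduces to a simple interval check. This is a sketch under the assumptions stated so far, not the disclosed implementation:

```python
def matches(radar_azimuth_deg: float,
            azimuth_range_deg: tuple[float, float]) -> bool:
    """Return True if the radar-measured azimuth falls inside the
    azimuth range (A_o1, A_o2) computed from the camera detection."""
    a_o1, a_o2 = azimuth_range_deg
    return a_o1 <= radar_azimuth_deg <= a_o2
```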
Fig. 2 is a schematic diagram of the execution steps of a target identification method applied to an unmanned aerial vehicle according to this embodiment. As shown in Fig. 2, the millimeter-wave radar detects the distance and azimuth of a target, while the camera captures the shape features of the target; fusing the data of the millimeter-wave radar and the camera produces a more comprehensive environmental model, solves the problem that the millimeter-wave radar cannot measure the size of an object, and enables the unmanned aerial vehicle system to perceive the width and height of the target by fusing image features, providing a reliable basis for path planning.
Example two
Fig. 3 is a flowchart of another object recognition method according to the second embodiment of the present disclosure. The embodiment of the present disclosure is a further optimization and expansion of the first embodiment. As shown in fig. 3, the method includes:
s210, acquiring observation information of a target, wherein the observation information comprises first information acquired by a vision sensor and second information acquired by a radar.
The vision sensor may be a camera, which can acquire a frame of image containing the target object; the second information acquired by the radar may include a distance D_o and an azimuth angle A_o.
S220, inputting the first information into the neural network model, and calculating an azimuth angle range according to the reference position coordinates and reference size of the target output by the neural network and the field of view of the vision sensor.
In this embodiment, after the first information is collected by the vision sensor, the first information may be input into the neural network, to obtain the reference position coordinate, the reference size and the category output by the neural network. The range of azimuth angles may be calculated from the reference position coordinates output from the neural network, the reference size, and the field of view (FOV) of the vision sensor.
The reference position coordinates of the target output by the neural network may be (U_o, V_o), the reference width of the target W_o, and the reference height H_o; the field of view of the vision sensor may be A_fov. The azimuth range of the target can then be calculated. Assuming U_o is measured in pixels from the left edge of the frame and a linear pixel-to-angle mapping, a form consistent with the position formulas below is:

A_o1 = (U_o - 0.5 * W) / W * A_fov
A_o2 = (U_o + W_o - 0.5 * W) / W * A_fov

where W may represent the frame width of the lens in pixels. The azimuth range of the target is thus obtained as (A_o1, A_o2).
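A minimal sketch of this calculation, under the same pixel-from-left-edge assumption stated above (all function and parameter names are illustrative):

```python
def azimuth_range(u_o: float, w_o: float,
                  frame_width_px: float,
                  a_fov_deg: float) -> tuple[float, float]:
    """Map the target's horizontal pixel extent [u_o, u_o + w_o] to an
    azimuth interval (A_o1, A_o2) about the optical axis, assuming a
    linear pixel-to-angle mapping across the horizontal field of view."""
    deg_per_px = a_fov_deg / frame_width_px
    a_o1 = (u_o - 0.5 * frame_width_px) * deg_per_px
    a_o2 = (u_o + w_o - 0.5 * frame_width_px) * deg_per_px
    return a_o1, a_o2
```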
And S230, if the second information is matched with the range of the azimuth angle, determining the unit pixel size of the target according to the distance, wherein the second information comprises the distance.
Specifically, the second information acquired by the radar may include the azimuth angle and the distance D_o of the target. When the azimuth angle measured by the radar matches the azimuth range calculated by the neural network, that is, when it falls within that range, the unit pixel size of the target can be determined according to the distance of the target.
S240, determining the actual position and the actual size of the target according to the unit pixel size.
In this embodiment, after the unit pixel size is determined, the actual position and actual size of the target can be determined from it. The actual size includes an actual width and an actual height. Each coordinate of the actual position is the product of the unit pixel size and the difference between half of the corresponding frame dimension and the corresponding reference position coordinate; the actual width is the product of the reference width and the unit pixel size; and the actual height is the product of the reference height and the unit pixel size.
For example, taking the unmanned aerial vehicle as the coordinate origin, the calculation formula for the actual width of the target is:

w = W_o * PS

where w is the actual width of the target and W_o is the reference width of the target.

The calculation formula for the actual height of the target is:

h = H_o * PS

where h is the actual height of the target and H_o is the reference height of the target.

The actual position (X_o, Y_o) of the target is calculated as:

X_o = (0.5 * W - U_o) * PS
Y_o = (0.5 * H - V_o) * PS

where W represents the width of the lens frame in pixels and H represents its height in pixels.
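These formulas can be sketched in code as follows; the names are illustrative, and ps denotes the unit pixel size PS obtained in the next step from the radar-measured distance:

```python
def actual_size_and_position(u_o: float, v_o: float,
                             w_o: float, h_o: float,
                             frame_w_px: float, frame_h_px: float,
                             ps: float) -> tuple[float, float, float, float]:
    """Convert the pixel-space detection into real-world size and position,
    with the UAV as the coordinate origin. ps is the unit pixel size
    (real-world length per pixel at the target's distance)."""
    width = w_o * ps                    # w = W_o * PS
    height = h_o * ps                   # h = H_o * PS
    x = (0.5 * frame_w_px - u_o) * ps   # X_o = (0.5*W - U_o) * PS
    y = (0.5 * frame_h_px - v_o) * ps   # Y_o = (0.5*H - V_o) * PS
    return width, height, x, y
```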
The technical solution of the disclosed embodiments first obtains observation information of a target, the observation information including first information collected by a vision sensor and second information collected by a radar. And inputting the first information into a neural network model, and calculating the azimuth angle range according to the reference position coordinates of the target output by the neural network, the field angle of the vision sensor and the reference size of the target. The second information includes a distance from which a unit pixel size of the target is determined. And determining the actual position and the actual size of the target according to the unit pixel size. The technical scheme solves the problem that the millimeter wave radar cannot measure the object size, can assist the operation decision of the unmanned aerial vehicle, and provides a reliable basis for path planning.
Optionally, determining the unit pixel size of the target according to the distance includes:
dividing the distance by the focal length of the vision sensor and multiplying the focal length by the pixel size to obtain the unit pixel size of the target.
Specifically, the unit pixel size can be calculated as:

PS = D / F * PU

where F is the focal length (mm), PU is the pixel size (µm), and D is the object distance; in this embodiment, the object distance D may be approximated by the radar-measured distance D_o.
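A sketch of this step with explicit unit handling, which the formula leaves implicit (the names and the metric output unit are assumptions):

```python
def unit_pixel_size(distance_m: float, focal_length_mm: float,
                    pixel_size_um: float) -> float:
    """PS = D / F * PU: the real-world length covered by one pixel at the
    target's distance. Converts mm and µm so the result is in meters,
    with distance_m approximating the radar-measured D_o."""
    focal_length_m = focal_length_mm * 1e-3
    pixel_size_m = pixel_size_um * 1e-6
    return distance_m / focal_length_m * pixel_size_m

# Example usage with hypothetical values: a 35 mm lens, 3.45 µm pixels,
# and a radar-measured distance of 20 m.
ps = unit_pixel_size(distance_m=20.0, focal_length_mm=35.0, pixel_size_um=3.45)
```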
Example III
Fig. 4 is a schematic structural diagram of a target recognition device according to a third embodiment of the present disclosure. As shown in fig. 4, the apparatus includes: an information acquisition module 310, an information determination module 320, and an information matching module 330.
The information acquisition module 310 is configured to acquire observation information of a target, where the observation information includes first information acquired by a vision sensor and second information acquired by a radar;
an information determining module 320, configured to input the first information into a neural network model, and determine perception information of the target according to an output of the neural network model;
and the information matching module 330 is configured to determine an actual position and an actual size of the target according to the second information if the second information matches the sensing information.
The third embodiment of the disclosure provides a target recognition device, which solves the problem that the millimeter wave radar cannot measure the object size, can assist the operation decision of the unmanned aerial vehicle, and provides a reliable basis for path planning.
Further, the first information includes an image including the target;
the output of the neural network model includes reference position coordinates, reference dimensions, and a class of the target, the reference dimensions including a reference width and a reference height.
Further, the perception information includes an azimuth range of the target; the information determining module 320 may be further configured to:
and calculating the azimuth angle range according to the reference position coordinates of the target, the field angle of the vision sensor and the reference size of the target.
Optionally, the second information includes an azimuth; the apparatus further comprises:
and the confirmation module is used for determining that the second information is matched with the perception information if the azimuth angle is in the azimuth angle range.
Further, the second information includes a distance; the information matching module 330 may further include:
a unit pixel size determining unit configured to determine a unit pixel size of the target according to the distance;
an actual position and actual size determining unit configured to determine an actual position and an actual size of the target according to the unit pixel size.
Alternatively, the unit pixel size determining unit may be further configured to:
dividing the distance by the focal length of the vision sensor and multiplying the focal length by the pixel size to obtain the unit pixel size of the target.
Optionally, the actual size includes an actual width and an actual height; each coordinate of the actual position is the product of the unit pixel size and the difference between half of the corresponding frame dimension and the corresponding reference position coordinate;
the actual width is the product of the reference width and the unit pixel size;
the actual height is the product of the reference height and the unit pixel size.
The object recognition device provided by the embodiment of the disclosure can execute the object recognition method provided by any embodiment of the disclosure, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 5 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the embodiments of the disclosure described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be any of various general- and/or special-purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, microprocessor, etc. The processor 11 performs the various methods and processes described above, such as the target identification method.
In some embodiments, the object recognition method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the object recognition method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the object recognition method in any other suitable way (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuit systems, Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of embodiments of the present disclosure may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the disclosed embodiments, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server (also called a cloud computing server or cloud host), a host product in a cloud computing service system that overcomes the defects of high management difficulty and weak service scalability of traditional physical hosts and Virtual Private Server (VPS) services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the embodiments of the present disclosure may be performed in parallel, may be performed sequentially, or may be performed in a different order, so long as the desired result of the technical solution of the embodiments of the present disclosure is achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the embodiments of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the embodiments of the present disclosure are intended to be included within the scope of the embodiments of the present disclosure.

Claims (10)

1. A method of target identification, comprising:
acquiring observation information of a target, wherein the observation information comprises first information acquired by a vision sensor and second information acquired by a radar;
inputting the first information into a neural network model, and determining perception information of the target according to the output of the neural network model;
and if the second information is matched with the perception information, determining the actual position and the actual size of the target according to the second information.
2. The method of claim 1, wherein:
the first information includes an image including the target;
the output of the neural network model includes reference position coordinates, reference dimensions, and a class of the target, the reference dimensions including a reference width and a reference height.
3. The method of claim 2, wherein the perception information comprises an azimuthal range of the target;
determining perception information of the target according to the output of the neural network model, including:
and calculating the azimuth angle range according to the reference position coordinates of the target, the field angle of the vision sensor and the reference size of the target.
4. The method of claim 2, wherein the second information comprises an azimuth angle;
the method further comprises the steps of:
and if the azimuth angle is in the azimuth angle range, determining that the second information is matched with the perception information.
5. The method of claim 2, wherein the second information comprises a distance;
determining the actual position and the actual size of the target according to the second information, including:
determining a unit pixel size of the target according to the distance;
and determining the actual position and the actual size of the target according to the unit pixel size.
6. The method of claim 5, wherein determining a unit pixel size of the target based on the distance comprises:
dividing the distance by the focal length of the vision sensor and multiplying the result by the pixel size to obtain the unit pixel size of the target.
7. The method of claim 5, wherein the actual dimensions comprise an actual width and an actual height;
the determining the actual position and the actual size of the target according to the unit pixel size includes:
each coordinate of the actual position is the product of the unit pixel size and the difference between half of the corresponding frame dimension and the corresponding reference position coordinate;
the actual width is the product of the reference width and the unit pixel size;
the actual height is the product of the reference height and the unit pixel size.
8. An object recognition apparatus, comprising:
the information acquisition module is used for acquiring the observation information of the target, wherein the observation information comprises first information acquired by a vision sensor and second information acquired by a radar;
the information determining module is used for inputting the first information into a neural network model and determining the perception information of the target according to the output of the neural network model;
and the information matching module is used for determining the actual position and the actual size of the target according to the second information if the second information is matched with the perception information.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the object recognition method according to any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the object recognition method as claimed in any one of claims 1-7.
CN202311869730.0A — filed 2023-12-29 — Target identification method, device, equipment and storage medium — published as CN117830412A (pending)

Priority Applications (1)

CN202311869730.0A — priority date 2023-12-29, filing date 2023-12-29 — Target identification method, device, equipment and storage medium

Publications (1)

CN117830412A — published 2024-04-05

Family

ID=90511270

Country Status (1)

CN — CN117830412A


Legal Events

PB01 — Publication
SE01 — Entry into force of request for substantive examination