WO2022148068A1 - Vehicle detection method and vehicle detection device - Google Patents

Vehicle detection method and vehicle detection device

Info

Publication number
WO2022148068A1
WO2022148068A1 (PCT/CN2021/120343)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
detected
component
image
server
Prior art date
Application number
PCT/CN2021/120343
Other languages
English (en)
French (fr)
Inventor
林忠能
许国华
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2022148068A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/006 Indicating maintenance
    • G07C5/008 Registering or indicating the working of vehicles communicating information to a remotely located station
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Definitions

  • the present application relates to the field of computer technology, and in particular, to a vehicle detection method and a vehicle detection device.
  • in a conventional approach, the vehicle to be detected is photographed by a camera, and the fault condition of the vehicle is then judged from the photographed image. This method cannot accurately obtain the fault condition of the component to be detected of the vehicle to be detected.
  • Embodiments of the present application provide a vehicle detection method and a vehicle detection device, wherein the vehicle detection device includes a server, a vehicle to be detected, and a first vehicle.
  • the server may, in response to a detection request for a component to be detected of the vehicle to be detected, determine the first vehicle according to the position of the vehicle to be detected, and then send a first work instruction to the vehicle to be detected and a second work instruction to the first vehicle. The first work instruction is used to instruct the component to be detected to be turned on at least once within a target time period; the second work instruction is used to instruct the first vehicle to capture a first image within the target time period and send the first image to the server. Finally, the server identifies the fault condition of the component to be detected from the first image.
  • in this way, the server can control the vehicle to be detected and the first vehicle to cooperate with each other to determine the fault condition of the component to be detected of the vehicle to be detected, thereby improving detection accuracy.
  • an embodiment of the present application provides a vehicle detection method applied to a server, the method including:
  • determining a first vehicle according to position information of the vehicle to be detected, where the distance between the first vehicle and the vehicle to be detected is less than a target threshold; sending a first work instruction to the vehicle to be detected, where the first work instruction is used to instruct the component to be detected of the vehicle to be detected to be turned on at least once within a target time period; sending a second work instruction to the first vehicle, where the second work instruction is used to instruct the first vehicle to capture a first image within the target time period and send the first image to the server; and identifying the fault condition of the component to be detected from the first image.
  • the server may receive a detection request for the component to be detected of the vehicle to be detected, where the detection request is used to instruct the server to send the first work instruction to the vehicle to be detected and to send the second work instruction to the first vehicle.
  • the server may simultaneously acquire detection requests from multiple vehicles to be detected, and determine a detection order for these detection requests according to the number and positions of the first vehicles around each vehicle to be detected.
  • the server obtains the path plan of each of at least one vehicle and the path plan of the vehicle to be detected, where the distance between each of the at least one vehicle and the vehicle to be detected is less than the target threshold; screens out, from the at least one vehicle, vehicles whose path plans partially coincide with the path plan of the vehicle to be detected; and determines the first vehicle from the screened vehicles.
  • on the coincident portion of the paths, the path plans of the screened vehicle and of the vehicle to be detected are straight paths within the target time period.
  • a straight road here is a road section whose lane lines do not intersect with lane lines in other directions.
  • the server may screen out, as the first vehicle, a vehicle whose path plan coincides with that of the vehicle to be detected and is a straight path within the target time period, so that executing the vehicle detection method does not affect the first vehicle's original path plan.
  • alternatively, the server may screen out, as the first vehicle according to the component to be detected, a vehicle whose path plan coincides with that of the vehicle to be detected and is a curve within the target time period. For example, when the component to be detected is the right turn signal, the server may screen out, as the first vehicle, a vehicle whose path plan coincides with that of the vehicle to be detected and includes a right-turn curve ahead within the target time period. It can be understood that this prevents the vehicle from turning on the turn signals on a straight road section while waiting for detection, which could affect the normal driving of other vehicles.
  • the server sends a photographing instruction to the screened vehicles, where the photographing instruction is used to instruct each screened vehicle to capture a second image; according to the second images and the identification of the vehicle to be detected, the first vehicle is determined from the screened vehicles, where the second image captured by the first vehicle includes the identification.
  • when the component to be detected is located on the front side of the vehicle to be detected, the first vehicle is located in front of the vehicle to be detected; when the component to be detected is located on the rear side of the vehicle to be detected, the first vehicle is located behind the vehicle to be detected.
  • the method further includes: determining the shooting angle of the camera, where the second work instruction carries the shooting angle.
  • the server identifies the component to be detected from the first image; when the average gray value of the component to be detected in the first image is within a target range, it is determined that the component to be detected is not faulty.
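  • the gray-value check could look like the following sketch, assuming the first image is available as a 2-D array of gray values and the server already knows a bounding box for the component region (both assumptions for illustration, not fixed by the embodiment):

```python
def component_not_faulty(gray_image, bbox, target_range):
    """Return True when the average gray value of the component region
    falls within the target range (component judged to be working).
    gray_image: 2-D list of gray values; bbox: (row0, row1, col0, col1)."""
    r0, r1, c0, c1 = bbox
    region = [px for row in gray_image[r0:r1] for px in row[c0:c1]]
    avg = sum(region) / len(region)
    lo, hi = target_range
    return lo <= avg <= hi

# A lit lamp region is bright; suppose the target range for "on" is (180, 255)
image = [[30, 30, 200, 210],
         [30, 30, 220, 190],
         [30, 30, 205, 215]]
print(component_not_faulty(image, (0, 3, 2, 4), (180, 255)))  # True (avg ≈ 206.7)
print(component_not_faulty(image, (0, 3, 0, 2), (180, 255)))  # False (avg = 30)
```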
  • the server identifies the position of the component to be detected from the first image; when the position of the component to be detected in the first image is at the target position, it is determined that the component to be detected is not faulty.
  • the first image includes multiple frames of images
  • the server identifies the position of the component to be detected in each frame of the first image; when the positions of the component to be detected in the multiple frames are inconsistent (for example, a working wiper appears at different positions across frames), it is determined that the component to be detected is not faulty.
  • the first image includes multiple frames of images
  • the server identifies the component to be detected in each frame of the multi-frame image, and identifies the working state of the component to be detected from its gray value; if the change pattern of the working state identified from the multi-frame images conforms to a preset pattern, it is determined that the component to be detected is not faulty, where the first work instruction is used to instruct the component to be detected to work in the preset pattern.
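  • the preset-pattern check might be sketched as follows, assuming the per-frame working state is obtained by thresholding the average gray value of the component region (an illustrative assumption; the embodiment does not fix the extraction method):

```python
def states_from_frames(frame_gray_means, on_threshold=128):
    """Convert per-frame average gray values of the component region
    into binary on/off working states."""
    return [1 if g >= on_threshold else 0 for g in frame_gray_means]

def matches_preset_pattern(states, preset):
    """The component is judged not faulty when the observed state sequence
    contains the preset on/off pattern (allowing an arbitrary start offset)."""
    n = len(preset)
    return any(states[i:i + n] == preset for i in range(len(states) - n + 1))

# First work instruction commands a blink pattern on-off-on-off;
# the observed frames follow it, so the turn signal is judged not faulty.
observed = states_from_frames([200, 40, 210, 35, 190])
print(matches_preset_pattern(observed, [1, 0, 1, 0]))  # True
```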
  • an embodiment of the present application provides a vehicle detection method applied to a vehicle to be detected, the method including:
  • receiving a first work instruction sent by a server, where the server is further configured to send a second work instruction to a first vehicle, the distance between the first vehicle and the vehicle to be detected is less than a target threshold, and the second work instruction is used to instruct the first vehicle to capture a first image within a target time period and send the first image to the server, the first image being used to identify the fault condition of the component to be detected of the vehicle to be detected;
  • the component to be detected is controlled to be turned on at least once within the target time period.
  • the vehicle to be detected may send, to the server, a detection request for the component to be detected, where the detection request is used to instruct the server to send the first work instruction to the vehicle to be detected and to send the second work instruction to the first vehicle.
  • according to the first work instruction, the vehicle to be detected controls the component to be detected to work in a preset pattern within the target time period.
  • the vehicle to be detected controls the component to be detected to be in a working state within the target time period.
  • the vehicle to be detected turns on the component to be detected at a first time point and turns it off at a second time point, where the first time point and the second time point are two time points within the target time period.
  • an embodiment of the present application provides a vehicle detection method applied to a first vehicle, the method including:
  • receiving a second work instruction sent by a server, where the server is further configured to send a first work instruction to the vehicle to be detected, the first work instruction is used to instruct the component to be detected of the vehicle to be detected to be turned on at least once within a target time period, and the distance between the first vehicle and the vehicle to be detected is less than a target threshold;
  • capturing a first image within the target time period;
  • sending the first image to the server, where the first image is used to identify the fault condition of the component to be detected.
  • the second work instruction is generated by the server in response to a detection request, sent by the vehicle to be detected, for the component to be detected; the detection request is further used to instruct the server to send the first work instruction to the vehicle to be detected.
  • before receiving the second work instruction sent by the server, the first vehicle receives a photographing instruction sent by the server and captures a second image; the second image, which includes the identification of the vehicle to be detected, is sent to the server and is used by the server to determine the first vehicle.
  • the second work instruction carries a shooting angle, and the first vehicle may control the camera to capture the first image at that shooting angle, where the shooting angle is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle, and the component to be detected.
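  • one plausible way for the server to derive the shooting angle from the two vehicle positions is a bearing computation such as the sketch below; the planar coordinates and the atan2-based formula are illustrative assumptions, not taken from the embodiment:

```python
import math

def shooting_angle(first_vehicle_pos, target_vehicle_pos, first_vehicle_heading_deg=0.0):
    """Angle (degrees) the first vehicle's camera should turn, relative to its
    heading, to aim at the vehicle to be detected. Positions are (x, y) in a
    plane with the x-axis pointing along heading 0."""
    dx = target_vehicle_pos[0] - first_vehicle_pos[0]
    dy = target_vehicle_pos[1] - first_vehicle_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Normalize the relative angle into (-180, 180]
    return (bearing - first_vehicle_heading_deg + 180.0) % 360.0 - 180.0

# Vehicle to be detected directly ahead: no camera rotation needed
print(shooting_angle((0.0, 0.0), (20.0, 0.0)))   # 0.0
# Vehicle to be detected ahead-left: rotate the camera about 45 degrees left
print(shooting_angle((0.0, 0.0), (10.0, 10.0)))  # ≈ 45.0
```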
  • the second work instruction carries a lane, and the first vehicle may switch to that lane, where the lane is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle, and the component to be detected.
  • the second work instruction carries speed information. For example, when the vehicle to be detected switches to the lane on the right of the first vehicle, if the first vehicle has finished photographing the components to be detected on the left side of the vehicle to be detected and finds another vehicle ahead of it, the first vehicle may slow down to make room, so that the vehicle to be detected can switch into the first vehicle's lane and then further switch to the lane on its left.
  • the server may determine a plurality of first vehicles according to the vehicles to be detected, so as to complete the detection of the above vehicles to be detected.
  • the server may determine the vehicle to be detected according to the first instruction, and then determine the first vehicle according to the vehicle to be detected.
  • the server may issue the second work instruction to multiple first vehicles at the same time, allocate different ranges of components to be detected to different first vehicles in different time periods, and have the multiple first vehicles cooperate to detect the vehicle to be detected simultaneously or in a time-sharing manner.
  • for example, the server may screen out three first vehicles: one ahead in the same lane as the vehicle to be detected, one behind it, and one behind it in the left lane. The server may issue detection of the wiper and water-spray components to the first vehicle ahead, detection of the hazard-light components to the first vehicle behind, and detection of the left window component to the first vehicle behind on the left, and at the same time send an instruction carrying vehicle speed information, which is used to make all the first vehicles maintain the same speed as the vehicle to be detected and carry out the detections simultaneously.
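  • the allocation described above can be sketched as a simple mapping from relative position to detection tasks; the position labels, component names, and speed units below are illustrative only:

```python
def allocate_detection_tasks(first_vehicles, target_speed):
    """Assign ranges of components to be detected to screened first vehicles
    by their relative position, and attach the common speed all must hold."""
    task_by_position = {
        "front":     ["wiper", "water_spray"],
        "rear":      ["hazard_lights"],
        "rear_left": ["left_window"],
    }
    instructions = []
    for v in first_vehicles:
        tasks = task_by_position.get(v["relative_position"], [])
        instructions.append({"vehicle_id": v["id"],
                             "components": tasks,
                             "speed": target_speed})
    return instructions

fleet = [{"id": "v1", "relative_position": "front"},
         {"id": "v2", "relative_position": "rear"},
         {"id": "v3", "relative_position": "rear_left"}]
for instr in allocate_detection_tasks(fleet, 60):
    print(instr)
```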
  • the server may send a brake light detection instruction to the first vehicle behind. The brake light detection instruction may carry deceleration information, where the deceleration information is used to widen the distance between the first vehicle and the vehicle to be detected, so that the vehicle to be detected has room to perform the braking action.
  • an embodiment of the present application provides a vehicle detection device applied to a server, the device including:
  • a determining unit, configured to determine a first vehicle according to the position information of the vehicle to be detected, where the distance between the first vehicle and the vehicle to be detected is less than a target threshold;
  • a sending unit, configured to send a first work instruction to the vehicle to be detected, where the first work instruction is used to instruct the component to be detected of the vehicle to be detected to be turned on at least once within a target time period; the sending unit is further configured to send a second work instruction to the first vehicle, where the second work instruction is used to instruct the first vehicle to capture a first image within the target time period and send the first image to the server;
  • an identifying unit, configured to identify the fault condition of the component to be detected from the first image.
  • the vehicle detection device may further include a receiving unit, configured to receive a detection request for the component to be detected of the vehicle to be detected, where the detection request is used to instruct the server to send the first work instruction to the vehicle to be detected and to send the second work instruction to the first vehicle.
  • the vehicle detection device may further include a storage unit for storing data or computer instructions.
  • the receiving unit or the transmitting unit may be a transceiver or an interface
  • the storage unit may be a memory
  • the determining unit or the identifying unit may be a processor
  • an embodiment of the present application provides a vehicle detection device applied to a vehicle to be detected, the device including:
  • a receiving unit, configured to receive a first work instruction sent by the server, where the server is further configured to send a second work instruction to the first vehicle, the distance between the first vehicle and the vehicle to be detected is less than a target threshold, and the second work instruction is used to instruct the first vehicle to capture a first image within the target time period and send the first image to the server, the first image being used to identify the fault condition of the component to be detected of the vehicle to be detected;
  • a control unit, configured to control the component to be detected to be turned on at least once within the target time period according to the first work instruction.
  • the vehicle detection device may further include a sending unit, configured to send, to the server, a detection request for the component to be detected of the vehicle to be detected, where the detection request is used to instruct the server to send the first work instruction to the vehicle to be detected and to send the second work instruction to the first vehicle.
  • the vehicle detection device may further include a storage unit for storing data or computer instructions.
  • the receiving unit or the transmitting unit may be a transceiver or an interface
  • the storage unit may be a memory
  • the control unit may be a processor
  • an embodiment of the present application provides a vehicle detection device applied to a first vehicle, the device including:
  • a receiving unit, configured to receive a second work instruction sent by a server, where the server is further configured to send a first work instruction to the vehicle to be detected, the first work instruction is used to instruct the component to be detected of the vehicle to be detected to be turned on at least once within a target time period, and the distance between the first vehicle and the vehicle to be detected is less than a target threshold;
  • a photographing unit configured to photograph a first image within the target time period according to the second work instruction
  • a sending unit configured to send the first image to the server, where the first image is used to identify a fault condition of the component to be detected.
  • the vehicle detection device may further include a storage unit for storing data or computer instructions.
  • the receiving unit or the transmitting unit may be a transceiver or an interface
  • the storage unit may be a memory
  • the photographing unit may be a processor
  • an embodiment of the present application provides an electronic device, including one or more processors and a memory; the memory is coupled to the one or more processors and is configured to store computer program code, the computer program code including computer instructions; the one or more processors invoke the computer instructions to cause the electronic device to perform the method of any one of the first, second, or third aspects.
  • an embodiment of the present application provides an electronic device, which may include a processor and a communication interface.
  • the processor obtains program instructions through the communication interface, and when the program instructions are executed by the processor, the method of any one of the first, second, or third aspects is implemented.
  • an embodiment of the present application provides an electronic device, the electronic device may include a processing circuit, and the processing circuit is configured to execute the method described in any one of the first aspect, the second aspect, or the third aspect.
  • an embodiment of the present application provides a chip applied to an electronic device; the chip includes one or more processors, and the processor is configured to invoke computer instructions to cause the electronic device to perform the method of any one of the first, second, or third aspects.
  • in the process of executing the method of the first, second, or third aspect, sending an instruction in the method can be understood as the processor outputting the instruction, and receiving an instruction can be understood as the processor receiving the instruction as input.
  • when outputting an instruction, the processor outputs it to the transceiver for transmission; after the instruction is output by the processor, it may require further processing before reaching the transceiver. Similarly, when the processor receives an instruction as input, the transceiver receives the instruction and inputs it to the processor; after the transceiver receives the instruction, the instruction may require further processing before being input to the processor.
  • the above-mentioned processor may be a processor specially used to execute these methods, or may be a processor that executes computer instructions in a memory to execute these methods, such as a general-purpose processor.
  • the above-mentioned memory may be a non-transitory memory, such as a read-only memory (ROM); the memory and the processor may be integrated on the same chip or disposed on different chips.
  • the embodiment does not limit the type of the memory and the setting manner of the memory and the processor.
  • the chip system may further include a memory for storing necessary program instructions and data of the vehicle.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • the chip system may include an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or other programmable logic devices.
  • the chip system may further include interface circuits and the like.
  • the memory is located inside the processor; or the memory is located outside the processor.
  • an embodiment of the present application further provides a processor, configured to execute the method described in the first aspect or the second aspect or the third aspect.
  • an embodiment of the present application provides a computer program product including instructions; when the computer program product runs on an electronic device, the electronic device is caused to perform the method of any one of the first, second, or third aspects.
  • an embodiment of the present application provides a computer-readable storage medium including instructions; when the instructions run on an electronic device, the electronic device is caused to perform the method of any one of the first, second, or third aspects.
  • FIG. 1 is a schematic diagram of the architecture of a vehicle detection system provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of the architecture of another vehicle detection system provided by an embodiment of the present application.
  • FIG. 3 is a functional block diagram of a vehicle 002 according to an embodiment of the present application.
  • FIG. 4 is a flowchart of a vehicle detection method provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a vehicle detection method provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a plurality of first vehicles according to an embodiment of the present application.
  • FIG. 7 is a flowchart of a method for determining a first vehicle according to an embodiment of the present application.
  • FIG. 8A is a schematic diagram of determining a vehicle in a first area according to an embodiment of the present application.
  • FIG. 8B is a schematic diagram of determining a second vehicle according to an embodiment of the present application.
  • 8C is a schematic diagram of determining a first vehicle from a second vehicle according to an embodiment of the present application.
  • FIG. 9 is a flowchart of a method for determining a first vehicle according to a second image provided by an embodiment of the present application.
  • FIG. 10 is a hardware structure diagram of a server provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a vehicle detection device provided by an embodiment of the application.
  • FIG. 12 is a schematic structural diagram of a vehicle detection device provided by an embodiment of the application.
  • FIG. 13 is a schematic structural diagram of a vehicle detection device provided by an embodiment of the application.
  • FIG. 1 is a schematic structural diagram of a vehicle detection system provided by an embodiment of the present application.
  • the system architecture includes a server 10, a vehicle to be detected 20, and a first vehicle 30, where the first vehicle 30 may include multiple vehicles. Specifically:
  • the vehicle to be detected 20 may send, to the server 10, a detection request for the component to be detected of the vehicle to be detected 20; accordingly, in response to the detection request, the server 10 determines the first vehicle 30 according to the position of the vehicle to be detected 20. For example, the server 10 may determine a vehicle whose distance from the vehicle to be detected 20 is less than the target threshold as the first vehicle 30.
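  • one plausible way for the server 10 to pick vehicles within the target threshold from reported GPS coordinates is the haversine great-circle distance, as in this illustrative sketch (the coordinates and threshold below are made up, not from the embodiment):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_target_threshold(pos_a, pos_b, threshold_m):
    """True when the two vehicles are closer than the target threshold."""
    return haversine_m(*pos_a, *pos_b) < threshold_m

# Two points roughly 111 m apart along a meridian, threshold 150 m
print(within_target_threshold((31.2304, 121.4737), (31.2314, 121.4737), 150.0))  # True
```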
  • the server 10 may send the first work instruction to the vehicle to be detected 20 and the second work instruction to the first vehicle 30 .
  • the vehicle to be detected 20 turns on the component to be detected at least once within the target time period according to the first work instruction; the first vehicle 30 captures a first image according to the second work instruction and then sends the first image to the server 10.
  • the server 10 can identify the failure condition of the component to be inspected from the first image.
  • the server 10 may also send a photographing instruction to each vehicle whose distance from the vehicle to be detected 20 is less than the target threshold, where the photographing instruction is used to instruct the vehicle receiving it to capture a second image and send the second image to the server 10. The server 10 may then determine, as the first vehicle 30, a vehicle whose captured second image contains the identification of the vehicle to be detected 20.
  • the vehicle 20 to be detected and the first vehicle 30 in the embodiment of the present application may be the vehicle 002, as shown in FIG. 3 .
  • Vehicle 002 is a car that senses the road environment through an on-board sensing system, automatically plans the driving route, and controls itself to reach the predetermined destination.
  • Smart cars use technologies such as computer, modern sensing, information fusion, communication, artificial intelligence and automatic control.
  • the intelligent vehicle in the present application may be a vehicle whose intelligent driving capability is mainly based on a computer system, where the intelligent driving system enables the vehicle to realize unmanned driving; it may instead have an assisted driving system or a fully automatic driving system, and it may also be a wheeled mobile robot.
  • the server 10 in this embodiment of the present application may be implemented by an independent server or a server cluster composed of multiple servers.
  • in another embodiment, the vehicle detection system architecture includes a server, a vehicle to be detected, a first vehicle, and a roadside camera, where the server can also interact with the roadside camera. Specifically, when the vehicle to be detected and the first vehicle pass through the road section where the roadside camera is located, the server may send an instruction to the roadside camera to start shooting and upload the captured image to the server. Finally, the server may identify the fault condition of the component to be detected of the vehicle to be detected according to the image captured by the roadside camera and the first image.
  • the architecture of the vehicle detection system in FIG. 1 or FIG. 2 is only an exemplary implementation in the embodiments of the present application; the architecture of the vehicle detection system in the embodiments of the present application includes but is not limited to the above vehicle detection system architectures.
  • FIG. 3 is a functional block diagram of a vehicle 002 provided by an embodiment of the present application.
  • the vehicle 002 may be configured in a fully or partially autonomous driving mode.
  • while in an autonomous driving mode, the vehicle 002 can control itself: it can determine the current state of the vehicle and its surrounding environment, determine the possible behavior of at least one other vehicle in the surrounding environment, determine a confidence level corresponding to the likelihood that the other vehicle performs the possible behavior, and control the vehicle 002 based on the determined information.
  • the vehicle 002 may be set to operate without human interaction.
  • Vehicle 002 may include various subsystems, such as travel system 202 , sensor system 204 , control system 206 , one or more peripherals 208 and power supply 210 , computer system 212 and user interface 216 .
  • the vehicle 002 may include more or fewer subsystems, and each subsystem may include multiple elements. Additionally, each of the subsystems and elements of the vehicle 002 may be interconnected by wire or wirelessly.
  • Travel system 202 may include components that provide powered motion for vehicle 002 .
  • travel system 202 may include engine 218 , energy source 219 , transmission 220 , and wheels 221 .
  • Engine 218 may be an internal combustion engine, an electric motor, an air compression engine, or a combination of engine types, such as a hybrid engine consisting of a gasoline engine and an electric motor, or a hybrid engine consisting of an internal combustion engine and an air compression engine.
  • Engine 218 converts energy source 219 into mechanical energy.
  • Examples of energy sources 219 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
  • the energy source 219 may also provide energy to other systems of the vehicle 002 .
  • Transmission 220 may transmit mechanical power from engine 218 to wheels 221 .
  • Transmission 220 may include a gearbox, a differential, and a driveshaft.
  • transmission 220 may also include other devices, such as clutches.
  • the drive shafts may include one or more axles that may be coupled to one or more wheels 221 .
  • the sensor system 204 may include several sensors that sense information about the environment surrounding the vehicle 002 .
  • the sensor system 204 may include a global positioning system 222 (the global positioning system may be a GPS system, a Beidou system or other positioning systems), an inertial measurement unit (IMU) 224, a radar 226, a laser rangefinder 228 and camera 230.
  • the sensor system 204 may also include sensors that monitor the internal systems of the vehicle 002 (eg, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). This detection and identification is a critical function for the safe operation of the autonomous vehicle 002 .
  • the global positioning system 222 may be used to estimate the geographic location of the vehicle 002 .
  • the IMU 224 is used to sense position and orientation changes of the vehicle 002 based on inertial acceleration.
  • IMU 224 may be a combination of an accelerometer and a gyroscope.
  • IMU 224 may be used to measure the curvature of vehicle 002 .
  • Radar 226 may utilize radio signals to sense objects within the surrounding environment of vehicle 002 . In some embodiments, in addition to sensing objects, radar 226 may be used to sense the speed and/or heading of objects.
  • Laser rangefinder 228 may utilize laser light to sense objects in the environment in which vehicle 002 is located.
  • the laser rangefinder 228 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
  • Camera 230 may be used to capture multiple images of the surrounding environment of vehicle 002 .
  • the camera 230 may be a still camera or a video camera, or a visible light camera or an infrared camera, or any camera used to acquire images, which is not limited in this embodiment of the present application.
  • the camera 230 may be installed on the front side, the rear side, and the left and right sides of the vehicle 002 , and the camera 230 may be a camera that can be rotated to adjust the shooting angle.
  • the camera in the embodiment of the present application can also be installed at any position on the smart vehicle through a telescopic rod. When an image needs to be acquired, the telescopic rod is extended to acquire an image; when no image acquisition is required, the telescopic rod is retracted.
  • the camera 230 may be turned on and off according to the second work instruction received by the first vehicle, and may photograph according to the shooting angle carried in the second work instruction.
  • Control system 206 controls the operation of the vehicle 002 and its components.
  • Control system 206 may include various elements, including steering unit 232, throttle 234, braking unit 236, sensor fusion algorithm unit 238, computer vision system 240, route control system 242, and obstacle avoidance system 244.
  • Steering unit 232 is operable to adjust the heading of vehicle 002 .
  • it may be a steering wheel system.
  • the throttle 234 is used to control the operating speed of the engine 218 and thus the speed of the vehicle 002 .
  • the braking unit 236 is used to control the deceleration of the vehicle 002 .
  • the braking unit 236 may use friction to slow the wheels 221 .
  • the braking unit 236 may convert the kinetic energy of the wheels 221 into electrical current.
  • the braking unit 236 may also take other forms to slow the wheels 221 to control the speed of the vehicle 002 .
  • Computer vision system 240 is operable to process and analyze images captured by camera 230 in order to identify objects and/or features in the environment surrounding vehicle 002 .
  • the objects and/or features may include traffic signals, road boundaries and obstacles.
  • Computer vision system 240 may use object recognition algorithms, Structure from Motion (SFM) algorithms, video tracking, and other computer vision techniques.
  • computer vision system 240 may be used to map the environment, track objects, estimate the speed of objects, and the like.
  • the route control system 242 is used to determine the route of travel of the vehicle 002 .
  • the route control system 242 may combine data from the sensor fusion algorithm unit 238, the GPS 222, and one or more predetermined maps to determine a driving route for the vehicle 002.
  • Obstacle avoidance system 244 is used to identify, evaluate and avoid or otherwise traverse potential obstacles in the environment of vehicle 002 .
  • control system 206 may additionally or alternatively include components other than those shown and described. Alternatively, some of the components shown above may be reduced.
  • Peripherals 208 may include a wireless communication system 246 , an onboard computer 248 , a microphone 250 and/or a speaker 252 .
  • peripherals 208 provide a means for a user of vehicle 002 to interact with user interface 216 .
  • the onboard computer 248 may provide information to the user of the vehicle 002 .
  • User interface 216 may also operate on-board computer 248 to receive user input.
  • the onboard computer 248 can be operated via a touch screen.
  • peripherals 208 may provide a means for vehicle 002 to communicate with other devices located within the vehicle.
  • microphone 250 may receive audio (eg, voice commands or other audio input) from a user of vehicle 002 .
  • speakers 252 may output audio to a user of vehicle 002 .
  • Wireless communication system 246 may wirelessly communicate with one or more devices, either directly or via a communication network, for example using 3G cellular communication such as code division multiple access (CDMA), EV-DO, or global system for mobile communications (GSM)/general packet radio service (GPRS); 4G cellular communication such as long term evolution (LTE); 5G cellular communication; new radio (NR) systems; or future communication systems.
  • the wireless communication system 246 may communicate with a wireless local area network (WLAN) using WiFi.
  • the wireless communication system 246 may communicate directly with the device using an infrared link, Bluetooth, or a wireless personal area network (ZigBee).
  • Other wireless protocols may also be used, such as various vehicle communication systems; for example, wireless communication system 246 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
  • Power supply 210 may provide power to various components of vehicle 002 .
  • the power source 210 may be a rechargeable lithium-ion or lead-acid battery.
  • One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 002 .
  • power source 210 and energy source 219 may be implemented together, such as in some all-electric vehicles.
  • Computer system 212 may include at least one processor 213 that executes instructions 215 stored in a non-transitory computer readable medium such as data storage device 214.
  • Computer system 212 may also be multiple computing devices that control individual components or subsystems of vehicle 002 in a distributed fashion.
  • the processor 213 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor may be a dedicated device such as an ASIC or other hardware-based processor.
  • Although FIG. 3 functionally illustrates the processor, memory, and other elements of the computer 120 in the same block, one of ordinary skill in the art will understand that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical enclosure.
  • the memory may be a hard drive or other storage medium located within an enclosure other than computer 120 .
  • reference to a processor or computer will be understood to include reference to a collection of processors or computers or memories that may or may not operate in parallel.
  • some components such as the steering and deceleration components may each have their own processor that only performs computations related to component-specific functions .
  • a processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle while others are performed by a remote processor, including taking steps necessary to perform a single maneuver.
  • memory 214 may include instructions 215 (eg, program logic) executable by processor 213 to perform various functions of vehicle 002 , including those described above.
  • Memory 214 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or send control commands to one or more of travel system 202 , sensor system 204 , control system 206 , and peripherals 208 .
  • memory 214 may store data such as road maps, route information, vehicle location, direction, speed, and other such vehicle data, among other information.
  • Such information may be used by the vehicle 002 and the computer system 212 during operation of the vehicle 002 in autonomous, semi-autonomous, and/or manual modes.
  • the current speed and current curvature of the vehicle can be fine-tuned according to the road information of the target road segment and the received vehicle speed range and vehicle curvature range, so that the speed and curvature of the intelligent vehicle are within the vehicle speed range and the vehicle curvature range.
  • User interface 216 for providing information to or receiving information from a user of vehicle 002 .
  • user interface 216 may include one or more input/output devices within the set of peripheral devices 208 , such as wireless communication system 246 , onboard computer 248 , microphone 250 and speaker 252 .
  • Computer system 212 may control functions of vehicle 002 based on input received from various subsystems (eg, travel system 202 , sensor system 204 , and control system 206 ) and from user interface 216 .
  • computer system 212 may utilize input from control system 206 in order to control steering unit 232 to avoid obstacles detected by sensor system 204 and obstacle avoidance system 244 .
  • computer system 212 is operable to provide control of various aspects of vehicle 002 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 002 separately.
  • memory 214 may exist partially or completely separate from vehicle 002 .
  • the above-described components may be communicatively coupled together in a wired and/or wireless manner.
  • the above component is just an example.
  • components in each of the above modules may be added or deleted according to actual needs, and FIG. 3 should not be construed as a limitation on the embodiments of the present application.
  • a self-driving car traveling on a road can recognize objects within its surroundings to determine adjustments to its current speed.
  • the objects may be other vehicles, traffic control equipment, or other types of objects.
  • each identified object may be considered independently, and the object's respective characteristics, such as its current speed, acceleration, and distance from the vehicle, may be used to determine the speed to which the autonomous vehicle is to be adjusted.
  • the autonomous vehicle 002 , or a computing device associated with the autonomous vehicle 002 , may predict the behavior of an identified object based on the characteristics of the identified object and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.).
  • the behavior of each identified object depends on the behavior of the others, so all identified objects may also be considered together to predict the behavior of a single identified object.
  • the vehicle 002 can adjust its speed based on the predicted behavior of the identified object. In this process, other factors may also be considered to determine the speed of the vehicle 002, such as the lateral position of the vehicle 002 in the road on which it is traveling, the curvature of the road, the proximity of static and dynamic objects, and the like.
  • the computing device may also provide instructions to modify the steering angle of the vehicle 002 so that the self-driving car follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects in the vicinity of the self-driving car (e.g., cars in adjacent lanes on the road).
  • the above-mentioned vehicle 002 can be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, playground vehicle, construction equipment, tram, golf cart, train, cart, etc., which is not particularly limited in the embodiments of the present application.
  • the smart vehicle function diagram in FIG. 3 is only an exemplary implementation in the embodiments of the present application, and the smart vehicles in the embodiments of the present application include but are not limited to the above structures.
  • FIG. 4 is a flowchart of a vehicle detection method provided by an embodiment of the application.
  • the method can be applied to the vehicle detection system described in FIG. 1 or FIG. 2 , wherein the server 10 can be used to support and execute steps 102-105 and 109 of the method flow shown in FIG. 4 , the vehicle to be detected can be used to support and execute steps 101 and 106 of the method flow shown in FIG. 4 , and the first vehicle can be used to support and execute steps 107-108 of the method flow shown in FIG. 4 .
  • the method may include some or all of the following steps.
  • the vehicle to be detected sends a detection request to a server.
  • the vehicle to be detected may periodically send a detection request to the server to obtain the fault condition of the vehicle, or may send a detection request for the component to be detected to the server when the number of operations of the component to be detected reaches a preset number of times, which is not limited here.
  • the inspection request may be directed to one or more components to be inspected of the vehicle to be inspected.
  • the vehicle to be detected may send a detection request to the server when it captures a signal triggering a component to be detected of itself to start working, where the detection request is used to instruct the server to immediately detect the component to be detected.
  • For example, when the vehicle to be detected turns right and turns on the right turn signal, the vehicle to be detected sends a detection request to the server upon sensing the signal that the right turn signal is turned on, and the detection request is used to instruct the server to perform detection of the right turn signal.
  • It can be understood that this method completes the detection of the component to be detected by using the timing when the vehicle normally operates the component during driving, and can avoid situations where the vehicle to be detected violates traffic rules in order to complete the detection of the component to be detected, for example, turning on the turn signal on a straight road section merely to have it detected.
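The trigger logic described above can be sketched as follows. This is a minimal illustration, not part of the embodiment: the class name, the callback signature, and the operation-count threshold are all assumptions introduced here.

```python
class DetectionTrigger:
    """Sketch: request detection when a component-on signal is sensed,
    or once a component's operation count reaches a preset number."""

    def __init__(self, send_request, op_threshold=1000):
        # send_request: callable(component, immediate) standing in for
        # the vehicle-to-server detection request (an assumption).
        self.send_request = send_request
        self.op_threshold = op_threshold
        self.op_counts = {}

    def on_component_activated(self, component):
        """Called when the vehicle senses a component-on signal,
        e.g. the right turn signal before a right turn."""
        self.op_counts[component] = self.op_counts.get(component, 0) + 1
        # Request immediate detection while the component is naturally
        # in use, so no traffic rule must be bent just to test it.
        self.send_request(component, True)

    def on_periodic_check(self, component):
        # Alternative trigger: the operation count reached the preset number.
        if self.op_counts.get(component, 0) >= self.op_threshold:
            self.send_request(component, False)
            self.op_counts[component] = 0
```

Either trigger path ends in the same detection request; only the `immediate` flag distinguishes opportunistic detection from scheduled detection.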
  • step 101 is an optional step, and the server may determine the vehicle to be detected according to the real-time location information of the vehicle to be detected stored by itself.
  • the server receives a detection request for a component to be detected of the vehicle to be detected.
  • the detection request may be sent by the vehicle to be detected in step 101 to the server.
  • Alternatively, the detection request is generated by the server, and the server can determine the component to be detected according to weather conditions and time; for example, when it is raining, it can determine that the component to be detected is a wiper, or, at night, that the component to be detected is a vehicle lamp. It can be understood that this method can avoid affecting other vehicles.
  • the detection request received by the server may also be sent by another server or input manually, which is not limited here.
  • the server determines the first vehicle according to the position information of the vehicle to be detected.
  • the server may search for a vehicle according to the location information of the vehicle to be detected and a preset condition, and determine a vehicle that satisfies the preset condition as the first vehicle.
  • the preset condition may be that the distance between the first vehicle and the vehicle to be detected is less than the target threshold.
  • For example, if the preset condition is that the distance between the first vehicle and the vehicle to be detected is less than 10 meters, the server may first obtain the position of the vehicle to be detected, and then determine a vehicle whose distance from the vehicle to be detected is less than 10 meters as the first vehicle.
  • the server can acquire the position of the vehicle to be detected in real time. It can be understood that only vehicles around the vehicle to be detected have the conditions for photographing the vehicle to be detected, so the vehicles around the vehicle to be detected can be used as the first vehicle.
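The distance-based selection above can be sketched as follows, assuming vehicles report latitude/longitude and the server keeps a candidate table; the 10-meter threshold follows the example in the text, while the great-circle distance formula and all names are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def find_first_vehicle(target_pos, candidates, threshold_m=10.0):
    """Return the nearest candidate within threshold_m of the vehicle to
    be detected, or None when no vehicle satisfies the preset condition."""
    best, best_d = None, threshold_m
    for vehicle_id, pos in candidates.items():
        d = haversine_m(target_pos[0], target_pos[1], pos[0], pos[1])
        if d < best_d:
            best, best_d = vehicle_id, d
    return best
```

Returning `None` corresponds to the case where the server must either notify the vehicle to be detected or keep searching as positions update.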
  • the location information may be the location indicated on the map according to the latitude and longitude information reported by the vehicle.
  • the server can determine, according to the location and the lane direction information on the map, whether the searched first vehicle is located in front of or behind the vehicle to be detected in its direction of travel.
  • the location information may also include path planning information, etc., which is not limited here.
  • the preset conditions may further include conditions determined according to the component to be detected. For example, when the components to be detected are located on the left and right sides of the vehicle to be detected, and the preset condition further includes that the vehicle to be detected is driving on a road with at least two lanes, the server obtains the current position of the vehicle to be detected according to the current position of the vehicle to be detected. When the vehicle to be detected is driving on a road with at least two lanes, the vehicle whose distance from the vehicle to be detected is less than the target threshold is then searched. It can be understood that when the component to be detected is located on the left and right sides of the vehicle to be detected, if the first vehicle and the vehicle to be detected are front and rear vehicles in the same lane, it is difficult for the first vehicle to photograph the component to be detected.
  • the server may also determine the first vehicle according to other preset conditions, which is not limited here.
  • For the method by which the server determines the first vehicle according to the vehicle to be detected, reference may be made to steps 201 to 204, which will not be repeated here.
  • If the first vehicle is not found, the server can feed back the situation to the vehicle to be detected, or it can continue to obtain the location of the vehicle to be detected and search for the first vehicle; when a vehicle satisfying the preset condition is found, that vehicle is determined to be the first vehicle, and then steps 104 to 108 are executed.
  • the server may receive detection requests from multiple vehicles to be detected at the same time; in this case, the server determines the first vehicle for each of the plurality of vehicles to be detected, then determines a detection order for the plurality of vehicles to be detected according to the number and positions of the vehicles to be detected and their first vehicles, and then detects each of the plurality of vehicles to be detected respectively.
  • the server sends a first work instruction to the vehicle to be detected.
  • the vehicle to be detected receives the first work instruction.
  • the first work instruction is used to instruct the component to be detected to be turned on at least once within the target time period.
  • the first work order may be generated by the server according to the component to be inspected, wherein the first work order may further include the work time and work order of the component to be inspected.
  • the first work instruction can be used to instruct the vehicle to be detected to control the wiper to work for 5 seconds when the first work command is received.
  • For another example, the first work instruction can be used to turn on the right turn signal when the vehicle to be detected receives the first work instruction, turn off the right turn signal and turn on the left turn signal after the right turn signal has worked for 5 seconds, and turn off the left turn signal after the left turn signal has worked for 5 seconds.
  • the server may also adjust the detection time of the to-be-detected component of the to-be-detected vehicle according to weather conditions and the like.
  • the server sends a second work instruction to the first vehicle.
  • the first vehicle receives the second work instruction.
  • the second work instruction is used to instruct the first vehicle to capture the first image and send the first image to the server within the target time period. It can be understood that the first work instruction and the second work instruction are used to enable the first vehicle to photograph the working condition of the component to be inspected to obtain the first image.
  • the second work order may also carry the shooting angle of the camera.
  • the server may determine the shooting angle of the camera according to the position of the vehicle to be detected, the position of the first vehicle, and the component to be detected, wherein the position of the vehicle to be detected and the position of the first vehicle are used to determine the relative position between the vehicle to be detected and the first vehicle.
  • For example, the server can determine, according to the positions of the vehicle to be detected and the first vehicle, that the vehicle to be detected is located behind the first vehicle, and then further determine the shooting angle of the camera according to the specific position of the component to be detected, such as the left turn signal.
  • the positions of the vehicle to be detected, the first vehicle and the component to be detected in the embodiments of the present application may be spatial geographic coordinates, and thus the obtained shooting angle may be a direction in three-dimensional space.
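Since the positions can be spatial coordinates and the resulting shooting angle a direction in three-dimensional space, the computation can be sketched as a yaw/pitch pair from the camera to the component. The coordinate convention (x east, y north, z up) and function names are assumptions for illustration only.

```python
import math

def shooting_angle(camera_pos, component_pos):
    """Yaw and pitch, in degrees, from the first vehicle's camera to the
    component to be detected; both positions are (x, y, z) coordinates in
    a shared frame (assumed convention: x east, y north, z up)."""
    dx = component_pos[0] - camera_pos[0]
    dy = component_pos[1] - camera_pos[1]
    dz = component_pos[2] - camera_pos[2]
    yaw = math.degrees(math.atan2(dx, dy))            # 0 deg points north
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch
```

A component behind and below the camera, for instance a tail lamp seen from a following vehicle's roof camera, yields a negative pitch, matching the "shooting direction at the lower left" example later in the text.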
  • the first work order and the second work order may further include instructions such as controlling the speed of the vehicle, switching lanes, etc., so as to adjust the positions of the vehicle to be detected and the first vehicle.
  • the component to be detected is a brake light
  • the first work instruction is used to instruct the vehicle to be detected to perform braking at least once
  • the second work instruction is also used to instruct the first vehicle to adjust its distance to the vehicle to be detected before and after each braking is performed.
  • a specific cooperation process between the first vehicle and the vehicle to be detected may be as follows: the first vehicle first maintains a first distance from the vehicle to be detected according to the second work instruction, and the vehicle to be detected then steps on the brakes according to the first work instruction.
  • After braking, the distance between the first vehicle and the vehicle to be detected is a second distance, and the first distance is greater than the second distance. It can be understood that the first vehicle controlling its distance to the vehicle to be detected will not affect the normal operation of the component to be detected of the vehicle to be detected, and can also prevent collisions between vehicles that are too close together, as well as the first image being insufficiently clear when taken from too far away.
  • the vehicle to be detected controls the component to be detected to be turned on at least once within the target time period according to the first work instruction.
  • the vehicle to be detected may turn on the component to be detected within the target time period according to the first work instruction.
  • For example, for a vehicle lamp, the vehicle to be detected can turn on the lamp at a first time point of the target time period and turn off the lamp at a second time point according to the first work instruction, where the first time point and the second time point are two time points within the target time period.
  • the vehicle to be detected may control the wiper to be in a working state within the target time period according to the first work instruction for the wiper.
  • the vehicle to be inspected may also control the component to be inspected to work according to a preset rule within a target time period according to the first work instruction. For example, the vehicle to be detected turns on and off the fog lights at preset time intervals within one minute after receiving the first work instruction.
  • the vehicle to be inspected may also control the operation of the part to be inspected within the target time period according to the first work instruction, and start other parts to work, so as to cooperate with the first vehicle to capture the first image.
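The turn-signal example of the first work instruction (right signal on for 5 seconds, then the left) can be expanded into a timestamped action schedule as sketched below; the function and component names are illustrative assumptions, not terms from the embodiment.

```python
def lamp_schedule(start, duration=5.0):
    """Expand the example first work instruction into a list of
    (time, component, action) tuples: the right turn signal works for
    `duration` seconds, then the left turn signal does the same."""
    return [
        (start,                "right_turn_signal", "on"),
        (start + duration,     "right_turn_signal", "off"),
        (start + duration,     "left_turn_signal",  "on"),
        (start + 2 * duration, "left_turn_signal",  "off"),
    ]
```

The vehicle to be detected would execute each action at its scheduled time within the target time period, while the first vehicle records the first image over the same period.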
  • FIG. 5 is a schematic diagram of a vehicle detection method provided by an embodiment of the present application.
  • the components to be detected are the tires on the left and right sides of the vehicle, the body, the left and right rear-view mirror lamps and the windows.
  • The dotted-line vehicle in Figure 5 represents the position of the vehicle at the previous moment, the solid-line vehicle represents the current position of the vehicle, and the dotted arrow represents the driving direction of the vehicle from the previous moment to the current moment.
  • Specifically, as shown in (A) of FIG. 5, the server may send a second work instruction to the first vehicle after the vehicle to be detected switches lanes once, where the second work instruction indicates the lane, so that the first vehicle can switch to that lane to photograph the fault condition of the windows and rear-view mirror lamps of the vehicle to be detected. It can be understood that, by the above method, the components on the side of the vehicle can be detected, which improves the coverage of the detected components.
  • the vehicle to be detected may send a feedback that the execution of the first work instruction is completed to the server after completing the first work instruction.
  • the first vehicle captures the first image within the target time period according to the second work instruction.
  • the first image may be an image or a video. Understandably, a video is an image sequence composed of multiple images.
  • the second work order further includes a shooting angle of the camera.
  • the first vehicle can determine the target camera according to the shooting angle.
  • For example, the target camera is determined to be the front camera, i.e., the vehicle-mounted camera located on the front side of the first vehicle; the angle of the camera is then adjusted to the shooting angle, for example by rotating the front camera so that the shooting direction points to the vehicle lamp at the lower left, and the front camera is then turned on within the target time period to obtain the first image.
  • the first vehicle sends the first image to the server.
  • the server receives the first image.
  • the first vehicle may send the first image to the server after performing step 107, or, when the first image is multiple images, may send each acquired image to the server in real time, which is not limited here.
  • the server identifies the failure condition of the component to be detected from the first image.
  • the server may first identify the component to be inspected in the first image, and then identify the failure condition of the component to be inspected according to the component to be inspected in the first image.
  • the first image is an image.
  • the server may identify the position of the component to be detected from the first image; when the position of the component to be detected in the first image is at the target position, it is determined that the component to be detected is not faulty. For example, if the component to be detected is a car window, the server can first identify the position of the window glass from the first image, and then compare the position of the window glass in the first image with the target position. If the first work instruction is used to instruct the window glass to be in a half-open state, the server determines that the window is not faulty when it detects in the first image that the window glass is at the half position of the window.
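The half-open-window check above reduces to comparing a detected position against a commanded target position. A minimal sketch, in which the position is expressed as a fraction of the window opening and the tolerance value is an assumption introduced here:

```python
def window_position_ok(detected_ratio, target_ratio=0.5, tolerance=0.05):
    """detected_ratio: window glass position recovered from the first
    image, as a fraction of the opening (0 = fully open, 1 = fully closed).
    target_ratio: the position the first work instruction commanded,
    e.g. 0.5 for half-open. The tolerance is an illustrative assumption."""
    return abs(detected_ratio - target_ratio) <= tolerance
```

If the check fails, the server would report the window as the faulty component to be detected.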
  • Alternatively, the server may identify the component to be detected from the first image; when the average gray value of the component to be detected in the first image is within the target range, it is determined that the component to be detected is not faulty. For example, if the component to be detected is a car light, and the first work instruction is used to control the car light to be in the on state, the server determines that the light is not faulty when it recognizes that the average gray value of the car light in the first image is within the gray value range of an image captured while the car light is on.
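The gray-value criterion can be sketched directly: average the pixels of the identified lamp region and test whether the mean falls in the range expected for a lit lamp. The 180-255 range below is an illustrative assumption, not a value from the embodiment.

```python
def lamp_on_ok(region_pixels, lo=180, hi=255):
    """region_pixels: grayscale values (0-255) of the lamp region
    identified in the first image. Returns True when the average gray
    value lies in the range expected while the lamp is on; the lo/hi
    bounds are assumptions for illustration."""
    avg = sum(region_pixels) / len(region_pixels)
    return lo <= avg <= hi
```

A dark region during a commanded on state would fail the check and mark the lamp as faulty.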
  • the first image includes multiple images.
  • the server may identify the position of the component to be detected in each frame of the first image; when the position of the component to be detected in multiple frames of images is inconsistent, it is determined that the component to be detected is not faulty.
  • the first work instruction is used to instruct the wiper to perform one operation within a target time period, and the target time period is the time required for the wiper to perform one operation.
  • the server can identify the wiper and the water-spray flow distribution in each of the multiple frames, and when it recognizes regular changes in the wiper position and the water flow distribution across the frames, it determines that the wipers are not faulty.
  • the server can identify the component to be detected in each frame of the multi-frame image, identify the working state of the component in each frame according to its grayscale value, and determine that the component is not faulty when the change pattern of the identified working states satisfies the preset pattern, where the first work instruction is used to instruct the component to be detected to work with that preset pattern.
  • the server may further determine that the component to be detected is not faulty if its working state differs between two frames of the multi-frame image.
  • the first work order is used to instruct the component to be inspected to be turned on and off once within the target time period.
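The multi-frame checks above can be sketched by classifying the component's working state per frame from its grayscale value and comparing the observed on/off sequence with the preset pattern; the threshold and state labels below are illustrative assumptions:

```python
def frame_states(mean_grays, on_threshold=128):
    """Classify the component as 'on'/'off' in each frame from its
    average grayscale value in that frame."""
    return ["on" if g >= on_threshold else "off" for g in mean_grays]


def matches_preset(states, preset):
    """No fault if the observed state sequence follows the preset pattern
    commanded by the first work instruction."""
    return states == preset


def toggled_at_least_once(states):
    """Weaker check: the working state differs between at least two frames,
    e.g. a lamp instructed to turn on and off once within the target period."""
    return len(set(states)) > 1
```

A lamp commanded on-off-on would yield the sequence `["on", "off", "on"]` over three frames and pass both checks.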
  • the server may send the fault condition of the component to be detected to the vehicle to be detected.
  • the vehicle to be detected may send a query request to the server when the component to be detected is faulty.
  • the server may search the map for information of repair points within a preset distance from the vehicle to be detected, and send the information of the found repair points to the vehicle to be detected.
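The repair-point lookup above reduces to a distance filter over map data. A minimal sketch using great-circle distance; the repair-point structure and the 5 km default radius are illustrative assumptions:

```python
from math import radians, sin, cos, asin, sqrt


def haversine_km(a, b):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (a[0], a[1], b[0], b[1]))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))


def nearby_repair_points(vehicle_pos, repair_points, max_km=5.0):
    """Return repair points within the preset distance of the vehicle,
    nearest first, as the server would send them to the vehicle."""
    hits = [(haversine_km(vehicle_pos, p["pos"]), p) for p in repair_points]
    return [p for d, p in sorted(hits, key=lambda t: t[0]) if d <= max_km]
```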
  • the server may also interact with a roadside fixed camera, acquire an image captured by the roadside camera, and identify the failure condition of the component to be detected according to the image captured by the roadside camera.
  • roadside cameras have a height advantage in capturing the condition of the target vehicle's roof body and roof sunroof.
  • the server may also jointly identify and detect the fault condition of the component to be detected based on the roadside fixed camera and the image of the vehicle to be detected obtained by the first vehicle.
  • the server may send a command to the vehicle to be detected, so that the vehicle to be detected keeps its lane unchanged and drives at a constant speed; at the same time, it sends a command to the first vehicle to accelerate or decelerate it.
  • after approaching the vehicle to be detected to within the target distance, the first vehicle keeps driving at a constant speed and sends feedback to the server once it is doing so.
  • the target distance may be determined according to the component to be detected; the smaller the area of the component to be detected, the smaller the target distance. It can be understood that maintaining a constant speed and the target distance between the first vehicle and the vehicle to be detected improves the clarity of the first image captured by the first vehicle.
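The relation stated above (a smaller component area implies a smaller target distance, so the first image stays sharp enough) can be sketched as a simple lookup; the area bands and distances are illustrative assumptions, not values given in the application:

```python
def target_distance_m(component_area_cm2):
    """Pick the following distance for the first vehicle: small components
    such as a single turn-signal lamp need a closer shot than large ones
    such as a full wiper sweep."""
    if component_area_cm2 < 100:      # e.g. a single lamp
        return 5.0
    if component_area_cm2 < 2500:     # e.g. a window
        return 10.0
    return 20.0                       # e.g. full wiper sweep area
```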
  • the detection request is a request generated by the vehicle to be detected when the component to be detected starts to work.
  • step 101 when the component to be detected is turned on, the vehicle to be detected sends a detection request for the component to be detected to the server.
  • the server may recognize that the component to be detected is in a working state according to the detection request, and further, the server may not execute steps 104 and 106.
  • the server may respond to the detection request with priority, screen out a plurality of first vehicles according to preset conditions, and then complete the detection of the component to be detected through steps 105, 107, 108 and 109.
  • the server simultaneously issues a second instruction to the plurality of first vehicles to capture images of corresponding angles, and the server may determine the fault condition of the components to be detected by combining the vehicle identifiers in the images and the working states of the corresponding components to be detected.
  • the process of screening the first vehicle by the server may omit the following steps 201-204. It can be understood that, since the detection request is sent to the server when the component to be detected is turned on, the server should determine the first vehicle in the shortest possible time and let it photograph the vehicle to be detected, completing the detection while the component is still working, so as to avoid unnecessary impact on traffic. Minimizing the steps for determining the first vehicle shortens the detection process and avoids the situation where the vehicle to be detected has already stopped operating the component before the first vehicle takes its pictures, which would make the detection result erroneous.
  • the server may determine the vehicle to be detected according to the first instruction, and then determine the first vehicle according to the vehicle to be detected.
  • the server can issue the second instruction to the multiple first vehicles at the same time, allocate different ranges of components to detect to different first vehicles in different time periods, and have the multiple first vehicles cooperate to detect the vehicle to be detected simultaneously or in a time-sharing manner.
  • FIG. 6 is a schematic diagram of a plurality of first vehicles according to an embodiment of the present application.
  • first vehicles No. 1, No. 2 and No. 3 are the three first vehicles screened out by the server, located in front of the vehicle to be detected, behind it, and at its left rear in the left lane, respectively.
  • the server can issue the detection command for the wiper and water-spray components to first vehicle No. 1 in front, the detection command for the hazard (double-flash) light component to first vehicle No. 2 behind, and the detection command for the left window component to first vehicle No. 3 on the left, while simultaneously issuing an instruction carrying vehicle speed information.
  • the instruction is used to make all the first vehicles and the vehicle to be detected maintain the same vehicle speed and carry out detection at the same time.
  • first vehicle No. 1 and first vehicle No. 3 can exit the detection, after which the server sends a brake-light detection instruction to first vehicle No. 2.
  • the brake-light detection instruction can carry deceleration information, which is used to widen the distance between first vehicle No. 2 and the vehicle to be detected so that the vehicle to be detected has room to perform a braking action.
  • FIG. 7 is a flowchart of a method for determining a first vehicle according to an embodiment of the present application.
  • the method for the server to determine the first vehicle may include some or all of the following steps:
  • FIG. 8A is a schematic diagram of determining a vehicle in a first area according to an embodiment of the present application.
  • the server may determine vehicles around the vehicle to be detected as vehicles in the first area according to the position of the vehicle to be detected.
  • the vehicle in the first area may be a vehicle whose distance from the vehicle to be detected is less than the target threshold.
  • for the relevant content, please refer to the description of step 103.
  • the server can separately obtain the path plan of each vehicle in the first area, compare each with the path plan of the vehicle to be detected, and screen out the vehicles whose path plans partially overlap that of the vehicle to be detected; a path plan here is the road on which the vehicle plans to travel.
  • because the screened-out vehicle's path plan partially overlaps that of the vehicle to be detected, the first vehicle selected from these vehicles performs detection on the overlapping planned road, which avoids detection failures caused by the two vehicles driving in different directions during detection and avoids making the first vehicle deviate from its original path plan after completing the detection.
  • the server may determine the first vehicle from the screened-out vehicles, where the overlapping path plan of the screened-out vehicle and the vehicle to be detected within the target time period is a straight road.
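The screening described above (vehicles within the target threshold, whose path plans partially overlap that of the vehicle to be detected, and whose overlapping segment within the target time period is a straight road) can be sketched as one pipeline; the vehicle and path-plan structures are minimal illustrative assumptions:

```python
def screen_first_vehicle_candidates(vehicle, others, target_threshold):
    """vehicle/others: dicts with 'pos' (x, y), 'plan' (list of road ids)
    and 'straight_in_window' (whether the overlapping segment driven
    within the target time period is a straight road)."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    # Step 1: vehicles in the first area (distance below the target threshold).
    area = [o for o in others if dist(o["pos"], vehicle["pos"]) < target_threshold]
    # Step 2: keep vehicles whose path plan partially overlaps the target's.
    overlap = [o for o in area if set(o["plan"]) & set(vehicle["plan"])]
    # Step 3: keep those driving a straight road on the overlap in the window.
    return [o for o in overlap if o["straight_in_window"]]
```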
  • when the component to be detected is located on the front side of the vehicle to be detected, the vehicle located in front of the vehicle to be detected is determined as the second vehicle.
  • for example, when the components to be detected are the front wipers, front water-spray equipment, front turn signals, front headlights, front fog lights, hazard lights, daytime running lights and the like, the vehicle in front is identified as the second vehicle.
  • when the component to be detected is located on the rear side of the vehicle to be detected, the vehicle located behind the vehicle to be detected is determined as the second vehicle.
  • for example, when the components to be detected are the rear water-spray equipment, rear wipers, rear turn signals, rear-view-mirror turn lights, rear headlights, rear fog lights, hazard lights, rear-view mirrors and the like, the vehicle located behind the vehicle to be detected is determined as the second vehicle. It can be understood that the vehicle behind the vehicle to be detected can use its vehicle-mounted front camera to capture an image of the rear side of the vehicle to be detected.
  • FIG. 8B is a schematic diagram of determining a second vehicle according to an embodiment of the present application.
  • the components to be detected are located on the rear side of the vehicle to be detected; vehicles No. 1 and No. 2 are located behind it and vehicle No. 3 is in front of it, so vehicles No. 1 and No. 2 are determined as the second vehicles.
  • the server may determine the second vehicle as the first vehicle. It can be understood that the method can reduce the steps of determining the first vehicle, shorten the detection time, and thus improve the detection efficiency.
  • the server may send a shooting instruction to the screened-out vehicles, where the shooting instruction instructs each of them to capture a second image; the server then identifies the identifier of the vehicle to be detected in each second image, and the vehicle whose second image contains the identifier is determined as the first vehicle.
  • FIG. 8C is a schematic diagram of determining a first vehicle from a second vehicle according to an embodiment of the present application.
  • the second vehicles include vehicle No. 1 and vehicle No. 2; the server may acquire the second images captured by vehicles No. 1 and No. 2 and identify the identifier of the vehicle to be detected in them. Since the line of sight between vehicle No. 1 and the vehicle to be detected is blocked by vehicle No. 2, the second image captured by vehicle No. 1 does not contain the identifier of the vehicle to be detected, whereas the second image captured by vehicle No. 2 does, so the server may determine vehicle No. 2 as the first vehicle.
  • FIG. 9 is a flowchart of a method for determining a first vehicle according to a second image according to an embodiment of the present application. Specifically, the following steps may be included:
  • the server sends a shooting instruction to the second vehicle.
  • the shooting instruction may include a camera position and a camera angle.
  • the photographing instruction may be used to instruct the second vehicle to turn on the rear camera of the vehicle.
  • the second vehicle shoots a second image according to the shooting instruction.
  • the second vehicle may be one or more vehicles. Specifically, each vehicle receives a camera instruction, and the vehicle turns on the corresponding camera according to the camera instruction corresponding to the vehicle to obtain a second image, where the second image may be an image or an image sequence composed of multiple images.
  • the second vehicle sends the second image to the server.
  • the server determines the first vehicle from the second vehicles according to the second image and the identifier of the vehicle to be detected.
  • the server may identify the identifier of the vehicle to be detected in the second image, and if the identifier is included in the second image sent by the vehicle, the vehicle is the first vehicle.
  • the identifier can be the license plate number of the vehicle or the serial number of the vehicle.
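Steps 201-204 above reduce to collecting one second image per candidate second vehicle, recognizing the vehicle identifiers visible in each image, and keeping a candidate whose image contains the identifier of the vehicle to be detected (occluded candidates drop out). A minimal sketch, with identifier recognition passed in as a stub rather than implemented:

```python
def pick_first_vehicle(candidates, target_id, recognize_ids):
    """candidates: list of (vehicle_id, second_image). recognize_ids:
    function mapping an image to the set of vehicle identifiers (e.g.
    license plate numbers) visible in it. Returns the first candidate
    with an unobstructed view, or None if the vehicle to be detected
    is occluded in every second image."""
    for vehicle_id, image in candidates:
        if target_id in recognize_ids(image):
            return vehicle_id
    return None
```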
  • when no second image contains the identifier, the server can send, to a vehicle in the adjacent lane, a command to switch to the same lane as the vehicle to be detected; alternatively, the vehicle detection can be stopped.
  • Determining the first vehicle through the second image and the identifier of the vehicle to be detected can ensure that the vehicle to be detected can be photographed by the first vehicle.
  • FIG. 10 is a schematic diagram of a hardware structure of a server in an embodiment of the present application.
  • the server 10 shown in FIG. 10 (the server 10 may specifically be a computer device) includes a memory 101 , a processor 102 , a communication interface 103 and a bus 104 .
  • the memory 101 , the processor 102 , and the communication interface 103 are connected to each other through the bus 104 for communication.
  • the memory 101 may be a read-only memory (Read Only Memory, ROM), a static storage device, a dynamic storage device, or a random access memory (Random Access Memory, RAM).
  • the memory 101 may store a program, and when the program stored in the memory 101 is executed by the processor 102, the processor 102 and the communication interface 103 are used to perform various steps of vehicle detection in the embodiments of the present application.
  • the processor 102 may be a general-purpose central processing unit (Central Processing Unit, CPU), a microprocessor, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a graphics processing unit (graphics processing unit, GPU), or one or more integrated circuits, configured to execute the relevant programs so as to implement the vehicle detection method of the method embodiments of the present application.
  • the processor 102 may also be an integrated circuit chip with signal processing capability. In the implementation process, each step of the vehicle detection method of the present application can be completed by an integrated logic circuit of hardware in the processor 102 or instructions in the form of software.
  • the above-mentioned processor 102 can also be a general-purpose processor, a digital signal processor (Digital Signal Processing, DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the steps in combination with the methods provided in the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory 101, and the processor 102 reads the information in the memory 101, and implements the vehicle fault detection method of the embodiment of the present application in combination with its hardware.
  • the communication interface 103 implements communication between the server 10 and other devices or a communication network using a transceiver device such as, but not limited to, a transceiver.
  • data (such as the first image in the embodiment of the present application) may be acquired through the communication interface 103 .
  • Bus 104 may include a pathway for communicating information between various components of server 10 (eg, memory 101, processor 102, communication interface 103).
  • all or part of the functions may be implemented by software, hardware, or a combination of software and hardware.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium.
  • the computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device such as a server, a data center, or the like that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state disks (SSDs)), and the like.
  • FIG. 11 is a schematic structural diagram of a vehicle detection device provided by an embodiment of the present application.
  • the vehicle detection device 300 is applied to a server, and the device includes:
  • a determining unit 301 configured to determine a first vehicle according to the position information of the vehicle to be detected, where the distance between the first vehicle and the vehicle to be detected is less than a target threshold;
  • a sending unit 302 configured to send a first work instruction to the vehicle to be detected, where the first work instruction is used to instruct the component to be detected to be turned on at least once within a target time period;
  • the sending unit 302 is configured to send a second work instruction to the first vehicle, where the second work instruction is used to instruct the first vehicle to take a first image and send the first image within the target time period. sent to said server;
  • the identifying unit 303 is configured to identify the fault condition of the component to be detected from the first image.
  • the vehicle detection device may further include a receiving unit 304, configured to receive a detection request for a to-be-detected component of the to-be-detected vehicle, where the detection request is used to instruct the server to send the first job to the to-be-detected vehicle instructing and sending a second work order to the first vehicle.
  • the determining unit 301 is further configured to:
  • the first vehicle is determined from the filtered vehicles.
  • the determining unit 301 is further configured to:
  • the first vehicle is determined from the screened-out vehicles according to the second images respectively captured by the screened-out vehicles and the identifier of the vehicle to be detected, and the second image captured by the first vehicle includes the identifier.
  • the component to be inspected when the component to be inspected is located on the front side of the vehicle to be inspected, the first vehicle is located in front of the vehicle to be inspected.
  • the component to be inspected when the component to be inspected is located at the rear side of the vehicle to be inspected, the first vehicle is located behind the vehicle to be inspected.
  • the determining unit 301 is configured to:
  • the shooting angle of the camera is determined, and the second work instruction carries the shooting angle.
  • the identifying unit 303 is configured to:
  • the identifying unit 303 is configured to:
  • the first image includes multiple frames of images
  • the identifying unit 303 is configured to:
  • the first image includes multiple frames of images
  • the identifying unit 303 is configured to:
  • when the change pattern of the working state of the component to be detected identified from the multi-frame images satisfies a preset pattern, it is determined that the component to be detected has no fault, and the first work instruction is used to instruct the component to be detected to work with the preset pattern.
  • details of the receiving unit 304, determining unit 301, sending unit 302 and identifying unit 303 can be obtained directly by referring to the relevant descriptions in the method embodiment shown in FIG. 4, and are not repeated here.
  • FIG. 12 is a schematic structural diagram of a vehicle detection device provided by an embodiment of the present application.
  • the vehicle detection device 400 is applied to the vehicle to be detected, and the device includes:
  • the receiving unit 401 is configured to receive the first work instruction sent by the server; the server is further configured to send a second work instruction to the first vehicle, where the distance between the first vehicle and the vehicle to be detected is less than the target threshold, the second work instruction is used to instruct the first vehicle to capture a first image within a target time period and send the first image to the server, and the first image is used to identify the fault condition of a component to be detected of the vehicle to be detected;
  • the control unit 402 is configured to control the component to be detected to be turned on at least once within the target time period according to the first work instruction.
  • the vehicle detection apparatus may further include a sending unit 403, configured to send a detection request for the to-be-detected component of the vehicle to be detected to the server, where the detection request is used to instruct the server to send the first detection request to the vehicle to be detected A work order and sending a second work order to the first vehicle.
  • control unit 402 is configured to:
  • the component to be detected is controlled to work with a preset regularity within the target time period.
  • control unit 402 is configured to:
  • the component to be detected is controlled to be in a working state within the target time period.
  • the component to be detected is a vehicle lamp
  • the control unit 402 is configured to:
  • the vehicle light is turned off at a second time point, and the first time point and the second time point are two time points within the target time period.
  • FIG. 13 is a schematic structural diagram of a vehicle detection device provided by an embodiment of the present application.
  • the vehicle detection device 500 is applied to the vehicle to be detected, and the device includes:
  • the receiving unit 501 is configured to receive a second work instruction sent by a server; the server is further configured to send a first work instruction to the vehicle to be detected, where the first work instruction is used to instruct a component to be detected of the vehicle to be detected to turn on at least once within a target time period, and the distance between the first vehicle and the vehicle to be detected is less than a target threshold;
  • a photographing unit 502 configured to photograph a first image within the target time period according to the second work instruction
  • the sending unit 503 is configured to send the first image to the server, where the first image is used to identify the failure condition of the component to be detected.
  • the second work instruction is generated when the server responds to a detection request for the component to be detected sent by the vehicle to be detected, and the detection request is further used to indicate The server sends the first work instruction to the vehicle to be detected.
  • the receiving unit 501 is configured to receive a photographing instruction sent by the server; the photographing unit 502 is configured to photograph a second image according to the photographing instruction, where the second image is used to determine the first vehicle; the sending unit 503 is configured to send the second image to the server, where the second image includes the identifier of the vehicle to be detected.
  • the second work instruction carries the shooting angle
  • the shooting unit 502 is configured to:
  • the camera is controlled to shoot the first image at the shooting angle, and the shooting angle is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected.
  • the second work instruction carries a lane
  • the apparatus further includes a control unit 504, and the control unit 504 is configured to:
  • the lane is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected.
  • An embodiment of the present application provides an electronic device, including: one or more processors and a memory; the memory is coupled to the one or more processors and is used to store computer program code, the computer program code includes computer instructions, and the one or more processors are used to invoke the computer instructions to cause the electronic device to perform the method performed by the server in FIGS. 4-9.
  • an embodiment of the present application further provides an electronic device, which may include a processor and a communication interface; the processor obtains program instructions through the communication interface, and when the program instructions are executed by the processor, the method performed by the vehicle to be detected as shown in FIG. 4 to FIG. 9 is performed.
  • Embodiments of the present application further provide an electronic device, and the electronic device may include a processing circuit, and the processing circuit is configured to perform the method performed by the first vehicle as shown in FIGS. 4 to 9 .
  • An embodiment of the present application provides a chip, which is applied to an electronic device, where the chip includes one or more processors, and the processor is configured to invoke computer instructions to cause the electronic device to execute the methods described in FIGS. 4 to 9 .
  • the chip system may further include a memory for storing necessary program instructions and data of the vehicle.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • the chip system may include an application specific integrated circuit (ASIC), an off-the-shelf programmable gate array (field programmable gate array, FPGA), or other programmable logic devices.
  • the chip system may further include interface circuits and the like.
  • the memory is located within the processor; or the memory is located outside the processor.
  • An embodiment of the present application provides a computer program product containing instructions; when the computer program product runs on an electronic device, it causes the electronic device to execute the methods described in FIG. 4 to FIG. 9.
  • An embodiment of the present application provides a computer-readable storage medium including instructions; when the instructions run on an electronic device, they cause the electronic device to execute the methods described in FIG. 4 to FIG. 9.
  • all or part of the functions may be implemented by software, hardware, or a combination of software and hardware.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state disks (SSDs)), and the like.
  • the process can be completed by instructing the relevant hardware by a computer program, and the program can be stored in a computer-readable storage medium.
  • when the program is executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or another medium that can store program code.


Abstract

A vehicle detection method and a vehicle detection apparatus. In the vehicle detection method: 101, the vehicle to be detected (20) sends a detection request to the server (10); 102, the server (10) receives the detection request for a component to be detected of the vehicle to be detected (20); 103, in response to the detection request, the server (10) determines a first vehicle (30) according to position information of the vehicle to be detected (20); 104, the server (10) sends a first work instruction to the vehicle to be detected (20); 105, the server (10) sends a second work instruction to the first vehicle (30); 106, according to the first work instruction, the vehicle to be detected (20) controls the component to be detected to turn on at least once within a target time period; 107, according to the second work instruction, the first vehicle (30) captures a first image within the target time period; 108, the first vehicle (30) sends the first image to the server (10); 109, the server (10) identifies the fault condition of the component to be detected from the first image. In this way, the server can determine the fault condition of the component to be detected by controlling the vehicle to be detected and the first vehicle to cooperate with each other, which improves detection accuracy.

Description

A vehicle detection method and a vehicle detection apparatus
This application claims priority to Chinese patent application No. 202110015232.4, entitled "A vehicle detection method and vehicle detection apparatus", filed with the China National Intellectual Property Administration on January 6, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present solution relates to the field of computer technology, and in particular to a vehicle detection method and a vehicle detection apparatus.
Background
With the development of autonomous driving technology, vehicle fault detection has become an important part of the autonomous driving field. When an autonomous vehicle develops a fault, the fault is often difficult to discover because there is no driver, which poses a serious safety hazard.
At present, existing vehicle detection methods can photograph the vehicle to be detected with a camera and then judge the fault condition of the vehicle from the captured images. Such methods cannot accurately determine the fault condition of a specific component to be detected of the vehicle.
Summary
Embodiments of the present application provide a vehicle detection method and a vehicle detection apparatus, where the vehicle detection system includes a server, a vehicle to be detected and a first vehicle. In the method, in response to a detection request for a component to be detected of the vehicle to be detected, the server may determine the first vehicle according to the position of the vehicle to be detected, and then send a first work instruction to the vehicle to be detected and a second work instruction to the first vehicle, where the first work instruction is used to instruct the component to be detected to turn on at least once within a target time period, and the second work instruction is used to instruct the first vehicle to capture a first image within the target time period and send the first image to the server; finally, the server identifies the fault condition of the component to be detected from the first image. By implementing this technical solution, the server can determine the fault condition of the component to be detected by controlling the vehicle to be detected and the first vehicle to cooperate with each other, which improves detection accuracy.
In a first aspect, an embodiment of the present application provides a vehicle detection method, applied to a server, the method including:
determining a first vehicle according to position information of the vehicle to be detected, where the distance between the first vehicle and the vehicle to be detected is less than a target threshold;
sending a first work instruction to the vehicle to be detected, where the first work instruction is used to instruct a component to be detected of the vehicle to be detected to turn on at least once within a target time period;
sending a second work instruction to the first vehicle, where the second work instruction is used to instruct the first vehicle to capture a first image within the target time period and send the first image to the server;
identifying the fault condition of the component to be detected from the first image.
结合第一方面,在一种可能的实现方式中,服务器可以接收针对待检测车辆的待检测部件的检测请求,检测请求用于指示服务器向待检测车辆发送第一工作指令和向第一车辆发送第二工作指令。
结合第一方面，在一种可能的实现方式中，服务器可以同时获取多个待检测车辆的检测请求，根据待检测车辆周围的第一车辆的数量和位置，确定对上述多个待检测车辆的检测顺序。
结合第一方面,在一种可能的实现方式中,服务器获取至少一个车辆中每一个车辆的路径规划和该待检测车辆的路径规划,该至少一个车辆与该待检测车辆的距离均小于目标阈值;从该至少一个车辆中筛选出路径规划与该待检测车辆的路径规划部分重合的车辆;从筛选出的车辆中确定该第一车辆。
结合第一方面,在一种可能的实现方式中,筛选出的车辆和待检测车辆所重合的路径规划在目标时间段内的路径规划为直道。其中,直道为所述直道的车道线与其他方向的车道线不交叉的车道。
在一些实施例中,服务器可以筛选出和待检测车辆所重合的路径规划在目标时间段内的路径规划为直道的车辆作为第一车辆,可以避免在车辆检测方法执行结束后影响上述第一车辆的原始路径规划。
在另一些实施例中,服务器可以根据待检测部件,筛选出和待检测车辆所重合的路径规划在目标时间段内的路径规划为弯道的车辆作为第一车辆。例如,在待检测部件为右转向灯时,服务器可以筛选出和待检测车辆所重合的路径规划在目标时间段内的路径规划为前方右拐的弯道的车辆作为第一车辆。可以理解的,该方法可以避免车辆在直行路段开启左右转向灯等待检测部件,防止影响其他车辆正常行驶。
结合第一方面,在一种可能的实现方式中,服务器向筛选出的车辆发送拍摄指令,该拍摄指令用于指示该筛选出的车辆分别拍摄第二图像;根据该筛选出的车辆分别拍摄的第二图像和该待检测车辆的标识,从该筛选出的车辆中确定第一车辆,第一车辆拍摄的第二图像包含该标识。
结合第一方面，在一种可能的实现方式中，在待检测部件位于待检测车辆的前侧时，第一车辆位于待检测车辆的前方。
结合第一方面，在一种可能的实现方式中，在待检测部件位于待检测车辆的后侧时，第一车辆位于待检测车辆的后方。
结合第一方面,在一种可能的实现方式中,该方法还包括:
根据待检测车辆的位置、第一车辆的位置和待检测部件,确定相机的拍摄角度,第二工作指令携带该拍摄角度。
结合第一方面,在一种可能的实现方式中,服务器从第一图像中识别待检测部件;在待检测部件在第一图像中的平均灰度值在目标范围内时,确定待检测部件不存在故障。
结合第一方面,在一种可能的实现方式中,服务器从第一图像中识别待检测部件的位置;在待检测部件在该第一图像中的位置位于目标位置时,确定待检测部件不存在故障。
结合第一方面,在一种可能的实现方式中,第一图像包括多帧图像,服务器在该第一图像的每帧图像中分别识别待检测部件的位置;当待检测部件在上述多帧图像中的位置不一致时,确定该待检测部件不存在故障。
结合第一方面，在一种可能的实现方式中，第一图像包括多帧图像，服务器识别上述多帧图像的每帧图像中的待检测部件；根据上述每帧图像中待检测部件的灰度值识别待检测部件的工作状态；若根据多帧图像识别到的待检测部件的工作状态的变化规律满足预设规律，确定待检测部件不存在故障，该第一工作指令用于指示待检测部件以预设规律工作。
第二方面,本申请实施例提供了一种车辆检测方法,其特征在于,应用于待检测车辆,该方法包括:
接收服务器发送的第一工作指令,服务器还用于向第一车辆发送第二工作指令,第一车辆与待检测车辆的距离小于目标阈值,第二工作指令用于指示第一车辆在目标时间段内拍摄第一图像和将第一图像发送至服务器,第一图像用于识别待检测车辆的待检测部件的故障情况;
根据第一工作指令,在目标时间段内控制待检测部件开启至少一次。
结合第二方面,在一种可能的实现方式中,待检测车辆可以向服务器发送针对待检测车辆的待检测部件的检测请求,检测请求用于指示服务器向待检测车辆发送第一工作指令和向第一车辆发送第二工作指令。
结合第二方面,在一种可能的实现方式中,服务器根据第一工作指令,控制待检测部件在该目标时间段内以预设规律工作。
结合第二方面,在一种可能的实现方式中,待检测车辆控制待检测部件在该目标时间段内处于工作状态。
结合第二方面,在一种可能的实现方式中,待检测车辆在第一时间点开启待检测部件;在第二时间点关闭该待检测部件,该第一时间点和该第二时间点为该目标时间段内的两个时间点。
第三方面,本申请实施例提供了一种车辆检测方法,其特征在于,应用于第一车辆,该方法包括:
接收服务器发送的第二工作指令,服务器还用于向待检测车辆发送第一工作指令,第一工作指令用于指示待检测车辆的待检测部件在目标时间段内开启至少一次,第一车辆与待检测车辆的距离小于目标阈值;
根据第二工作指令,在目标时间段内拍摄第一图像;
将第一图像发送至服务器,第一图像用于识别待检测部件的故障情况。
结合第三方面,在一种可能的实现方式中,第二工作指令是在服务器响应于待检测车辆发送的针对待检测部件的检测请求时生成的,检测请求还用于指示服务器向待检测车辆发送第一工作指令。
结合第三方面,在一种可能的实现方式中,在该接收服务器发送的第二工作指令之前,第一车辆接收该服务器发送的拍摄指令;根据该拍摄指令,拍摄第二图像,该第二图像用于确定该第一车辆;将该第二图像发送至该服务器,该第二图像包括该待检测车辆的标识。
结合第三方面,在一种可能的实现方式中,第二工作指令携带拍摄角度,第一车辆可以控制相机以该拍摄角度拍摄该第一图像,该拍摄角度是该服务器根据该待检测车辆的位置、该第一车辆的位置和该待检测部件确定的。
结合第三方面,在一种可能的实现方式中,第二工作指令携带车道,第一车辆可以切换至上述车道,该车道是服务器根据待检测车辆的位置、第一车辆的位置和待检测部件确定的。
结合第三方面，在一种可能的实现方式中，第二工作指令携带速度信息。例如，待检测车辆切换至第一车辆的右车道，若第一车辆完成对待检测车辆左侧的待检测部件拍摄后，发现本车道前方为其他车辆时，第一车辆可以减速以留出空间，使得待检测车辆可以切换至本车道并进一步由本车道切换到左车道。
在一种可能的实现方式中,服务器可以根据待检测车辆确定多个第一车辆,以完成对上述待检测车辆的检测。
具体的,服务器可以根据第一指令确定待检测车辆,再根据待检测车辆确定第一车辆。在筛选出有多个第一车辆在目标阈值范围内且都能识别到待检测车辆标识时,服务器可以同时给多个第一车辆下发第二指令,在不同时段分别给不同的第一车辆分配不同的检测部件范围,多个第一车辆配合同时或分时开展对待检测车辆的检测。
例如，服务器可以在待检测车辆同车道的前方、后方和左边车道的左后方筛选出三辆第一车辆，可以给前方的第一车辆下发雨刷、喷水部件的检测，给后方的第一车辆下发双跳灯部件的检测，给左后方的第一车辆下发左侧车窗部件的检测，同时下发携带车速信息的指令，所述指令用于使所有第一车辆和待检测车辆维持相同车速，同一时刻开展检测。在检测完成时，前方和左后方的第一车辆可以退出检测，进而，服务器给后方的第一车辆下发刹车灯的检测指令，该刹车灯的检测指令可以携带减速信息，该减速信息用于拉开第一车辆与待检测车辆的距离以使待检测车辆具备执行刹车动作的空间。
第四方面,本申请实施例提供了一种车辆检测装置,其特征在于,应用于服务器,该装置包括:
确定单元,用于根据待检测车辆的位置信息,确定第一车辆,第一车辆与待检测车辆的距离小于目标阈值;
发送单元,用于向待检测车辆发送第一工作指令,第一工作指令用于指示待检测车辆的待检测部件在目标时间段内开启至少一次;
发送单元,用于向第一车辆发送第二工作指令,第二工作指令用于指示第一车辆在目标时间段内拍摄第一图像和将第一图像发送至服务器;
识别单元,用于从第一图像中识别待检测部件的故障情况。
可选地,该车辆检测装置还可以包括接收单元,用于接收针对待检测车辆的待检测部件的检测请求,该检测请求用于指示服务器向待检测车辆发送第一工作指令和向第一车辆发送第二工作指令。
可选地,该车辆检测装置还可以包括存储单元,用于存储数据或计算机指令。
另外,该方面中,车辆检测装置其他可选的实施方式可参见上述第一方面的相关内容,此处不再详述。
作为示例,接收单元或发送单元可以为收发器或接口,存储单元可以为存储器,确定单元或识别单元可以为处理器。
第五方面,本申请实施例提供了一种车辆检测装置,其特征在于,应用于待检测车辆,该装置包括:
接收单元,用于接收服务器发送的第一工作指令,服务器还用于向第一车辆发送第二工作指令,第一车辆与待检测车辆的距离小于目标阈值,第二工作指令用于指示第一车辆在目标时间段内拍摄第一图像和将第一图像发送至服务器,第一图像用于识别待检测车辆的待检测部件的故障情况;
控制单元,用于根据第一工作指令,在目标时间段内控制待检测部件开启至少一次。
可选地,该车辆检测装置还可以包括发送单元,用于向服务器发送针对待检测车辆的待检测部件的检测请求,检测请求用于指示服务器向待检测车辆发送第一工作指令和向第一车辆发送第二工作指令。
可选地,该车辆检测装置还可以包括存储单元,用于存储数据或计算机指令。
另外,该方面中,车辆检测装置其他可选的实施方式可参见上述第二方面的相关内容,此处不再详述。
作为示例,接收单元或发送单元可以为收发器或接口,存储单元可以为存储器,控制单元可以为处理器。
第六方面,本申请实施例提供了一种车辆检测装置,其特征在于,应用于第一车辆,该装置包括:
接收单元,用于接收服务器发送的第二工作指令,所述服务器还用于向所述待检测车辆发送第一工作指令,所述第一工作指令用于指示所述待检测车辆的待检测部件在目标时间段内开启至少一次;所述第一车辆与所述待检测车辆的距离小于目标阈值;
拍摄单元,用于根据所述第二工作指令,在所述目标时间段内拍摄第一图像;
发送单元,用于将所述第一图像发送至所述服务器,所述第一图像用于识别所述待检测部件的故障情况。
可选地,该车辆检测装置还可以包括存储单元,用于存储数据或计算机指令。
另外,该方面中,车辆检测装置其他可选的实施方式可参见上述第三方面的相关内容,此处不再详述。
作为示例,接收单元或发送单元可以为收发器或接口,存储单元可以为存储器,拍摄单元可以为处理器。
第七方面,本申请实施例提供一种电子设备,其特征在于,电子设备包括:一个或多个处理器、存储器;存储器与一个或多个处理器耦合,存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,一个或多个处理器用于调用计算机指令以使得电子设备执行如第一方面、第二方面或第三方面中任一方面所述的方法。
第八方面,本申请实施例提供一种电子设备,可以包括处理器和通信接口,处理器通过通信接口获取程序指令,当程序指令被处理器执行时实现第一方面、第二方面或第三方面中任一方面所述的方法。
第九方面,本申请实施例提供一种电子设备,该电子设备可以包括处理电路,处理电路配置为执行如第一方面、第二方面或第三方面中任一方面所述的方法。
第十方面,本申请实施例提供了一种芯片,该芯片应用于电子设备,该芯片包括一个或多个处理器,该处理器用于调用计算机指令以使得该电子设备执行如第一方面、第二方面或第三方面以及上述任一方面所述的方法。在执行上述第一方面或第二方面或第三方面所述的方法的过程中,上述方法中有关发送上述指令和接收上述指令的过程,可以理解为由处理器输出上述指令的过程,以及处理器接收输入的上述指令的过程。在输出上述指令时,处理器将该上述指令输出给收发器,以便由收发器进行发射。该上述指令在由处理器输出之后,还可能需要进行其他的处理,然后才到达收发器。类似的,处理器接收输入的上述指令时,收发器接收该上述指令,并将其输入处理器。更进一步的,在收发器收到该上述指令之后,该上述指令可能需要进行其他的处理,然后才输入处理器。
对于处理器所涉及的发射、发送和接收等操作,如果没有特殊说明,或者,如果未与其在相关描述中的实际作用或者内在逻辑相抵触,则均可以更加一般性的理解为处理器输出和接收、输入等操作,而不是直接由射频电路和天线所进行的发射、发送和接收操作。
在实现过程中，上述处理器可以是专门用于执行这些方法的处理器，也可以是执行存储器中的计算机指令来执行这些方法的处理器，例如通用处理器。上述存储器可以为非瞬时性（non-transitory）存储器，例如只读存储器（Read Only Memory，ROM），其可以与处理器集成在同一块芯片上，也可以分别设置在不同的芯片上，本申请实施例对存储器的类型以及存储器与处理器的设置方式不做限定。
可选地，结合上述第十方面，在第一种可能的实施方式中，芯片系统还可以包括存储器，存储器用于保存车辆必要的程序指令和数据。该芯片系统，可以由芯片构成，也可以包含芯片和其他分立器件。其中，芯片系统可以包括专用集成电路（application specific integrated circuit，ASIC）、现成可编程门阵列（field programmable gate array，FPGA）或者其他可编程逻辑器件等。进一步，芯片系统还可以包括接口电路等。
可选地,结合上述第十方面,存储器位于处理器之内;或存储器位于处理器之外。
第十一方面,本申请实施例还提供一种处理器,用于执行上述第一方面或第二方面或第三方面所述的方法。
第十二方面,本申请实施例提供一种包含指令的计算机程序产品,当上述计算机程序产品在电子设备上运行时,使得上述电子设备执行如第一方面、第二方面、或第三方面以及上述任一方面所述的方法。
第十三方面,本申请实施例提供一种计算机可读存储介质,包括指令,当上述指令在电子设备上运行时,使得上述电子设备执行如第一方面、第二方面、或第三方面以及上述任一方面所述的方法。
附图说明
下面对本申请实施例用到的附图进行介绍。
图1为本申请实施例提供的一种车辆检测系统架构示意图；
图2为本申请实施例提供的另一种车辆检测系统架构示意图；
图3为本申请实施例提供的一种车辆002的功能框图;
图4为本申请实施例提供的一种车辆检测方法的流程图;
图5为本申请实施例提供的一种车辆检测方法的示意图;
图6为本申请实施例提供的一种多个第一车辆的示意图;
图7为本申请实施例提供的一种确定第一车辆的方法流程图;
图8A为本申请实施例提供的一种确定第一区域内的车辆的示意图;
图8B为本申请实施例提供的一种确定第二车辆的示意图;
图8C为本申请实施例提供的一种从第二车辆中确定第一车辆的示意图;
图9为本申请实施例提供的一种根据第二图像确定第一车辆的方法流程图;
图10为本申请实施例提供的一种服务器的硬件结构图;
图11为本申请实施例提供的一种车辆检测装置的结构示意图;
图12为本申请实施例提供的一种车辆检测装置的结构示意图;
图13为本申请实施例提供的一种车辆检测装置的结构示意图。
具体实施方式
本申请以下实施例中所使用的术语只是为了描述特定实施例的目的，而并非旨在作为对本申请实施例的限制。如在本申请实施例的说明书和所附权利要求书中所使用的那样，单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括复数表达形式，除非其上下文中明确地有相反指示。还应当理解，本申请实施例中使用的术语“和/或”是指并包含一个或多个所列出项目的任何或所有可能组合。
为了更好地理解本申请实施例提供的一种车辆检测方法和车辆检测装置，下面先对本申请实施例使用的系统架构进行描述。
请参见图1，图1为本申请实施例提供的一种车辆检测系统架构示意图。如图1所示，该系统架构包括服务器10、待检测车辆20和第一车辆30，其中，第一车辆30可以包括多个车辆。其中：
首先,待检测车辆20可以向服务器10发送针对待检测车辆20的待检测部件的检测请求,相应的,服务器10响应于该检测请求,根据待检测车辆20的位置确定第一车辆30。例如,服务器10可以将与待检测车辆20的距离小于目标阈值的车辆确定为第一车辆30。
在确定第一车辆30后,服务器10可以向待检测车辆20发送第一工作指令和向第一车辆30发送第二工作指令。待检测车辆20根据第一工作指令使待检测部位在目标时间段内开启至少一次;第一车辆30根据第二工作指令拍摄第一图像,再将第一图像发送至服务器10。最后,服务器10可以从第一图像中识别待检测部件的故障情况。
在一些实施例中,服务器10还可以向与待检测车辆20的距离小于目标阈值的车辆发送拍摄指令,该拍摄指令用于指示接收到该拍摄指令的车辆拍摄第二图像和将第二图像发送至服务器10。进而,服务器10可以将拍摄的第二图像中包含待检测车辆20的标识的车辆确定为第一车辆30。
其中，本申请实施例中的待检测车辆20和第一车辆30可以为车辆002，如图3所示。车辆002是通过车载传感系统感知道路环境，自动规划行车路线并控制车辆到达预定目标的汽车。智能汽车集中运用了计算机、现代传感、信息融合、通讯、人工智能及自动控制等技术，是一个集环境感知、规划决策、多等级辅助驾驶等功能于一体的高新技术综合体。其中，本申请中的智能车辆可以是拥有以计算机系统为主的智能驾驶仪的智能车辆，其中，该智能驾驶仪用于使车辆实现无人驾驶，也可以是拥有辅助驾驶系统或者全自动驾驶系统的智能车辆，还可以是轮式移动机器人等。
其中,本申请实施例中的服务器10可以用独立的服务器或者是多个服务器组成的服务器集群来实现。
请参见图2，图2为本申请实施例提供的另一种车辆检测系统架构示意图。如图2所示，车辆检测系统架构包括服务器、待检测车辆、第一车辆和路边摄像机，其中，服务器还可以与路边摄像机进行交互。具体的，在待检测车辆和第一车辆经过该路边摄像机所在的路段时，服务器可以向路边摄像机发送指令，以使路边摄像机开启拍摄和将拍摄的图像上传至服务器，最后，服务器可以根据路边摄像机拍摄的图像和第一图像来识别待检测车辆的待检测部件的故障情况。
可以理解的是，图1或图2中的车辆检测系统架构只是本申请实施例中的一种示例性的实施方式，本申请实施例中的车辆检测系统架构包括但不仅限于以上车辆检测系统架构。
基于上述车辆检测系统架构，本申请实施例提供了一种应用于上述车辆检测系统架构中的车辆002，请参见图3，图3为本申请实施例提供的一种车辆002的功能框图。
在一个实施例中，可以将车辆002配置为完全或部分地自动驾驶模式。例如，车辆002可以在处于自动驾驶模式中的同时控制自身，并且可通过人为操作来确定车辆及其周边环境的当前状态，确定周边环境中的至少一个其他车辆的可能行为，并确定该其他车辆执行可能行为的可能性相对应的置信水平，基于所确定的信息来控制车辆002。在车辆002处于自动驾驶模式中时，可以将车辆002置为在没有和人交互的情况下操作。
车辆002可包括各种子系统，例如行进系统202、传感器系统204、控制系统206、一个或多个外围设备208以及电源210、计算机系统212和用户接口216。可选地，车辆002可包括更多或更少的子系统，并且每个子系统可包括多个元件。另外，车辆002的每个子系统和元件可以通过有线或者无线互连。
行进系统202可包括为车辆002提供动力运动的组件。在一个实施例中，行进系统202可包括引擎218、能量源219、传动装置220和车轮221。引擎218可以是内燃引擎、电动机、空气压缩引擎或其他类型的引擎组合，例如汽油发动机和电动机组成的混动引擎，内燃引擎和空气压缩引擎组成的混动引擎。引擎218将能量源219转换成机械能量。
能量源219的示例包括汽油、柴油、其他基于石油的燃料、丙烷、其他基于压缩气体的燃料、乙醇、太阳能电池板、电池和其他电力来源。能量源219也可以为车辆002的其他系统提供能量。
传动装置220可以将来自引擎218的机械动力传送到车轮221。传动装置220可包括变速箱、差速器和驱动轴。在一个实施例中,传动装置220还可以包括其他器件,比如离合器。其中,驱动轴可包括可耦合到一个或多个车轮221的一个或多个轴。
传感器系统204可包括感测关于车辆002周边的环境的信息的若干个传感器。例如，传感器系统204可包括全球定位系统222（全球定位系统可以是GPS系统，也可以是北斗系统或者其他定位系统）、惯性测量单元（inertial measurement unit，IMU）224、雷达226、激光测距仪228以及相机230。传感器系统204还可包括被监视车辆002的内部系统的传感器（例如，车内空气质量监测器、燃油量表、机油温度表等）。来自这些传感器中的一个或多个的传感器数据可用于检测对象及其相应特性（位置、形状、方向、速度等）。这种检测和识别是自主车辆002的安全操作的关键功能。
全球定位系统222可用于估计车辆002的地理位置。IMU 224用于基于惯性加速度来感测车辆002的位置和朝向变化。在一个实施例中，IMU 224可以是加速度计和陀螺仪的组合。例如：IMU 224可以用于测量车辆002的曲率。
雷达226可利用无线电信号来感测车辆002的周边环境内的物体。在一些实施例中,除了感测物体以外,雷达226还可用于感测物体的速度和/或前进方向。
激光测距仪228可利用激光来感测车辆002所位于的环境中的物体。在一些实施例中,激光测距仪228可包括一个或多个激光源、激光扫描器以及一个或多个检测器,以及其他***组件。
相机230可用于捕捉车辆002的周边环境的多个图像。相机230可以是静态相机或视频相机,也可以是可见光相机或红外相机,可以是任一用来获取图像的相机,本申请实施例对此不作限定。
本申请实施例中，相机230可以安装在车辆002的前侧、后侧以及左右两侧，相机230可以是通过旋转以调节拍摄角度的相机。另外，本申请实施例中的相机也可以通过伸缩杆安装在智能车辆上的任何位置，当需要获取图像时，伸缩杆伸展，以获取图像；当不需要获取图像时，伸缩杆收缩。本申请实施例中，相机230可以在第一车辆接收的第二工作指令的指示下开启和关闭，并按照第二工作指令中携带的拍摄角度进行拍摄。
控制系统206为控制车辆002及其组件的操作。控制系统206可包括各种元件，其中包括转向单元232、油门234、制动单元236、传感器融合算法单元238、计算机视觉系统240、路线控制系统242以及障碍物避免系统244。
转向单元232可操作来调整车辆002的前进方向。例如在一个实施例中可以为方向盘***。
油门234用于控制引擎218的操作速度并进而控制车辆002的速度。
制动单元236用于控制车辆002减速。制动单元236可使用摩擦力来减慢车轮221。在其他实施例中,制动单元236可将车轮221的动能转换为电流。制动单元236也可采取其他形式来减慢车轮221转速从而控制车辆002的速度。
计算机视觉系统240可以操作来处理和分析由相机230捕捉的图像以便识别车辆002周边环境中的物体和/或特征。所述物体和/或特征可包括交通信号、道路边界和障碍物。计算机视觉系统240可使用物体识别算法、运动中恢复结构（Structure from Motion，SFM）算法、视频跟踪和其他计算机视觉技术。在一些实施例中，计算机视觉系统240可以用于为环境绘制地图、跟踪物体、估计物体的速度等等。
路线控制系统242用于确定车辆002的行驶路线。在一些实施例中，路线控制系统242可结合来自传感器融合算法单元238、GPS 222和一个或多个预定地图的数据以为车辆002确定行驶路线。
障碍物避免系统244用于识别、评估和避免或者以其他方式越过车辆002的环境中的潜在障碍物。
当然，在一个实例中，控制系统206可以增加或替换地包括除了所示出和描述的那些以外的组件。或者也可以减少一部分上述示出的组件。
车辆002通过外围设备208与外部传感器、其他车辆、其他计算机系统或用户之间进行交互。外围设备208可包括无线通信系统246、车载电脑248、麦克风250和/或扬声器252。
在一些实施例中，外围设备208提供车辆002的用户与用户接口216交互的手段。例如，车载电脑248可向车辆002的用户提供信息。用户接口216还可操作车载电脑248来接收用户的输入。车载电脑248可以通过触摸屏进行操作。在其他情况中，外围设备208可提供用于车辆002与位于车内的其它设备通信的手段。例如，麦克风250可从车辆002的用户接收音频（例如，语音命令或其他音频输入）。类似地，扬声器252可向车辆002的用户输出音频。
无线通信系统246可以直接地或者经由通信网络来与一个或多个设备无线通信。例如码分多址（code division multiple access，CDMA）、增强型多媒体盘片系统（Enhanced Versatile Disk，EVD）、全球移动通信系统（global system for mobile communications，GSM）/通用分组无线服务技术（general packet radio service，GPRS），或者4G蜂窝通信，例如长期演进（long term evolution，LTE），或者5G蜂窝通信，或者新无线（new radio，NR）系统，或者未来通信系统等。无线通信系统246可利用WiFi与无线局域网（wireless local area network，WLAN）通信。在一些实施例中，无线通信系统246可利用红外链路、蓝牙或无线个域网（ZigBee）与设备直接通信。其他无线协议，例如：各种车辆通信系统，例如，无线通信系统246可包括一个或多个专用短程通信（dedicated short range communications，DSRC）设备，这些设备可包括车辆和/或路边台站之间的公共和/或私有数据通信。
电源210可向车辆002的各种组件提供电力。在一个实施例中,电源210可以为可再充电锂离子或铅酸电池。这种电池的一个或多个电池组可被配置为电源为车辆002的各种组件提供电力。在一些实施例中,电源210和能量源219可一起实现,例如一些全电动车中那样。
车辆002的部分或所有功能受计算机系统212控制。计算机系统212可包括至少一个处理器213，处理器213执行存储在例如数据存储装置214这样的非暂态计算机可读介质中的指令215。计算机系统212还可以是采用分布式方式控制车辆002的个体组件或子系统的多个计算设备。
处理器213可以是任何常规的处理器,诸如商业可获得的CPU。替选地,该处理器可以是诸如ASIC或其它基于硬件的处理器的专用设备。尽管图3功能性地图示了处理器、存储器、和在相同块中的计算机120的其它元件,但是本领域的普通技术人员应该理解该处理器、计算机、或存储器实际上可以包括可以或者可以不存储在相同的物理外壳内的多个处理器、计算机、或存储器。例如,存储器可以是硬盘驱动器或位于不同于计算机120的外壳内的其它存储介质。因此,对处理器或计算机的引用将被理解为包括对可以或者可以不并行操作的处理器或计算机或存储器的集合的引用。不同于使用单一的处理器来执行此处所描述的步骤,诸如转向组件和减速组件的一些组件每个都可以具有其自己的处理器,所述处理器只执行与特定于组件的功能相关的计算。
在此处所描述的各个方面中,处理器可以位于远离该车辆并且与该车辆进行无线通信。在其它方面中,此处所描述的过程中的一些在布置于车辆内的处理器上执行而其它则由远程处理器执行,包括采取执行单一操纵的必要步骤。
在一些实施例中，存储器214可包含指令215（例如，程序逻辑），指令215可被处理器213执行来执行车辆002的各种功能，包括以上描述的那些功能。数据存储装置214也可包含额外的指令，包括向行进系统202、传感器系统204、控制系统206和外围设备208中的一个或多个发送数据、从其接收数据、与其交互和/或对其进行控制的指令。
除了指令215以外,存储器214还可存储数据,例如道路地图、路线信息,车辆的位置、方向、速度以及其它这样的车辆数据,以及其他信息。这种信息可在车辆002在自主、半自主和/或手动模式中操作期间被车辆002和计算机***212使用。例如:可以根据目标路段的道路信息,和接收的车辆速度范围和车辆曲率范围内对车辆的当前速度和当前曲率进行微调,以使智能车辆的速度和曲率在车辆速度范围和车辆曲率范围内。
用户接口216，用于向车辆002的用户提供信息或从其接收信息。可选地，用户接口216可包括在外围设备208的集合内的一个或多个输入/输出设备，例如无线通信系统246、车载电脑248、麦克风250和扬声器252。
计算机系统212可基于从各种子系统（例如，行进系统202、传感器系统204和控制系统206）以及从用户接口216接收的输入来控制车辆002的功能。例如，计算机系统212可利用来自控制系统206的输入以便控制转向单元232来避免由传感器系统204和障碍物避免系统244检测到的障碍物。在一些实施例中，计算机系统212可操作来对车辆002及其子系统的许多方面提供控制。
可选地,上述这些组件中的一个或多个可与车辆002分开安装或关联。例如,存储器214可以部分或完全地与车辆002分开存在。上述组件可以按有线和/或无线方式来通信地耦合在一起。
可选地,上述组件只是一个示例,实际应用中,上述各个模块中的组件有可能根据实际需要增添或者删除,图3不应理解为对本申请实施例的限制。
在道路行进的自动驾驶汽车,如上面的车辆002,可以识别其周围环境内的物体以确定对当前速度的调整。所述物体可以是其它车辆、交通控制设备、或者其它类型的物体。在一些示例中,可以独立地考虑每个识别的物体,并且基于物体的各自的特性,诸如它的当前速度、加速度、与车辆的间距等,可以用来确定自动驾驶汽车所要调整的速度。
可选地，自动驾驶汽车车辆002或者与自动驾驶车辆002相关联的计算设备（如图3的计算机系统212、计算机视觉系统240、存储器214）可以基于所识别的物体的特性和周围环境的状态（例如，交通、雨、道路上的冰、等等）来预测所述识别的物体的行为。可选地，每一个所识别的物体都依赖于彼此的行为，因此还可以将所识别的所有物体全部一起考虑来预测单个识别的物体的行为。车辆002能够基于预测的所述识别的物体的行为来调整它的速度。在这个过程中，也可以考虑其它因素来确定车辆002的速度，诸如，车辆002在行驶的道路中的横向位置、道路的曲率、静态和动态物体的接近度等等。
除了提供调整自动驾驶汽车的速度的指令之外,计算设备还可以提供修改车辆002的转向角的指令,以使得自动驾驶汽车遵循给定的轨迹和/或维持与自动驾驶汽车附近的物体(例如,道路上的相邻车道中的轿车)的安全横向和纵向距离。
上述车辆002可以为轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车、和手推车等,本申请实施例不做特别的限定。
可以理解的是,图3中的智能车辆功能图只是本申请实施例中的一种示例性的实施方式,本申请实施例中的智能车辆包括但不仅限于以上结构。
请参考图4，图4为本申请实施例提供的一种车辆检测方法的流程图，该方法可应用于上述图1或图2所述的车辆检测系统中，其中的服务器10可以用于支持并执行图4中所示的方法流程步骤102-步骤105以及步骤109，待检测车辆可以用于支持并执行图4中所示的方法流程步骤101以及步骤106，第一车辆可以用于支持并执行图4中所示的方法流程步骤107-步骤108。该方法可以包括以下部分或全部步骤。
101、待检测车辆向服务器发送检测请求。
可选的,待检测车辆可以定时向服务器发送检测请求以获取车辆的故障情况,也可以在待检测部件工作次数达到预设次数时向服务器发送针对该待检测部件的检测请求,此处不作限定。其中,该检测请求可以针对待检测车辆的一个或多个待检测部件。
在一些实施例中,待检测车辆可在捕捉到自身某待检测部件启动工作的信号触发的时候,向服务器发送检测请求,该检测请求用于指示服务器立即对该待检测部件进行检测。
例如,待检测车辆在右转开启右转向灯时,待检测车辆在感应到右转向灯开启的信号触发,向服务器发送检测请求,该检测请求用于指示服务器在接收到检测请求的时刻执行对右转向灯的检测。可以理解的,该方法可以在车辆的行驶过程中,利用车辆正常开启部件的时机完成对待检测部件的检测,可以避免待检测车辆为完成对待检测部件的检测做出不符合交通规则行为的情况,例如,待检测车辆为完成对转向灯的检测而在直行路段开启转向灯的情况。
可以理解的,步骤101为可选的步骤,服务器可以根据自身存储的待检测车辆实时位置信息来确定待检测车辆。
102、服务器接收针对待检测车辆的待检测部件的检测请求。
其中,检测请求可以是步骤101中待检测车辆向服务器发送的。
在一种实现中,该检测请求是由服务器生成的,服务器可以根据天气情况和时刻确定待检测部件,例如雨天时可以确定待检测部件为雨刷,又例如夜晚开启车灯等。可以理解的,该方法可以避免对其他车辆造成影响。
在另一种实现中，服务器接收到的检测请求也可以是由其他服务器发送或人为输入的，此处不做限定。
103、服务器响应于检测请求,根据待检测车辆的位置信息,确定第一车辆。
可选的,服务器在接收到检测请求后,可以根据待检测车辆的位置信息和预设条件查找车辆,将满足该预设条件的车辆确定为第一车辆。其中,预设条件可以为第一车辆与待检测车辆的距离小于目标阈值。例如,预设条件可以为第一车辆与待检测车辆的距离小于10米,则服务器可以先获取待检测车辆的位置,进而,将与待检测车辆的距离小于10米的车辆确定为第一车辆。其中,服务器可以实时获取待检测车辆的位置。可以理解的,在待检测车辆周围的车辆才具备拍摄到待检测车辆的条件,因而可以将待检测车辆周围的车辆作为第一车辆。
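作为示意，步骤103中“将与待检测车辆的距离小于目标阈值的车辆确定为第一车辆”的筛选过程可以用如下Python草图表示。其中的函数名、数据结构（以经纬度表示位置、以字典保存车辆位置）和阈值均为说明用的假设，并非本申请方案规定的实现：

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # 计算两个经纬度坐标之间的球面距离，单位为米
    r = 6371000.0  # 地球平均半径（米）
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def find_candidates(target_pos, vehicles, threshold_m):
    # vehicles: {车辆ID: (纬度, 经度)}；返回与待检测车辆距离小于目标阈值的车辆ID列表
    lat0, lon0 = target_pos
    return [vid for vid, (lat, lon) in vehicles.items()
            if haversine_m(lat0, lon0, lat, lon) < threshold_m]
```

实际系统中，车辆位置可来自待检测车辆与周边车辆实时上报的定位信息（对应步骤103中“服务器可以实时获取待检测车辆的位置”）。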
其中,位置信息可以为根据车辆上报的经纬度信息在地图上指示的位置。服务器可以根据该位置,结合地图上的车道方向信息,确定查找的第一车辆是位于车辆前进方向指示的前方或后方。位置信息还可以包括路径规划信息等,此处不作限定。
在一种实现中,预设条件还可以包括根据待检测部件确定的条件。例如,在待检测部件位于待检测车辆的左右两侧时,预设条件还包括待检测车辆行驶在至少有两条行车道的道路上,则服务器根据待检测车辆当前的位置获取待检测车辆当前的车道情况,在待检测车辆行驶在至少有两条行车道的道路上时,再查找与待检测车辆的距离小于目标阈值的车辆。可以理解的,在待检测部件位于待检测车辆的左右两侧时,若第一车辆和待检测车辆为同一车道的前后车辆,第一车辆能够拍摄到待检测部件的难度大。
在一些实施例中,服务器还可以根据其他预设条件确定第一车辆,此处不作限定。服务器根据待检测车辆确定第一车辆的方法具体可参见步骤201~204,此处不再赘述。
需要说明的是,服务器在未查询到满足预设条件的车辆时,服务器可以向待检测车辆反馈该情况,也可以继续获取待检测车辆的位置和查找第一车辆,在出现满足预设条件的车辆时,确定该车辆为第一车辆,进而,执行步骤104~108。
在一些实施例中,服务器可以同时接受来自多个待检测车辆的检测请求,进而,服务器分别确定上述多个待检测车辆的第一车辆,再根据上述多个待检测车辆的第一车辆的数量和位置,确定对上述多个待检测车辆的检测顺序,进而,分别对上述多个待检测车辆的每一个车辆进行检测。
104、服务器向待检测车辆发送第一工作指令。对应地,待检测车辆接收第一工作指令。
其中,第一工作指令用于指示待检测部件在目标时间段内开启至少一次。
在一些实施例中,第一工作指令可以是服务器根据待检测部件生成的,其中,第一工作指令还可以包括待检测部件的工作时间和工作顺序。例如,待检测部件为雨刷时,该第一工作指令可以用于指示待检测车辆在接收到第一工作指令时控制雨刷工作5秒,又例如,待检测部件为转向灯时,该第一工作指令可以用于待检测车辆在接收到第一工作指令时开启右转向灯,在右转向灯工作5秒后,关闭右转向灯并开启左转向灯,在左转向灯工作5秒后关闭左转向灯。可选地,服务器还可以根据天气状况等调整对待检测车辆的待检测部件的检测时间。
105、服务器向第一车辆发送第二工作指令。对应地,第一车辆接收第二工作指令。
其中,第二工作指令用于指示第一车辆在该目标时间段内拍摄第一图像和将第一图像发送至服务器。可以理解的,第一工作指令和第二工作指令用于使第一车辆能够拍摄到待检测部件的工作情况,得到第一图像。
在一些实施例中，第二工作指令还可以携带相机的拍摄角度。可选的，服务器可以根据待检测车辆的位置、第一车辆的位置和待检测部件，确定相机的拍摄角度，其中，待检测车辆的位置和第一车辆的位置用于确定待检测车辆与第一车辆之间的相对位置。例如，服务器可以根据待检测车辆和第一车辆的位置，得到待检测车辆位于第一车辆的后方，再根据待检测部件为左转向灯以及左转向灯的具体位置，进一步确定相机的拍摄角度。可以理解的，本申请实施例中的待检测车辆、第一车辆以及待检测部件的位置可以为其空间地理坐标，因而得到的拍摄角度可以为三维空间中的一个方向。
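作为示意，上述“三维空间中的一个方向”可以由第一车辆相机位置与待检测部件位置之差计算得到，并以偏航角、俯仰角表示。以下Python草图中的坐标系约定与函数名均为本文之外的假设：

```python
import math

def shooting_angle(camera_pos, part_pos):
    # camera_pos / part_pos: 空间坐标 (x, y, z)，单位为米
    # 返回 (偏航角, 俯仰角)，单位为度；偏航角以 x 轴正方向为 0、逆时针为正
    dx = part_pos[0] - camera_pos[0]
    dy = part_pos[1] - camera_pos[1]
    dz = part_pos[2] - camera_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch
```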
在另一些实施例中,第一工作指令和第二工作指令还可以包括控制车辆速度、切换车道等指令,以使调整待检测车辆和第一车辆的位置。例如,待检测部件为刹车灯时,第一工作指令用于指示待检测车辆执行至少一次刹车,第二工作指令还用于指示第一车辆在待检测车辆每次执行刹车的前后调整与待检测车辆的距离。其中,第一车辆与待检测车辆的一次具体的配合过程可以是,第一车辆先根据第二工作指令与待检测车辆保持第一距离,待检测车辆再根据第一工作指令踩刹车,在维持2秒后松开,此时第一车辆与待检测车辆的距离为第二距离,第一距离大于第二距离。可以理解的,通过第一车辆来控制与待检测车辆之间的距离,不会影响待检测车辆的待检测部件的正常工作,还可以防止车辆之间距离过近产生碰撞以及距离过远拍摄的第一图像清晰度不够。
106、待检测车辆根据第一工作指令,在目标时间段内控制待检测部件开启至少一次。
具体的,待检测车辆可以根据第一工作指令,在目标时间段内开启待检测部件。例如,待检测车辆可以根据针对车灯的第一工作指令,在目标时间段的第一时间点开启该车灯;在第二时间点关闭该车灯,该第一时间点和该第二时间点为该目标时间段内的两个时间点,又例如,待检测车辆可以根据针对雨刷的第一工作指令,控制该雨刷在目标时间段内处于工作状态。
在一些实施例中,待检测车辆还可以根据第一工作指令,在目标时间段内控制待检测部件以预设规律进行工作。例如,待检测车辆在接收到第一工作指令后的一分钟内按照预设时间间隔开启和关闭雾灯。
在另一些实施例中,待检测车辆还可以根据第一工作指令,在目标时间段内控制待检测部件工作的同时,启动其他部件工作,以配合第一车辆拍摄第一图像。请参见图5,图5为本申请实施例提供的一种车辆检测方法的示意图。其中,待检测部件为车辆左右两侧的轮胎、车身、左右后视镜灯和车窗,图5中虚线车辆表示车辆前一时刻的位置,实线车辆表示车辆当前位置,虚线箭头表示车辆从前一时刻行驶至当前时刻的行驶方向。具体的,如图5中(A)所示,待检测车辆可以根据第一工作指令切换到左车道后,在左车道维持10秒的直行并在10秒期间降下所有车窗至最低位置后将其上升至最高位置;进而,如图5中的(B)所示,待检测车辆再根据第一工作指令切换回原车道后切换至右车道,在右车道维持10秒的直行并在10秒期间降下所有车窗至最低位置后将其上升至最高位置,期间第一车辆如图5所示保持直行并对待检测车辆进行拍摄。在待检测车辆所在车道只有左车道或右车道时,则服务器可以在待检测车辆切换车道一次后,向第一车辆发送第二工作指令,该第二工作指令包括车道,以使第一车辆进行切换至上述车道,以拍摄待检测车辆车窗和后视镜灯的故障情况。可以理解的,通过上述方法可以检测到车辆侧面的部件,提高了检测的部件的覆盖度。
可选地,待检测车辆可以在完成第一工作指令后,向服务器发送第一工作指令执行完毕的反馈。
107、第一车辆根据第二工作指令,在目标时间段内拍摄第一图像。
其中,第一图像可以为一张图像,也可以为一段视频。可以理解的,视频为多张图像组成的图像序列。
在一些实施例中,第二工作指令还包括相机的拍摄角度。具体的,第一车辆可以根据该拍摄角度确定目标相机,例如,确定目标相机为前相机,该前相机为位于第一车辆的前侧的车载相机,进而调整相机的角度为该拍摄角度,例如使前相机旋转至拍摄方向为左下方的车灯,再在目标时间内开启前相机,获取第一图像。
108、第一车辆向服务器发送第一图像。对应地,服务器接收第一图像。
可选的,第一车辆可以在执行步骤107后将第一图像发送至服务器,也可以在第一图像为多张图像时,将获取的第一图像中的每一张图像实时发送至服务器,此处不作限定。
109、服务器从第一图像中识别待检测部件的故障情况。
具体的,服务器可以先在第一图像中识别待检测部件,再根据第一图像中的待检测部件识别待检测部件的故障情况。
在一些实施例中,第一图像为一张图像。
在一种实现中,服务器可以从第一图像中识别待检测部件的位置;在待检测部件在第一图像中的位置位于目标位置时,确定待检测部件不存在故障。例如,待检测部件为车窗,则服务器可以先从第一图像中识别车窗的位置,再将第一图像中车窗的位置与目标位置进行对比,若第一工作指令用于指示车窗玻璃处于半开启的状态,服务器在第一图像中检测到车窗玻璃位于车窗一半位置时,确定车窗不存在故障。
在另一种实现中,服务器还从第一图像中识别待检测部件;在待检测部件在第一图像中的平均灰度值在目标范围内时,确定待检测部件不存在故障。例如,待检测部件为车灯,第一工作指令用于控制车灯处于开启状态,则服务器在识别到车灯在第一图像中的平均灰度值在车灯开启时在图像中的灰度值范围内时,确定该车灯不存在故障。
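上述“平均灰度值在目标范围内”的判断可以用如下Python草图示意。其中待检测部件所在区域的提取方式与目标范围均为假设，实际的部件识别通常由图像识别算法完成：

```python
def mean_gray(region):
    # region: 待检测部件所在区域的二维灰度值列表（0~255）
    pixels = [v for row in region for v in row]
    return sum(pixels) / len(pixels)

def lamp_ok(region, target_range):
    # 平均灰度值落在目标范围内时，判定车灯等部件不存在故障
    lo, hi = target_range
    return lo <= mean_gray(region) <= hi
```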
在另一些实施例中,第一图像包括多张图像。
在一种实现中,服务器可以在第一图像的每帧图像中分别识别待检测部件的位置;当待检测部件在多帧图像中的位置不一致时,确定待检测部件不存在故障。例如,第一工作指令用于指示雨刷在目标时间段内进行一次工作,该目标时间段为雨刷进行一次工作所需的时间,则服务器识别雨刷在多帧图像中的位置不一致时,确定雨刷不存在故障,或者,服务器可以在所述多帧图像中分别识别雨刷和喷水的水流分布情况,在识别到多帧图像中每一帧图像的雨刷位置和水流分布情况存在规律性变化时,确定雨刷不存在故障。
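上述“待检测部件在多帧图像中的位置不一致”的判断可以简化为如下Python草图，其中位置以各帧中部件的中心坐标表示，容差参数tol为假设值：

```python
def positions_vary(positions, tol=1.0):
    # positions: 每帧图像中待检测部件的中心坐标 [(x, y), ...]
    # 多帧位置不完全一致（偏移超过容差 tol 个像素）时返回 True，
    # 可用于判定雨刷等运动部件不存在故障
    x0, y0 = positions[0]
    return any(abs(x - x0) > tol or abs(y - y0) > tol for x, y in positions[1:])
```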
在另一种实现中,服务器可以识别多帧图像的每帧图像中的待检测部件;根据每帧图像的中待检测部件的灰度值识别待检测部件的工作状态;若根据多帧图像识别到的待检测部件的工作状态的变化规律满足预设规律时,确定待检测部件不存在故障,第一工作指令用于指示待检测部件以预设规律工作。
在又一种实现中，若待检测部件的工作状态在多帧图像中存在两帧图像不同，服务器还可以确定待检测部件不存在故障。例如第一工作指令用于指示待检测部件在目标时间段内开启并关闭一次。
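上述按帧识别工作状态、再与第一工作指令指示的预设规律比对的过程，可以用如下Python草图示意。其中的灰度开启阈值为假设值：

```python
def states_from_gray(gray_per_frame, on_threshold=128):
    # 根据每帧图像中待检测部件的平均灰度值，判断其每帧的开/关状态
    return [g >= on_threshold for g in gray_per_frame]

def matches_pattern(states, expected):
    # 识别出的开关序列与第一工作指令指示的预设规律一致时，判定不存在故障
    return states == expected
```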
可选地,在服务器在得到待检测部件的故障情况后,可以向该待检测车辆发送该待检测部件的故障情况。待检测车辆可以在待检测部件存在故障时,向服务器发送查询请求。服务器根据该查询请求,可以在地图中查找距离该待检测车辆的预设距离内的修车点信息,向该待检测车辆发送查找到的修车点信息。
可选地，服务器还可以与路边固定的摄像机交互，获取路边摄像机拍摄的图像，根据路边摄像机拍摄的图像识别待检测部件的故障情况。可以理解的是，路边摄像机有高度优势，可拍摄到待检测车辆车顶车身和车顶天窗的状况。
可选地,服务器还可以基于路边固定的摄像机和第一车辆获取的待检测车辆的图像联合识别检测待检测部件的故障情况。
可选地,在执行步骤103后且执行步骤104之前,服务器可以向待检测车辆发送命令,以使待检测车辆保持车道不变并且匀速行驶;同时给第一车辆发送命令,使其加速或减速靠近待检测车辆至目标距离后保持匀速行驶和在匀速行驶后向服务器发送反馈。其中,目标距离可以是根据待检测部件决定的,在待检测部件面积较小时目标距离较小。可以理解的,第一车辆与待检测车辆保持匀速和目标距离,可以提高第一车辆拍摄的第一图像的清晰度。
以下介绍该检测请求为待检测车辆在待检测部件启动工作时生成的请求的一个实施例。
可选的,在步骤101中,待检测车辆在待检测部件开启时,向服务器发送针对该待检测部件的检测请求。在步骤102和步骤103中,服务器可以根据该检测请求识别到待检测部件处于工作状态,进而,服务器可以不执行步骤104和106。
服务器可以优先响应该检测请求,根据优先预设条件筛选出多个第一车辆,进而,通过步骤105、107、108和步骤109完成对该待检测部件的检测。可选的,服务器同时给上述多个第一车辆下发第二指令拍摄对应角度的图像,服务器可结合图像中的车辆标识和对应的待检测部件的工作状态,确定待检测部件的故障情况。
其中,服务器筛选第一车辆的过程可以不包括以下步骤201~204。可以理解的,由于该检测请求是在待检测部件开启工作时发送至服务器的,服务器在最短时间内确定第一车辆并让第一车辆完成对待检测车辆的拍摄,可以在待检测车辆正常开启待检测部件的时间内完成对待检测部件的检测而避免对交通造成不必要的影响。其中,服务器尽量减少确定第一车辆的步骤,可以缩短检测过程的时间,避免在第一车辆对待检测车辆进行拍摄前,该待检测车辆已经停止该待检测部件的工作状态,导致检测结果出现误差。
结合图6,以下介绍多个第一车辆对待检测车辆进行检测的一个实施例。
具体的,服务器可以根据第一指令确定待检测车辆,再根据待检测车辆确定第一车辆。在筛选出有多个第一车辆在目标阈值范围内且都能识别到待检测车辆标识时,服务器可以同时给多个第一车辆下发第二指令,在不同时段分别给不同的第一车辆分配不同的检测部件范围,多个第一车辆配合同时或分时开展对待检测车辆的检测。
请参见图6，图6为本申请实施例提供的一种多个第一车辆的示意图。如图6所示，一号第一车辆、二号第一车辆、三号第一车辆为服务器在待检测车辆同车道的前方、后方和左边车道的左后方筛选出的三辆第一车辆。具体的，服务器可以给前方的一号第一车辆下发雨刷、喷水部件的检测指令，给后方的二号第一车辆下发双跳灯部件的检测指令，给左后方的三号第一车辆下发左侧车窗部件的检测指令，同时下发携带车速信息的指令，所述指令用于使所有第一车辆和待检测车辆维持相同车速，同一时刻开展检测。在检测完成时，一号第一车辆和三号第一车辆可以退出检测，进而，服务器给二号第一车辆下发刹车灯的检测指令，该刹车灯的检测指令可以携带减速信息，该减速信息用于拉开二号第一车辆与待检测车辆的距离以使待检测车辆具备执行刹车动作的空间。
请参见图7,图7为本申请实施例提供的一种确定第一车辆的方法流程图。如图7所示,服务器确定第一车辆的方法可以包括以下部分或全部步骤:
201、根据待检测车辆的位置,确定第一区域内的车辆。
请参见图8A,图8A为本申请实施例提供的一种确定第一区域内的车辆的示意图。如图8A所示,服务器可以根据待检测车辆的位置,将待检测车辆周围的车辆确定为第一区域内的车辆。其中,第一区域内的车辆可以为与待检测车辆的距离小于目标阈值的车辆,具体可参见步骤103的相关内容。
202、在第一区域的车辆中筛选出路径规划与待检测车辆的路径规划部分重合的车辆。
具体的,服务器可以分别获取第一区域内每一车辆的路径规划,进而,分别将第一区域内每一车辆的路径规划与待检测车辆的路径规划进行对比,筛选出与待检测车辆的路径规划部分重合的车辆。其中,路径规划为车辆规划行驶的道路。
可以理解的,筛选出的车辆与待检测车辆的路径规划部分重合,因此筛选出的车辆在重合的规划道路上对第一车辆进行检测,可以避免检测过程中两车的行驶方向不同而导致检测失败,以及避免完成检测过程后第一车辆脱离原始路径规划。
在一些实施例中,服务器可以从筛选出的车辆中确定第一车辆,其中,筛选出的车辆和待检测车辆所重合的路径规划在该目标时间段内的路径规划为直道。
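作为示意，步骤202中“筛选出路径规划与待检测车辆的路径规划部分重合的车辆”可以用如下Python草图表示，其中将路径规划简化为按顺序的路段ID列表（该表示方式为本文之外的假设）：

```python
def filter_by_route(target_route, candidate_routes):
    # target_route: 待检测车辆的路径规划（按顺序的路段ID列表）
    # candidate_routes: {车辆ID: 路径规划}
    # 返回路径规划与待检测车辆部分重合（至少共有一个路段）的车辆ID列表
    target_segments = set(target_route)
    return [vid for vid, route in candidate_routes.items()
            if target_segments & set(route)]
```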
203、根据待检测部件在待检测车辆的位置,从筛选出的车辆中确定第二车辆。
在一种实现中，在待检测部件位于待检测车辆的前侧时，将位于待检测车辆前方的车辆确定为第二车辆。例如，待检测部件为前雨刷、前喷水设备、前转向灯、前夜大灯、前雾灯、双跳灯以及日间行车灯等，服务器可以将筛选出的车辆中位于待检测车辆前方的车辆确定为第二车辆。
在另一种实现中，在待检测部件位于待检测车辆的后侧时，将位于待检测车辆后方的车辆确定为第二车辆。例如，待检测部件为后喷水设备、后雨刷、后转向灯、后视镜转向灯、后夜大灯、后雾灯、双跳灯及后视镜等，服务器可以将筛选出的车辆中位于待检测车辆后方的车辆确定为第二车辆。可以理解的，待检测车辆后方的车辆可以使用车载前相机采集到待检测车辆后侧的图像。
请参见图8B,图8B为本申请实施例提供的一种确定第二车辆的示意图。如图8B所示,待检测部件位于待检测车辆的后侧,一号车辆和二号车辆位于车辆的后方,三号车辆位于车辆的前方,因此将一号车辆和二号车辆确定为第二车辆。
在一些实施例中,服务器可以将第二车辆确定为第一车辆。可以理解的,该方法可以减少确定第一车辆的步骤,缩短检测时间,从而提高检测效率。
204、根据第二车辆分别拍摄的图像和待检测车辆的标识,从第二车辆中确定第一车辆。
可选的,服务器可以向筛选出的车辆发送拍摄指令,拍摄指令用于指示筛选出的车辆分别拍摄第二图像,再从第二图像中识别待检测车辆的标识,将筛选出的车辆中拍摄的第二图像包含标识的车辆确定为第一车辆。具体可以参见步骤2041~步骤2044的详细内容。
请参见图8C,图8C为本申请实施例提供的一种从第二车辆中确定第一车辆的示意图。如图8C所示,第二车辆包括一号车辆和二号车辆,服务器可以获取一号车辆和二号车辆拍摄的第二图像,从第二图像中识别待检测车辆的标识。由于一号车辆与待检测车辆之间被二号车辆所遮挡,一号车辆所拍摄的第二图像中不包括待检测车辆的标识,二号车辆拍摄的第二图像中包含待检测车辆的标识,服务器可以将二号车辆确定为第一车辆。
需要说明的是,服务器执行步骤201~步骤204的顺序可以根据具体实施例调整,不做限定。
请参见图9，图9为本申请实施例提供的一种根据第二图像确定第一车辆的方法流程图。具体的，可以包括以下步骤：
2041、服务器向第二车辆发送拍摄指令。
该拍摄指令可以包括相机位置和相机角度。例如,待检测部件为前雨刷,该拍摄指令可以用于指示第二车辆开启车辆的后相机。
2042、第二车辆根据拍摄指令,拍摄第二图像。
其中,第二车辆可以为一个或多个车辆。具体的,每一车辆接收到一个摄像指令,车辆根据该车辆对应的摄像指令开启对应的相机,得到第二图像,其中,第二图像可以为一张图像或由多张图像组成的图像序列。
2043、第二车辆向服务器发送第二图像。
2044、服务器根据第二图像和待检测车辆的标识,从第二车辆中确定第一车辆。
可选地,服务器可以在第二图像中识别待检测车辆的标识,若该车辆发送的第二图像中包含该标识,则该车辆为第一车辆。该标识可以为车辆的车牌号和车辆的编号等。
可选地,若服务器发现与待检测车辆位于同一车道的前后车均拍摄不到包含标识的图像,但在相邻车道有车辆拍摄到包含标识的图像,则服务器可以向该相邻车道的车辆发送指令使其切换至与待检测车辆相同的车道,也可以停止车辆检测。
可以理解的,在第一车辆与待检测车辆之间存在障碍物的情况下,第一车辆无法拍摄到待检测车辆。通过第二图像和待检测车辆的标识确定第一车辆,可以保证第一车辆能够拍摄到待检测车辆。
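作为示意，步骤2044中“将拍摄的第二图像包含待检测车辆标识的车辆确定为第一车辆”可以用如下Python草图表示。其中假设车牌等标识已由图像识别算法从各第二图像中提取，输入为每辆第二车辆识别出的标识列表：

```python
def confirm_first_vehicle(second_images, target_id):
    # second_images: {车辆ID: 该车第二图像中识别出的车辆标识（如车牌号）列表}
    # 返回其第二图像包含待检测车辆标识的车辆ID列表，即可作为第一车辆的车辆
    return [vid for vid, ids in second_images.items() if target_id in ids]
```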
图10为本申请实施例中一种服务器的硬件结构示意图。图10所示的服务器10(该服务器10具体可以是一种计算机设备)包括存储器101、处理器102、通信接口103以及总线104。其中,存储器101、处理器102、通信接口103通过总线104实现彼此之间的通信连接。存储器101可以是只读存储器(Read Only Memory,ROM),静态存储设备,动态存储设备或者随机存取存储器(Random Access Memory,RAM)。存储器101可以存储程序,当存储器101中存储的程序被处理器102执行时,处理器102和通信接口103用于执行本申请实施例中的车辆检测的各个步骤。
处理器102可以采用通用的中央处理器(Central Processing Unit,CPU),微处理器,应用专用集成电路(Application Specific Integrated Circuit,ASIC),图形处理器(graphics processing unit,GPU)或者一个或多个集成电路,用于执行相关程序,以实现执行本申请方法实施例的车辆检测方法。
处理器102还可以是一种集成电路芯片，具有信号的处理能力。在实现过程中，本申请的车辆检测方法的各个步骤可以通过处理器102中的硬件的集成逻辑电路或者软件形式的指令完成。上述的处理器102还可以是通用处理器、数字信号处理器（Digital Signal Processor，DSP）、专用集成电路（ASIC）、现成可编程门阵列（Field Programmable Gate Array，FPGA）或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中提供的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请实施例所提供的方法的步骤可以直接体现为硬件译码处理器执行完成，或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器、闪存、只读存储器、可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器101，处理器102读取存储器101中的信息，结合其硬件完成本申请实施例的车辆故障检测方法。
通信接口103使用例如但不限于收发器一类的收发装置,来实现服务器10与其他设备或通信网络之间的通信。例如,可以通过通信接口103获取数据(如本申请实施例中的第一图像)。
总线104可包括在服务器10各个部件(例如,存储器101、处理器102、通信接口103)之间传送信息的通路。在上述实施例中,全部或部分功能可以通过软件、硬件、或者软件加硬件的组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。
请参见图11,图11为本申请实施例提供的一种车辆检测装置的结构示意图。该车辆检测装置300应用于服务器,该装置包括:
确定单元301,用于根据所述待检测车辆的位置信息,确定第一车辆,所述第一车辆与所述待检测车辆的距离小于目标阈值;
发送单元302,用于向所述待检测车辆发送第一工作指令,所述第一工作指令用于指示所述待检测部件在目标时间段内开启至少一次;
发送单元302,用于向所述第一车辆发送第二工作指令,所述第二工作指令用于指示所述第一车辆在所述目标时间段内拍摄第一图像和将所述第一图像发送至所述服务器;
识别单元303,用于从所述第一图像中识别所述待检测部件的故障情况。
在一种可能的实现方式中,该车辆检测装置还可以包括接收单元304,用于接收针对待检测车辆的待检测部件的检测请求,该检测请求用于指示服务器向待检测车辆发送第一工作指令和向第一车辆发送第二工作指令。
在一种可能的实现方式中,所述确定单元301,还用于:
获取至少一个车辆中每一个车辆的路径规划和所述待检测车辆的路径规划,所述至少一个车辆与所述待检测车辆的距离均小于所述目标阈值;
从所述至少一个车辆中筛选出路径规划与所述待检测车辆的路径规划部分重合的车辆;
从筛选出的车辆中确定所述第一车辆。
在一种可能的实现方式中,所述确定单元301,还用于:
向所述筛选出的车辆发送拍摄指令,所述拍摄指令用于指示所述筛选出的车辆分别拍摄第二图像;
根据所述筛选出的车辆分别拍摄的第二图像和所述待检测车辆的标识,从所述筛选出的车辆中确定所述第一车辆,所述第一车辆拍摄的第二图像包含所述标识。
在一种可能的实现方式中,在所述待检测部件位于所述待检测车辆的前侧时,所述第一车辆位于所述待检测车辆的前方。
在一种可能的实现方式中,在所述待检测部件位于所述待检测车辆的后侧时,所述第一车辆位于所述待检测车辆的后方。
在一种可能的实现方式中,所述确定单元301,用于:
根据所述待检测车辆的位置、所述第一车辆的位置和所述待检测部件,确定相机的拍摄角度,所述第二工作指令携带所述拍摄角度。
在一种可能的实现方式中,所述识别单元303,用于:
从所述第一图像中识别所述待检测部件;
在所述待检测部件在所述第一图像中的平均灰度值在目标范围内时,确定所述待检测部件不存在故障。
在一种可能的实现方式中,所述识别单元303,用于:
从所述第一图像中识别所述待检测部件的位置;
在所述待检测部件在所述第一图像中的位置位于目标位置时,确定所述待检测部件不存在故障。
在一种可能的实现方式中,所述第一图像包括多帧图像,所述识别单元303,用于:
在所述第一图像的每帧图像中分别识别所述待检测部件的位置;
当所述待检测部件在所述多帧图像中的位置不一致时,确定所述待检测部件不存在故障。
在一种可能的实现方式中,所述第一图像包括多帧图像,所述识别单元303,用于:
识别所述多帧图像的每帧图像中的待检测部件;
根据所述每帧图像中所述待检测部件的灰度值识别所述待检测部件的工作状态；
若根据所述多帧图像识别到的所述待检测部件的工作状态的变化规律满足预设规律时,确定所述待检测部件不存在故障,所述第一工作指令用于指示所述待检测部件以预设规律工作。
有关上述接收单元304、确定单元301、发送单元302和识别单元303更详细的描述可以直接参考上述图4所示的方法实施例中的相关描述直接得到,这里不加赘述。
请参见图12,图12为本申请实施例提供的一种车辆检测装置的结构示意图。该车辆检测装置400应用于待检测车辆,该装置包括:
接收单元401,用于接收服务器发送的第一工作指令,服务器还用于向第一车辆发送第二工作指令,第一车辆与待检测车辆的距离小于目标阈值,第二工作指令用于指示第一车辆在目标时间段内拍摄第一图像和将第一图像发送至服务器,第一图像用于识别待检测车辆的待检测部件的故障情况;
控制单元402,用于根据所述第一工作指令,在所述目标时间段内控制所述待检测部件开启至少一次。
在一种可能的实现方式中,该车辆检测装置还可以包括发送单元403,用于向服务器发送针对待检测车辆的待检测部件的检测请求,检测请求用于指示服务器向待检测车辆发送第一工作指令和向第一车辆发送第二工作指令。
在一种可能的实现方式中,所述控制单元402,用于:
根据所述第一工作指令,控制所述待检测部件在所述目标时间段内以预设规律工作。
在一种可能的实现方式中,所述控制单元402,用于:
控制待检测部件在所述目标时间段内处于工作状态。
在一种可能的实现方式中,所述待检测部件为车灯,所述控制单元402,用于:
在第一时间点开启所述车灯;
在第二时间点关闭所述车灯,所述第一时间点和所述第二时间点为所述目标时间段内的两个时间点。
有关上述发送单元403、接收单元401和控制单元402更详细的描述可以直接参考上述图4所示的方法实施例中的相关描述直接得到,这里不加赘述。
请参见图13，图13为本申请实施例提供的一种车辆检测装置的结构示意图。该车辆检测装置500应用于第一车辆，该装置包括：
接收单元501,用于接收服务器发送的第二工作指令,所述服务器还用于向所述待检测车辆发送第一工作指令,所述第一工作指令用于指示所述待检测车辆的待检测部件在目标时间段内开启至少一次;所述第一车辆与所述待检测车辆的距离小于目标阈值;
拍摄单元502,用于根据所述第二工作指令,在所述目标时间段内拍摄第一图像;
发送单元503,用于将所述第一图像发送至所述服务器,所述第一图像用于识别所述待检测部件的故障情况。
在一种可能的实现方式中,所述第二工作指令是在所述服务器响应于所述待检测车辆发送的针对所述待检测部件的检测请求时生成的,所述检测请求还用于指示所述服务器向所述待检测车辆发送所述第一工作指令。
在一种可能的实现方式中，所述接收单元501用于接收所述服务器发送的拍摄指令；所述拍摄单元502用于根据所述拍摄指令，拍摄第二图像，所述第二图像用于确定所述第一车辆；所述发送单元503将所述第二图像发送至所述服务器，所述第二图像包括所述待检测车辆的标识。
在一种可能的实现方式中,所述第二工作指令携带所述拍摄角度,所述拍摄单元502,用于:
控制相机以所述拍摄角度拍摄所述第一图像,所述拍摄角度是所述服务器根据所述待检测车辆的位置、所述第一车辆的位置和所述待检测部件确定的。
在一种可能的实现方式中,所述第二工作指令携带车道,所述装置还包括控制单元504,所述控制单元504,用于:
切换至所述车道,所述车道是服务器根据所述待检测车辆的位置、所述第一车辆的位置和所述待检测部件确定的。
有关上述接收单元501、拍摄单元502、发送单元503和控制单元504更详细的描述可以直接参考上述图4所示的方法实施例中的相关描述直接得到,这里不加赘述。
本申请实施例提供一种电子设备,该电子设备包括:一个或多个处理器、存储器;存储器与一个或多个处理器耦合,存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,一个或多个处理器用于调用计算机指令以使得电子设备执行如图4至图9中服务器所执行的方法。
本申请实施例还提供一种电子设备,该电子设备可以包括处理器和通信接口,处理器通过通信接口获取程序指令,当程序指令被处理器执行时实现执行如图4至图9中待检测车辆所执行的方法。
本申请实施例又提供一种电子设备,该电子设备可以包括处理电路,处理电路配置为执行如图4至图9中第一车辆所执行的方法。
本申请实施例提供一种芯片,该芯片应用于电子设备,该芯片包括一个或多个处理器,该处理器用于调用计算机指令以使得该电子设备执行如图4至图9所描述的方法。
可选地，在第一种可能的实施方式中，芯片系统还可以包括存储器，存储器用于保存车辆必要的程序指令和数据。该芯片系统，可以由芯片构成，也可以包含芯片和其他分立器件。其中，芯片系统可以包括专用集成电路（application specific integrated circuit，ASIC）、现成可编程门阵列（field programmable gate array，FPGA）或者其他可编程逻辑器件等。进一步，芯片系统还可以包括接口电路等。
可选地,存储器位于处理器之内;或存储器位于处理器之外。
本申请实施例提供一种包含指令的计算机程序产品,当上述计算机程序产品在电子设备上运行时,使得上述电子设备执行如图4至图9中所描述的方法。
本申请实施例提供一种计算机可读存储介质,包括指令,当上述指令在电子设备上运行时,使得上述电子设备执行如图4至图9中所描述的方法。
在上述实施例中,全部或部分功能可以通过软件、硬件、或者软件加硬件的组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。

Claims (46)

  1. 一种车辆检测方法,其特征在于,应用于服务器,所述方法包括:
    根据待检测车辆的位置信息,确定第一车辆,所述第一车辆与所述待检测车辆的距离小于目标阈值;
    向所述待检测车辆发送第一工作指令,所述第一工作指令用于指示所述待检测车辆的待检测部件在目标时间段内开启至少一次;
    向所述第一车辆发送第二工作指令,所述第二工作指令用于指示所述第一车辆在所述目标时间段内拍摄第一图像和将所述第一图像发送至所述服务器;
    从所述第一图像中识别所述待检测部件的故障情况。
  2. 根据权利要求1所述的方法,其特征在于,在所述根据待检测车辆的位置信息,确定第一车辆之前,所述方法还包括:
    接收针对所述待检测车辆的待检测部件的检测请求,所述检测请求用于指示所述服务器向所述待检测车辆发送所述第一工作指令和向所述第一车辆发送所述第二工作指令。
  3. 根据权利要求1或2所述的方法,其特征在于,所述根据待检测车辆的位置信息,确定第一车辆,包括:
    获取至少一个车辆中每一个车辆的路径规划和所述待检测车辆的路径规划,所述至少一个车辆与所述待检测车辆的距离均小于所述目标阈值;
    从所述至少一个车辆中筛选出路径规划与所述待检测车辆的路径规划部分重合的车辆;
    从筛选出的车辆中确定所述第一车辆。
  4. 根据权利要求3所述的方法，其特征在于，所述从筛选出的车辆中确定所述第一车辆，包括：
    向所述筛选出的车辆发送拍摄指令,所述拍摄指令用于指示所述筛选出的车辆分别拍摄第二图像;
    根据所述筛选出的车辆分别拍摄的第二图像和所述待检测车辆的标识,从所述筛选出的车辆中确定所述第一车辆,所述第一车辆拍摄的第二图像包含所述标识。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,在所述待检测部件位于所述待检测车辆的前侧时,所述第一车辆位于所述待检测车辆的前方。
  6. 根据权利要求1-4任一项所述的方法,其特征在于,在所述待检测部件位于所述待检测车辆的后侧时,所述第一车辆位于所述待检测车辆的后方。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述方法还包括:
    根据所述待检测车辆的位置、所述第一车辆的位置和所述待检测部件,确定相机的拍摄角度,所述第二工作指令携带所述拍摄角度。
  8. 根据权利要求1-7任一项所述的方法,其特征在于,所述识别所述第一图像中所述待检测部件的故障情况,包括:
    从所述第一图像中识别所述待检测部件;
    在所述待检测部件在所述第一图像中的平均灰度值在目标范围内时,确定所述待检测部件不存在故障。
  9. 根据权利要求1-7任一项所述的方法,其特征在于,所述识别所述第一图像中所述待检测部件的故障情况,包括:
    从所述第一图像中识别所述待检测部件的位置;
    在所述待检测部件在所述第一图像中的位置位于目标位置时,确定所述待检测部件不存在故障。
  10. 根据权利要求1-7任一项所述的方法,其特征在于,所述第一图像包括多帧图像,所述识别所述第一图像中所述待检测部件的故障情况,包括:
    在所述第一图像的每帧图像中分别识别所述待检测部件的位置;
    当所述待检测部件在所述多帧图像中的位置不一致时,确定所述待检测部件不存在故障。
  11. 根据权利要求1-7任一项所述的方法,其特征在于,所述第一图像包括多帧图像,所述识别所述第一图像中所述待检测部件的故障情况,包括:
    识别所述多帧图像的每帧图像中的待检测部件;
    根据所述每帧图像中所述待检测部件的灰度值识别所述待检测部件的工作状态；
    若根据所述多帧图像识别到的所述待检测部件的工作状态的变化规律满足预设规律时,确定所述待检测部件不存在故障,所述第一工作指令用于指示所述待检测部件以预设规律工作。
  12. 一种车辆检测方法,其特征在于,应用于待检测车辆,所述方法包括:
    接收服务器发送的第一工作指令,所述服务器还用于向第一车辆发送第二工作指令,所述第一车辆与所述待检测车辆的距离小于目标阈值,所述第二工作指令用于指示所述第一车辆在目标时间段内拍摄第一图像和将所述第一图像发送至所述服务器,所述第一图像用于识别所述待检测车辆的待检测部件的故障情况;
    根据所述第一工作指令,在所述目标时间段内控制所述待检测部件开启至少一次。
  13. 根据权利要求12所述的方法,其特征在于,在接收所述服务器发送的第一工作指令前,所述方法还包括:
    向所述服务器发送针对所述待检测车辆的待检测部件的检测请求,所述检测请求用于指示所述服务器向所述待检测车辆发送所述第一工作指令和向所述第一车辆发送所述第二工作指令。
  14. 根据权利要求12或13所述的方法,其特征在于,所述在所述目标时间段内控制所述待检测部件开启至少一次,包括:
    根据所述第一工作指令,控制所述待检测部件在所述目标时间段内以预设规律工作。
  15. 根据权利要求12或13所述的方法，其特征在于，所述在所述目标时间段内控制所述待检测部件开启至少一次，包括：
    控制所述待检测部件在所述目标时间段内处于工作状态。
  16. 根据权利要求12或13所述的方法,其特征在于,所述在所述目标时间段内控制所述待检测部件开启至少一次,包括:
    在第一时间点开启所述待检测部件;
    在第二时间点关闭所述待检测部件,所述第一时间点和所述第二时间点为所述目标时间段内的两个时间点。
  17. 一种车辆检测方法,其特征在于,应用于第一车辆,所述方法包括:
    接收服务器发送的第二工作指令,所述服务器还用于向待检测车辆发送第一工作指令,所述第一工作指令用于指示所述待检测车辆的待检测部件在目标时间段内开启至少一次,所述第一车辆与所述待检测车辆的距离小于目标阈值;
    根据所述第二工作指令,在所述目标时间段内拍摄第一图像;
    将所述第一图像发送至所述服务器,所述第一图像用于识别所述待检测部件的故障情况。
  18. 根据权利要求17所述的方法,其特征在于,所述第二工作指令是在所述服务器响应于所述待检测车辆发送的针对所述待检测部件的检测请求时生成的,所述检测请求用于指示所述服务器向所述待检测车辆发送所述第一工作指令和向所述第一车辆发送所述第二工作指令。
  19. 根据权利要求17或18所述的方法,其特征在于,在所述接收服务器发送的第二工作指令之前,所述方法还包括:
    接收所述服务器发送的拍摄指令;
    根据所述拍摄指令,拍摄第二图像,所述第二图像用于确定所述第一车辆;
    将所述第二图像发送至所述服务器,所述第二图像包括所述待检测车辆的标识。
  20. 根据权利要求17-19任一项所述的方法,其特征在于,所述第二工作指令携带所述拍摄角度,所述方法还包括:
    控制相机以所述拍摄角度拍摄所述第一图像,所述拍摄角度是所述服务器根据所述待检测车辆的位置、所述第一车辆的位置和所述待检测部件确定的。
  21. 根据权利要求17-19任一项的方法,其特征在于,所述第二工作指令携带车道,所述方法还包括:
    切换至所述车道,所述车道是服务器根据所述待检测车辆的位置、所述第一车辆的位置和所述待检测部件确定的。
  22. 一种车辆检测装置,其特征在于,应用于服务器,所述装置包括:
    确定单元,用于根据待检测车辆的位置信息,确定第一车辆,所述第一车辆与所述待检测车辆的距离小于目标阈值;
    发送单元，用于向所述待检测车辆发送第一工作指令，所述第一工作指令用于指示所述待检测车辆的待检测部件在目标时间段内开启至少一次；
    发送单元,用于向所述第一车辆发送第二工作指令,所述第二工作指令用于指示所述第一车辆在所述目标时间段内拍摄第一图像和将所述第一图像发送至所述服务器;
    识别单元,用于从所述第一图像中识别所述待检测部件的故障情况。
  23. 根据权利要求22所述的装置,其特征在于,所述装置还包括接收单元,所述接收单元,用于:
    接收针对所述待检测车辆的待检测部件的检测请求,所述检测请求用于指示所述服务器向所述待检测车辆发送所述第一工作指令和向所述第一车辆发送所述第二工作指令。
  24. 根据权利要求22或23所述的装置,其特征在于,所述确定单元,还用于:
    获取至少一个车辆中每一个车辆的路径规划和所述待检测车辆的路径规划,所述至少一个车辆与所述待检测车辆的距离均小于所述目标阈值;
    从所述至少一个车辆中筛选出路径规划与所述待检测车辆的路径规划部分重合的车辆;
    从筛选出的车辆中确定所述第一车辆。
  25. 根据权利要求24所述的装置,其特征在于,所述确定单元,还用于:
    向所述筛选出的车辆发送拍摄指令,所述拍摄指令用于指示所述筛选出的车辆分别拍摄第二图像;
    根据所述筛选出的车辆分别拍摄的第二图像和所述待检测车辆的标识,从所述筛选出的车辆中确定所述第一车辆,所述第一车辆拍摄的第二图像包含所述标识。
  26. 根据权利要求22-25任一项所述的装置,其特征在于,在所述待检测部件位于所述待检测车辆的前侧时,所述第一车辆位于所述待检测车辆的前方。
  27. 根据权利要求22-25任一项所述的装置,其特征在于,在所述待检测部件位于所述待检测车辆的后侧时,所述第一车辆位于所述待检测车辆的后方。
  28. 根据权利要求22-27任一项所述的装置,其特征在于,所述确定单元,用于:
    根据所述待检测车辆的位置、所述第一车辆的位置和所述待检测部件,确定相机的拍摄角度,所述第二工作指令携带所述拍摄角度。
  29. 根据权利要求22-28任一项所述的装置,其特征在于,所述识别单元,用于:
    从所述第一图像中识别所述待检测部件;
    在所述待检测部件在所述第一图像中的平均灰度值在目标范围内时,确定所述待检测部件不存在故障。
  30. 根据权利要求22-28任一项所述的装置,其特征在于,所述识别单元,用于:从所述第一图像中识别所述待检测部件的位置;
    在所述待检测部件在所述第一图像中的位置位于目标位置时,确定所述待检测部件不存在故障。
  31. 根据权利要求22-28任一项所述的装置,其特征在于,所述第一图像包括多帧图像,所述识别单元,用于:
    在所述第一图像的每帧图像中分别识别所述待检测部件的位置;
    当所述待检测部件在所述多帧图像中的位置不一致时,确定所述待检测部件不存在故障。
  32. 根据权利要求22-28任一项所述的装置,其特征在于,所述第一图像包括多帧图像,所述识别单元,用于:
    识别所述多帧图像的每帧图像中的待检测部件;
    根据所述每帧图像中所述待检测部件的灰度值识别所述待检测部件的工作状态；
    若根据所述多帧图像识别到的所述待检测部件的工作状态的变化规律满足预设规律时,确定所述待检测部件不存在故障,所述第一工作指令用于指示所述待检测部件以预设规律工作。
  33. 一种车辆检测装置,其特征在于,应用于待检测车辆,所述装置包括:
    接收单元,用于接收服务器发送的第一工作指令,所述服务器还用于向第一车辆发送第二工作指令,所述第一车辆与所述待检测车辆的距离小于目标阈值,所述第二工作指令用于指示所述第一车辆在目标时间段内拍摄第一图像和将所述第一图像发送至所述服务器,所述第一图像用于识别所述待检测车辆的待检测部件的故障情况;
    控制单元,用于根据所述第一工作指令,在所述目标时间段内控制所述待检测部件开启至少一次。
  34. 根据权利要求33所述的装置，其特征在于，所述装置还包括发送单元，所述发送单元，用于：
    向所述服务器发送针对所述待检测车辆的待检测部件的检测请求,所述检测请求用于指示所述服务器向所述待检测车辆发送所述第一工作指令和向所述第一车辆发送所述第二工作指令。
  35. 根据权利要求33所述的装置,其特征在于,所述控制单元,用于:
    根据所述第一工作指令,控制所述待检测部件在所述目标时间段内以预设规律工作。
  36. 根据权利要求33或34所述的装置,其特征在于,所述控制单元,用于:
    控制所述待检测部件在所述目标时间段内处于工作状态。
  37. 根据权利要求33或34所述的装置,其特征在于,所述控制单元,用于:
    在第一时间点开启所述待检测部件;
    在第二时间点关闭所述待检测部件,所述第一时间点和所述第二时间点为所述目标时间段内的两个时间点。
  38. A vehicle detection apparatus, applied to a first vehicle, the apparatus comprising:
    a receiving unit, configured to receive a second work instruction sent by a server, wherein the server is further configured to send a first work instruction to a vehicle to be detected, the first work instruction is used to instruct a component to be detected of the vehicle to be detected to turn on at least once within a target time period, and a distance between the first vehicle and the vehicle to be detected is less than a target threshold;
    a shooting unit, configured to capture a first image within the target time period according to the second work instruction; and
    a sending unit, configured to send the first image to the server, the first image being used to identify a fault condition of the component to be detected.
  39. The apparatus according to claim 38, wherein the second work instruction is generated when the server responds to a detection request, sent by the vehicle to be detected, for the component to be detected, the detection request being further used to instruct the server to send the first work instruction to the vehicle to be detected.
  40. The apparatus according to claim 38 or 39, wherein:
    the receiving unit is configured to receive a shooting instruction sent by the server;
    the shooting unit is configured to capture a second image according to the shooting instruction, the second image being used to determine the first vehicle; and
    the sending unit is configured to send the second image to the server, the second image containing an identifier of the vehicle to be detected.
  41. The apparatus according to any one of claims 38 to 40, wherein the second work instruction carries a shooting angle, and the shooting unit is configured to:
    control a camera to capture the first image at the shooting angle, the shooting angle being determined by the server according to a position of the vehicle to be detected, a position of the first vehicle, and the component to be detected.
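One way the server of claims 28 and 41 might derive a shooting angle from the two vehicle positions is sketched below. A flat 2-D coordinate frame, headings in degrees, and the function itself are all assumptions for illustration; the claims do not fix any particular geometry:

```python
import math

# Hypothetical sketch: yaw, relative to the first vehicle's heading,
# that points its camera at the vehicle to be detected.

def shooting_angle(first_pos, target_pos, first_heading_deg):
    """Relative camera yaw in degrees, normalized to [-180, 180)."""
    dx = target_pos[0] - first_pos[0]
    dy = target_pos[1] - first_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))  # world-frame bearing
    return (bearing - first_heading_deg + 180) % 360 - 180

# First vehicle 20 m behind the target, both heading along +x:
print(shooting_angle((0.0, 0.0), (20.0, 0.0), 0.0))  # 0.0
```

The second work instruction would carry the resulting angle, as claim 41 recites.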
  42. The apparatus according to any one of claims 38 to 40, wherein the second work instruction carries a lane, and the apparatus further comprises a control unit, the control unit being configured to:
    switch to the lane, the lane being determined by the server according to the position of the vehicle to be detected, the position of the first vehicle, and the component to be detected.
  43. A computer program product containing instructions, wherein when the computer program product runs on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 11.
  44. A computer program product containing instructions, wherein when the computer program product runs on an electronic device, the vehicle to be detected is caused to perform the method according to any one of claims 12 to 16.
  45. A computer program product containing instructions, wherein when the computer program product runs on an electronic device, the first vehicle is caused to perform the method according to any one of claims 17 to 21.
  46. A computer-readable storage medium comprising instructions, wherein when the instructions run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 21.
PCT/CN2021/120343 2021-01-06 2021-09-24 Vehicle detection method and vehicle detection apparatus WO2022148068A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110015232.4A CN114724272A (zh) 2021-01-06 2021-01-06 Vehicle detection method and vehicle detection apparatus
CN202110015232.4 2021-01-06

Publications (1)

Publication Number Publication Date
WO2022148068A1 true WO2022148068A1 (zh) 2022-07-14

Family

ID=82233995

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/120343 WO2022148068A1 (zh) 2021-01-06 2021-09-24 一种车辆检测方法和车辆检测装置

Country Status (2)

Country Link
CN (1) CN114724272A (zh)
WO (1) WO2022148068A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024022674A1 * 2022-07-29 2024-02-01 Bayerische Motoren Werke Aktiengesellschaft Method for analyzing an external vehicle state, and computer program
CN117912289A * 2024-03-19 2024-04-19 西安九天数智信息科技有限公司 Vehicle group driving early-warning method, apparatus and system based on image recognition

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105992130A * 2015-03-19 2016-10-05 现代自动车株式会社 Vehicle, communication method of the vehicle, and wireless communication device in the vehicle
CN108449583A * 2018-05-09 2018-08-24 爱驰汽车有限公司 Method, system, device, and storage medium for mutual monitoring between vehicles
US10121240B2 * 2014-04-17 2018-11-06 Denso Corporation Failure detection system, information processing device, and vehicle-mounted device
CN109835349A * 2017-11-28 2019-06-04 丰田自动车株式会社 Faulty vehicle estimation system, estimation method thereof, and non-transitory storage medium
CN110519382A * 2019-08-30 2019-11-29 成都康普斯北斗科技有限公司 Intelligent automobile monitoring system
CN111063054A * 2019-12-23 2020-04-24 智车优行科技(上海)有限公司 Fault information sending method and apparatus, and vehicle fault handling method and apparatus
US20200279443A1 * 2019-02-28 2020-09-03 Toyota Jidosha Kabushiki Kaisha Vehicle



Also Published As

Publication number Publication date
CN114724272A (zh) 2022-07-08

Similar Documents

Publication Publication Date Title
CN110550029B Obstacle avoidance method and apparatus
JP7097439B2 Adjustable vertical field of view
US9928431B2 Verifying a target object with reverse-parallax analysis
WO2021103511A1 Operational design domain (ODD) determination method and apparatus, and related device
US11496707B1 Fleet dashcam system for event-based scenario generation
WO2022148068A1 Vehicle detection method and vehicle detection apparatus
WO2021065626A1 Traffic control system, traffic control method, and control device
US20240017719A1 Mapping method and apparatus, vehicle, readable storage medium, and chip
CN115042821B Vehicle control method and apparatus, vehicle, and storage medium
CN112810603B Positioning method and related product
US20210362742A1 Electronic device for vehicles
WO2022061702A1 Driving reminder method, apparatus, and system
CN113859265A Reminder method and device for use during driving
CN112829762A Vehicle driving speed generation method and related device
WO2022127502A1 Control method and apparatus
WO2022151839A1 Vehicle turning route planning method and apparatus
CN115082886B Target detection method and apparatus, storage medium, chip, and vehicle
WO2022061725A1 Traffic element observation method and apparatus
CN115123304A Fault tracing method and apparatus, medium, and chip
CN115220932A Communication process execution method and apparatus, vehicle, readable storage medium, and chip
CN115685294A Signal interference handling method and apparatus
CN115139946A Vehicle water-entry detection method, vehicle, computer-readable storage medium, and chip
CN115407344A Grid map creation method and apparatus, vehicle, and readable storage medium
CN116802580A Path constraint method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21917110

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21917110

Country of ref document: EP

Kind code of ref document: A1