CN112132995A - Data processing method, device and system, vehicle-mounted equipment and server

Info

Publication number
CN112132995A
Authority
CN
China
Prior art keywords
vehicle
detected
collision
acceleration
data
Prior art date
Legal status
Pending
Application number
CN202011126305.9A
Other languages
Chinese (zh)
Inventor
李友增
李国镇
卢美奇
杨宏达
吴若溪
戚龙雨
Current Assignee
Beijing Didi Infinity Technology and Development Co Ltd
Original Assignee
Beijing Didi Infinity Technology and Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Didi Infinity Technology and Development Co Ltd filed Critical Beijing Didi Infinity Technology and Development Co Ltd
Priority to CN202011126305.9A
Publication of CN112132995A

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data
    • G07C5/085 Registering performance data using electronic data carriers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00 Testing of vehicles
    • G01M17/007 Wheeled or endless-tracked vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P15/00 Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration
    • G01P15/18 Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration in two or more dimensions
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data
    • G07C5/085 Registering performance data using electronic data carriers
    • G07C5/0866 Registering performance data using electronic data carriers the electronic data carrier being a digital video recorder in combination with video camera

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)

Abstract

Embodiments of the present application provide a data processing method, apparatus and system, a vehicle-mounted device, and a server, relating to the technical field of data processing. The data processing method is applied to the vehicle-mounted device and includes the following steps: first, acquiring acceleration data and image data of a vehicle to be detected; second, determining, according to the acceleration data, whether the vehicle to be detected has collided; and then, if so, performing target recognition on the image data according to a preset target recognition model to obtain the collision target. With this arrangement, most collision events can be identified, and the collision target is obtained while collision detection is performed, which addresses the limited comprehensiveness of collision detection in the prior art.

Description

Data processing method, device and system, vehicle-mounted equipment and server
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a data processing method, an apparatus, a system, a vehicle-mounted device, and a server.
Background
With the continuous development of the automobile industry, the number of automobiles keeps growing, accidents of all kinds are increasing, and the need to raise an alarm when a collision occurs and to preserve evidence in time is ever stronger. However, the inventors have found that the conventional technology suffers from low comprehensiveness in collision detection.
Disclosure of Invention
In view of the above, an object of the present application is to provide a data processing method, device, system, vehicle-mounted device and server, so as to solve the problems in the prior art.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
a data processing method is applied to vehicle-mounted equipment and comprises the following steps:
acquiring acceleration data and image data of a vehicle to be detected;
determining, according to the acceleration data, whether the vehicle to be detected has collided;
and if so, carrying out target recognition on the image data according to a preset target recognition model to obtain a collision target.
In a preferred selection of the embodiment of the present application, the acceleration data includes vehicle acceleration and gravitational acceleration, and the step of determining whether the vehicle to be detected collides according to the acceleration data includes:
calculating to obtain a horizontal acceleration and an integral value of the vehicle to be detected according to the vehicle acceleration and the gravity acceleration, wherein the integral value represents the speed change of the vehicle to be detected in the horizontal direction;
judging whether the horizontal acceleration is greater than or equal to a preset horizontal acceleration threshold value or not and whether the integral value is greater than or equal to a preset integral value threshold value or not;
and if the horizontal acceleration is greater than or equal to a preset horizontal acceleration threshold value and the integral value is greater than or equal to a preset integral value threshold value, judging that the vehicle to be detected collides.
In a preferable selection of the embodiment of the present application, the step of calculating the horizontal acceleration and the integral value of the vehicle to be detected according to the vehicle acceleration and the gravitational acceleration includes:
carrying out vector outer product calculation on the vehicle acceleration and the gravity acceleration to obtain the horizontal acceleration of the vehicle to be detected;
and carrying out window integral calculation on the vehicle acceleration and the gravity acceleration to obtain an integral value of the vehicle to be detected.
In a preferred option of the embodiment of the present application, the step of performing target recognition on the image data according to a preset target recognition model to obtain a collision target includes:
identifying the image data according to a preset target identification model to obtain at least one identification target;
calculating the collision grade of the vehicle to be detected according to the vehicle acceleration and the gravity acceleration;
and screening the at least one recognition target according to the collision grade of the vehicle to be detected to obtain the collision target.
In a preferred option of the embodiment of the present application, the data processing method further includes:
acquiring angular speed data of the vehicle to be detected;
and performing drop detection according to the vehicle acceleration and angular velocity data.
The embodiment of the application further provides a data processing method, which is applied to a server, wherein the server is in communication connection with the vehicle-mounted equipment, and the data processing method comprises the following steps:
acquiring a collision target and real-time positioning data of a vehicle to be detected from the vehicle-mounted equipment, wherein the collision target is obtained by performing target recognition on image data of the vehicle to be detected by the vehicle-mounted equipment according to a preset target recognition model;
generating a motion track of the vehicle to be detected according to the real-time positioning data;
and judging whether the vehicle to be detected collides with the collision target or not according to the motion track.
In a preferable selection of the embodiment of the application, the step of determining whether the vehicle to be detected collides with the collision target according to the motion trajectory includes:
acquiring the collision grade of the vehicle to be detected from the vehicle-mounted equipment;
acquiring a motion track of a preset duration of the vehicle to be detected according to the collision grade of the vehicle to be detected;
judging whether the motion track is smaller than a preset motion track threshold value or not;
and if so, judging that the vehicle to be detected collides with the collision target.
In a preferred option of the embodiment of the present application, the data processing method further includes:
and when the vehicle to be detected is judged to collide with the collision target, storing the vehicle to be detected and the collision target.
In a preferred option of the embodiment of the present application, the data processing method further includes:
acquiring a collision event reported by a user, wherein the collision event comprises a collision vehicle;
and matching the collision vehicle with the stored vehicle to be detected, and determining whether they are the same vehicle.
The embodiment of the application also provides a data processing system, which comprises vehicle-mounted equipment and a server which are in communication connection;
the vehicle-mounted equipment is used for acquiring acceleration data and image data of a vehicle to be detected, judging whether the vehicle to be detected collides or not according to the acceleration data, and performing target recognition on the image data according to a preset target recognition model when the vehicle to be detected collides to obtain a collision target;
the server is used for obtaining a collision target and real-time positioning data of a vehicle to be detected from the vehicle-mounted equipment, generating a motion track of the vehicle to be detected according to the real-time positioning data, and judging whether the vehicle to be detected collides with the collision target or not according to the motion track.
The embodiment of the present application further provides a data processing apparatus, which is applied to a vehicle-mounted device, where the data processing apparatus includes:
the data acquisition module is used for acquiring acceleration data and image data of the vehicle to be detected;
the judging module is used for determining, according to the acceleration data, whether the vehicle to be detected has collided;
and the target recognition module is used for carrying out target recognition on the image data according to a preset target recognition model when the vehicle to be detected collides to obtain a collision target.
In a preferred selection of the embodiment of the present application, the acceleration data includes a vehicle acceleration and a gravitational acceleration, and the determination module is specifically configured to:
calculating to obtain a horizontal acceleration and an integral value of the vehicle to be detected according to the vehicle acceleration and the gravity acceleration, wherein the integral value represents the speed change of the vehicle to be detected in the horizontal direction;
judging whether the horizontal acceleration is greater than or equal to a preset horizontal acceleration threshold value or not and whether the integral value is greater than or equal to a preset integral value threshold value or not;
and if the horizontal acceleration is greater than or equal to a preset horizontal acceleration threshold value and the integral value is greater than or equal to a preset integral value threshold value, judging that the vehicle to be detected collides.
In a preferred option of the embodiment of the present application, the determining module is further specifically configured to:
carrying out vector outer product calculation on the vehicle acceleration and the gravity acceleration to obtain the horizontal acceleration of the vehicle to be detected;
and carrying out window integral calculation on the vehicle acceleration and the gravity acceleration to obtain an integral value of the vehicle to be detected.
In a preferred option of the embodiment of the present application, the target identification module is specifically configured to:
identifying the image data according to a preset target identification model to obtain at least one identification target;
calculating the collision grade of the vehicle to be detected according to the vehicle acceleration and the gravity acceleration;
and screening the at least one recognition target according to the collision grade of the vehicle to be detected to obtain the collision target.
In a preferred option of the embodiment of the present application, the data processing apparatus further includes:
the angular speed acquisition module is used for acquiring angular speed data of the vehicle to be detected;
and the drop detection module is used for carrying out drop detection according to the vehicle acceleration and angular velocity data.
The embodiment of the application also provides the vehicle-mounted device, which comprises a memory and a processor, wherein the processor is used for executing the executable computer program stored in the memory so as to realize the data processing method executed by the vehicle-mounted device.
The embodiment of the present application further provides a server, which includes a memory and a processor, where the processor is configured to execute the executable computer program stored in the memory, so as to implement the data processing method executed by the server.
The embodiment of the application also provides a storage medium, wherein a computer program is stored on the storage medium, and when the computer program is executed, the steps of the data processing method are realized.
According to the data processing method, apparatus and system, vehicle-mounted device and server provided by the embodiments of the present application, whether a collision has occurred is determined from the acceleration data of the vehicle to be detected; when a collision occurs, target recognition is performed on the image data according to the target recognition model to obtain the collision target. Most collision events can thus be recognized, and the collision target is obtained while collision detection is performed, which addresses the limited comprehensiveness of collision detection in the prior art.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is a block diagram of a data processing system according to an embodiment of the present disclosure.
Fig. 2 is a block diagram of a structure of an in-vehicle device according to an embodiment of the present application.
Fig. 3 is a schematic flowchart of a data processing method according to an embodiment of the present application.
Fig. 4 is another schematic flow chart of the data processing method according to the embodiment of the present application.
Fig. 5 is another schematic flow chart of the data processing method according to the embodiment of the present application.
Fig. 6 is a schematic structural diagram of a target recognition model provided in the embodiment of the present application.
Fig. 7 is another schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 8 is another schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 9 is another schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 10 is another schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 11 is another schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 12 is another schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 13 is a block diagram of a data processing apparatus according to an embodiment of the present application.
Fig. 14 is another block diagram of a data processing apparatus according to an embodiment of the present application.
Reference numerals: 10 - data processing system; 100 - vehicle-mounted device; 110 - first processor; 120 - external memory interface; 121 - internal memory; 180 - sensor module; 193 - camera; 194 - display screen; 200 - server; 1300 - data processing apparatus; 1310 - data acquisition module; 1320 - judging module; 1330 - target identification module; 1340 - angular velocity acquisition module; 1350 - drop detection module.
Detailed Description
In the prior art, research on collision detection has focused mainly on detecting that an accident occurred, while research into the cause of the accident and the object involved is comparatively lacking (an accident here includes a collision event). For example, some schemes only support detecting and recognizing collisions between vehicles; they can detect neither minor accidents that cause only slight damage to the vehicle body, such as collisions with pedestrians, non-motor vehicles or guardrails, nor major accidents such as vehicle rollover, and they cannot recognize the accident object (for example, a motor vehicle such as a car, truck or van, a pedestrian, or a tricycle). The comprehensiveness of collision detection is therefore low.
To address at least one of the above technical problems, embodiments of the present application provide a data processing method, an apparatus, a system, an on-board device, and a server. The technical solutions of the present application are described below through possible implementations.
The defects of the above solutions were identified by the inventors through practice and careful study; therefore, the discovery of the above problems and the solutions proposed below should be regarded as the inventors' contribution to the present application.
For purposes of making the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments of the present application will be described in detail below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
In order to enable a person skilled in the art to make use of the present disclosure, the following embodiments are given. It will be apparent to those skilled in the art that the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the application. Applications of the system or method of the present application may include web pages, plug-ins for browsers, client terminals, customization systems, internal analysis systems, or artificial intelligence robots, among others, or any combination thereof.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
FIG. 1 is a block diagram of a data processing system 10 according to an embodiment of the present disclosure, showing one possible implementation of the data processing system 10. The data processing system 10 may be an online transportation service platform for transportation services such as taxi, designated driving, express, carpooling, bus service, driver rental, or shuttle service, or any combination thereof. Referring to fig. 1, the data processing system 10 may include an in-vehicle device 100 and a server 200 that are communicatively connected.
The vehicle-mounted device 100 is arranged in a vehicle to be detected and used for acquiring acceleration data and image data of the vehicle to be detected, judging whether the vehicle to be detected collides or not according to the acceleration data, and when the vehicle to be detected collides, performing target recognition on the image data according to a preset target recognition model to obtain a collision target. The server 200 is configured to obtain the collision target and the real-time positioning data of the vehicle to be detected from the vehicle-mounted device 100, generate the motion trajectory of the vehicle to be detected according to the real-time positioning data, and determine whether the vehicle to be detected collides with the collision target according to the motion trajectory.
With respect to the vehicle-mounted device 100, it should be noted that fig. 2 shows a schematic diagram of exemplary hardware and software components of the vehicle-mounted device 100, which may implement the concepts of the present application according to some embodiments of the present application. The in-vehicle apparatus 100 may include a first processor 110, an external memory interface 120, an internal memory 121, a sensor module 180, a camera 193, a display screen 194, and the like. Wherein the sensor module 180 may include a gyro sensor, an acceleration sensor, and the like.
It is to be understood that the structure illustrated in the embodiment of the present application does not constitute a specific limitation to the vehicle-mounted device 100. In other embodiments of the present application, the in-vehicle apparatus 100 may include more or fewer components than those shown, or combine some components, or split some components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The first processor 110 may include one or more processing units, such as: the first processor 110 may include an Application Processor (AP), a modem processor, a Graphic Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The first processor 110 may be a neural center and a command center of the vehicle-mounted device 100. The first processor 110 may generate an operation control signal according to the instruction operation code and the timing signal, and perform instruction fetching and execution control.
A memory may also be provided in the first processor 110 for storing instructions and data. In some embodiments, the memory in the first processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the first processor 110. If the first processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the first processor 110, thereby increasing the efficiency of the system.
In some embodiments, the first processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, and the like.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the first processor 110 may include multiple sets of I2C buses. The first processor 110 may be coupled to the camera 193 and other devices through different I2C bus interfaces.
The MIPI interface may be used to connect the first processor 110 with peripheral devices such as the display screen 194, the camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI) and the like. In some embodiments, the first processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the in-vehicle apparatus 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the first processor 110 with the camera 193, the display screen 194, and the like.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a structural limitation on the vehicle-mounted device 100. In other embodiments of the present application, the vehicle-mounted device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The in-vehicle apparatus 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the in-vehicle apparatus 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
For example, in the data processing method provided by the present application, the camera 193 may collect an image in front of the vehicle to be detected, and display the collected image in the preview interface. The photosensitive element converts the collected optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for relevant image processing.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent cognition of the vehicle-mounted device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The first processor 110 executes various functional applications of the in-vehicle apparatus 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The stored data area may store data (such as audio data, a phonebook, etc.) created during use of the in-vehicle apparatus 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The gyro sensor may be used to determine the motion attitude of the in-vehicle apparatus 100. In some embodiments, the angular velocity of the in-vehicle apparatus 100 about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor.
The acceleration sensor may detect the magnitude of acceleration of the vehicle-mounted device 100 in various directions (generally, three axes). The acceleration sensor may also detect the magnitude and direction of gravity when the in-vehicle apparatus 100 is stationary.
It should be noted that the gyro sensor and the acceleration sensor may be separate sensors or may be part of an Inertial Measurement Unit (IMU).
For the server 200, it should be noted that the server 200 may be a single server 200 or a server group. The server group may be centralized or distributed (e.g., server 200 may be a distributed system). In some embodiments, the server 200 may be local or remote to the terminal. For example, the server 200 may access information and/or data stored in the vehicle-mounted device 100 via a network. As another example, the server 200 may be directly connected to the in-vehicle device 100 to access stored information and/or data. In some embodiments, the server 200 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a resilient cloud, a community cloud (community cloud), a distributed cloud, a cross-cloud (inter-cloud), a multi-cloud (multi-cloud), and the like, or any combination thereof. In some embodiments, the server 200 may be implemented on the in-vehicle device 100 having one or more of the components shown in FIG. 2 in the present application.
A database may be included in server 200 that may store data and/or instructions. In some embodiments, the database may store data obtained from the in-vehicle device 100. In some embodiments, the database may store data and/or instructions for the exemplary methods described herein. In some embodiments, the database may include mass storage, removable storage, volatile read-write memory, or read-only memory (ROM), among others, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid state drives, and the like; removable memory may include flash drives, floppy disks, optical disks, memory cards, zip disks, tapes, and the like; volatile read-write memory may include random access memory (RAM); the RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor-based random access memory (T-RAM), zero-capacitor RAM (Z-RAM), and the like. By way of example, ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM (DVD-ROM), and the like. In some embodiments, the database may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, cross-cloud, multi-cloud, elastic cloud, or the like, or any combination thereof.
In some embodiments, the database may be connected to a network to communicate with one or more components in the data processing system 10 (e.g., the server 200 and the in-vehicle device 100). One or more components in data processing system 10 may access data or instructions stored in a database via a network. In some embodiments, the database may be directly connected to one or more components in the data processing system 10 (e.g., the server 200 and the in-vehicle device 100). Alternatively, in some embodiments, the database may also be part of the server 200.
In some embodiments, one or more components in data processing system 10 (e.g., server 200 and in-vehicle device 100) may have access to a database. In some embodiments, one or more components in data processing system 10 may read and/or modify information related to data processing when certain conditions are met. For example, server 200 may read and/or modify information for one or more users after receiving a service request.
Fig. 3 shows one of flowcharts of a data processing method provided in an embodiment of the present application, which is applicable to the vehicle-mounted device 100 shown in fig. 1 and is executed by the vehicle-mounted device 100 in fig. 1. It should be understood that, in other embodiments, the order of some steps in the data processing method of this embodiment may be interchanged according to actual needs, or some steps may be omitted or deleted. The flow of the data processing method shown in fig. 3 is described in detail below.
Step S310, acceleration data and image data of the vehicle to be detected are acquired.
In detail, acceleration data can be acquired through an acceleration sensor, and image data in front of a vehicle to be detected is acquired through a camera.
And step S320, judging whether the vehicle to be detected collides or not according to the acceleration data.
In the embodiment of the application, when the vehicle to be detected does not collide, the vehicle to be detected is judged not to have a collision event; when the vehicle to be detected collides, it is determined that the vehicle to be detected has a collision event, and step S330 is performed.
And step S330, performing target recognition on the image data according to a preset target recognition model to obtain a collision target.
According to the method, whether a collision has occurred is determined from the acceleration data of the vehicle to be detected; when a collision occurs, target recognition is performed on the image data according to the target recognition model to obtain the collision target. Most collision events can thus be recognized, and the collision target is obtained while collision detection is performed, which addresses the limited comprehensiveness of collision detection in the prior art.
For step S320, it should be noted that the specific step of determining whether a collision occurs is not limited, and may be set according to the actual application requirement. For example, in an alternative example, when the acceleration data includes a vehicle acceleration and a gravitational acceleration, step S320 may include a step of calculating a horizontal acceleration and an integral value from the vehicle acceleration and the gravitational acceleration. The vehicle acceleration refers to acceleration data of the vehicle to be detected, and the gravity acceleration refers to gravity acceleration data of the local position of the vehicle to be detected. Therefore, on the basis of fig. 3, fig. 4 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 4, step S320 may include:
and S321, calculating to obtain the horizontal acceleration and the integral value of the vehicle to be detected according to the vehicle acceleration and the gravity acceleration.
Wherein the integral value is indicative of a change in velocity of the vehicle to be inspected in a horizontal direction.
In step S322, it is determined whether the horizontal acceleration is greater than or equal to a preset horizontal acceleration threshold and the integral value is greater than or equal to a preset integral value threshold.
In the embodiment of the application, when the horizontal acceleration is smaller than a preset horizontal acceleration threshold and/or the integral value is smaller than a preset integral value threshold, it is determined that the vehicle to be detected does not have a collision event; when the horizontal acceleration is greater than or equal to the preset horizontal acceleration threshold value and the integral value is greater than or equal to the preset integral value threshold value, step S323 is executed.
In step S323, it is determined that the vehicle to be detected has collided.
For step S321, it should be noted that the specific steps of calculating the horizontal acceleration and the integral value of the vehicle to be detected according to the vehicle acceleration and the gravity acceleration are not limited, and may be set according to the actual application requirements. For example, in an alternative example, step S321 may include the step of integral calculation. Therefore, on the basis of fig. 4, fig. 5 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 5, step S321 may include:
step S3211, performing vector outer product calculation on the vehicle acceleration and the gravity acceleration to obtain the horizontal acceleration of the vehicle to be detected.
In detail, the horizontal acceleration may be calculated by the following formulas:

F2 = F1 × G;
|F2| = |F1| · |G| · sin θ;

where the vector F2 represents the horizontal acceleration, the vector F1 represents the vehicle acceleration, the vector G represents the gravitational acceleration, and θ represents the angle between the vehicle acceleration and the gravitational acceleration.
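As a concrete illustration of the cross-product step above, the following Python sketch computes the horizontal acceleration from one vehicle-acceleration sample and a gravity estimate. This is an illustrative assumption, not code from the patent: the function name, the use of NumPy, and the normalization by |G| (which turns |F1||G|sin θ into the horizontal component |F1|sin θ in m/s²) are all assumptions.

    import numpy as np

    def horizontal_acceleration(f1, g):
        """Magnitude of the horizontal component of the vehicle acceleration.

        f1: vehicle acceleration vector (3-axis, m/s^2)
        g:  gravitational acceleration vector (3-axis, m/s^2)
        The cross product F2 = F1 x G has magnitude |F1|*|G|*sin(theta);
        dividing by |G| (an assumed normalization) leaves |F1|*sin(theta),
        the horizontal component in m/s^2.
        """
        f2 = np.cross(f1, g)  # F2 = F1 x G
        return np.linalg.norm(f2) / np.linalg.norm(g)

For example, with f1 = [13.0, 0.0, -9.8] and g = [0.0, 0.0, -9.8], the function returns 13.0, the purely horizontal part of the sample.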
And S3212, performing window integral calculation on the vehicle acceleration and the gravity acceleration to obtain an integral value of the vehicle to be detected.
In detail, the integral value represents the change in the horizontal speed of the vehicle to be detected, i.e., the integral of the horizontal acceleration, with an integration window of 50 samples over 0.5 s. The original formula is present only as an image in the source (BDA0002733717720000101); from the symbol definitions it can be reconstructed, approximately, as

ΔV = Σ (t = k .. k + 49) |F1(t) × G| / |G|

where ΔV represents the integral value, k represents the start sample, t indexes the samples within the window, the vector F1 represents the vehicle acceleration, and the vector G represents the gravitational acceleration.
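A minimal Python sketch of the window integration follows, under the assumption (consistent with the 12.5 × 50 × 0.8 = 500 threshold arithmetic below) that the integral value is the plain sum of per-sample horizontal accelerations over the 50-sample window; the function name and data layout are assumptions.

    import numpy as np

    WINDOW = 50  # samples per integration window (0.5 s of data, per the text)

    def window_integral(samples, g):
        """Sum of per-sample horizontal accelerations over the last WINDOW samples.

        samples: list of 3-axis vehicle-acceleration vectors
        g:       current gravitational-acceleration estimate
        """
        recent = samples[-WINDOW:]
        return sum(np.linalg.norm(np.cross(f1, g)) / np.linalg.norm(g)
                   for f1 in recent)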
For step S322, it should be noted that the specific value of the horizontal acceleration threshold is not limited, and may be set according to the actual application requirement. For example, in an alternative example, the horizontal acceleration threshold may be 12.5 m/s².
In detail, the integral value threshold may be the product of the horizontal acceleration threshold, the window length of 50 samples, and a preset ratio. Optionally, the specific value of the preset ratio is not limited, and may be set according to the actual application requirement. For example, in one alternative example, to obtain a robust integral value, the preset ratio may be 0.8, giving an integral value threshold of 12.5 × 50 × 0.8 = 500. In another alternative example, to obtain a sensitive integral value, the preset ratio may be 0.6, giving an integral value threshold of 12.5 × 50 × 0.6 = 375.
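Putting the two thresholds together, the collision decision of steps S321 to S323 can be sketched as follows. The threshold values come from the text; the function form is an assumption for illustration.

    ACC_THRESHOLD = 12.5                  # m/s^2, horizontal-acceleration threshold
    INTEGRAL_THRESHOLD = 12.5 * 50 * 0.8  # = 500, the robust setting

    def collision_detected(a_h, integral):
        """Steps S322/S323: both conditions must hold for a collision verdict."""
        return a_h >= ACC_THRESHOLD and integral >= INTEGRAL_THRESHOLD

With the sensitive setting, INTEGRAL_THRESHOLD would instead be 12.5 * 50 * 0.6 = 375.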
Before step S330, the data processing method provided in the embodiment of the present application may further include a step of building the target recognition model. In detail, edge computing can be performed with a lightweight deep learning network and an object detection framework, so that target recognition runs on the vehicle-mounted device 100 itself. To run machine learning on low-end devices, a lightweight deep learning network can be used, structured mainly as follows:

A ShuffleNetV2-0.5 network is adopted as the backbone feature-extraction network of the target recognition model. Only the front part of the network is used, mainly the Conv1, Max-Pool, Stage2, Stage3 and Stage4 convolutional layers; the ShuffleNetV2 layers after the Stage4 convolutional layer are not used. The target recognition model can adopt an SSD detection framework, taking the Stage2, Stage3 and Stage4 convolutional layers of the ShuffleNetV2-0.5 network as feature maps and using recognition boxes of preset sizes to detect targets at different scales. As shown in fig. 6, the target recognition model may include the Stage2, Stage3 and Stage4 convolutional layers for feature extraction; head1, head2, head3 and head4 represent the target detection and classification network branches, cls1 to cls4 represent classifiers, and reg1 to reg4 represent regression modules.
Further, the attention mechanism of Squeeze-and-Excitation networks can be combined with the Stage convolutional layer structure of ShuffleNetV2. In detail, within a convolutional layer, a corresponding weight is assigned to the result obtained from the feature network, and the weighted result is input to the next feature network for processing.
Targets can be divided into three scale classes by comparing the area a recognized target occupies in the image with the area of the preset recognition box: small-scale targets may include pedestrians; medium-scale targets may include bicycles, motorcycles and tricycles; and large-scale targets may include cars, trucks and buses. Based on the present scheme, seven target categories (pedestrian, bicycle, motorcycle, tricycle, car, truck and bus) can be selected, so that classification is set to 7 foreground classes plus background, 8 classes in total, for target recognition. After the target recognition model is built, it may be trained on training data covering the 8 classes, and target recognition may then be performed with the trained model.
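The detector topology described above can be sketched in PyTorch roughly as follows. This is a hedged illustration, not the patent's implementation: it borrows torchvision's ShuffleNetV2-0.5 backbone, assumes 4 anchors per feature-map cell, attaches one SSD-style classification head and one box-regression head to each of the three stages, and omits the SE attention mentioned above.

    import torch
    import torch.nn as nn
    from torchvision.models import shufflenet_v2_x0_5

    NUM_CLASSES = 8   # 7 foreground classes + background, per the text
    NUM_ANCHORS = 4   # anchors per feature-map cell (an assumption)

    class ShuffleNetSSD(nn.Module):
        """SSD-style heads on ShuffleNetV2-0.5 Stage2/3/4 feature maps."""

        def __init__(self):
            super().__init__()
            net = shufflenet_v2_x0_5()  # torchvision backbone, randomly initialized
            self.stem = nn.Sequential(net.conv1, net.maxpool)  # Conv1 + Max-Pool
            self.stages = nn.ModuleList([net.stage2, net.stage3, net.stage4])
            chans = [48, 96, 192]  # Stage2/3/4 output widths for the 0.5x model
            self.cls_heads = nn.ModuleList(
                nn.Conv2d(c, NUM_ANCHORS * NUM_CLASSES, 3, padding=1) for c in chans)
            self.reg_heads = nn.ModuleList(
                nn.Conv2d(c, NUM_ANCHORS * 4, 3, padding=1) for c in chans)

        def forward(self, x):
            x = self.stem(x)
            outputs = []
            for stage, cls, reg in zip(self.stages, self.cls_heads, self.reg_heads):
                x = stage(x)
                outputs.append((cls(x), reg(x)))  # per-scale class scores and boxes
            return outputs

A 224 × 224 input then yields three feature maps of decreasing resolution, one per stage, each with its own class-score and box-regression maps, matching the multi-scale detection idea of fig. 6.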
For step S330, it should be noted that the specific step of performing the target identification is not limited, and may be set according to the actual application requirement. For example, in an alternative example, step S330 may include the step of screening. Therefore, on the basis of fig. 3, fig. 7 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 7, step S330 may include:
step S331, recognizing the image data according to a preset target recognition model to obtain at least one recognition target.
And S332, calculating the collision grade of the vehicle to be detected according to the acceleration and the gravity acceleration of the vehicle.
In detail, the collision grade is calculated from the vehicle acceleration and the gravitational acceleration. The original formula is present only as an image in the source (BDA0002733717720000121) and cannot be reconstructed from the surrounding text; its inputs are the vector F1, the vehicle acceleration, and the vector G, the gravitational acceleration, and its output, level, represents the collision grade.
And S333, screening the at least one recognition target according to the collision grade of the vehicle to be detected to obtain the collision target.
It should be noted that the collision grade of accident objects such as pedestrians, non-motor vehicles and railings is usually relatively low, and the collision grade of relatively light vehicle-to-vehicle accidents, such as scraping, is also relatively low. To improve recognition accuracy, the recognized targets can be screened against the collision grade: for example, when the collision grade is less than 2, the collision target may include obstacles such as pedestrians, vehicles, non-motor vehicles and railings; when the collision grade is greater than 3, the collision target may include obstacles such as vehicles, non-motor vehicles and railings; and when the collision grade is greater than 5, the collision target may include obstacles such as vehicles and railings.
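A hedged sketch of this grade-based screening follows. The groupings follow the paragraph above; the label strings and the handling of the unspecified band 2 ≤ grade ≤ 3 are assumptions.

    def screen_targets(recognized, grade):
        """Keep only recognized labels that are plausible at this collision grade."""
        if grade > 5:
            allowed = {"vehicle", "railing"}
        elif grade > 3:
            allowed = {"vehicle", "non-motor vehicle", "railing"}
        elif grade < 2:
            allowed = {"pedestrian", "vehicle", "non-motor vehicle", "railing"}
        else:
            # 2 <= grade <= 3 is not specified in the text; the middle band is reused
            allowed = {"vehicle", "non-motor vehicle", "railing"}
        return [label for label in recognized if label in allowed]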
It should be noted that, after step S310, the data processing method provided in the embodiment of the present application may further include a step of performing drop detection. Therefore, on the basis of fig. 3, fig. 8 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 8, the data processing method may further include:
and step S340, acquiring angular speed data of the vehicle to be detected.
In detail, the angular velocity data may be acquired by a gyro sensor.
In step S350, drop detection is performed based on the vehicle acceleration and angular velocity data.
It should be noted that when the vehicle-mounted device 100 falls, or when the vehicle to be detected rolls over and becomes weightless, the angular velocity data is large while the vehicle acceleration is close to 0, so drop detection can be performed from the vehicle acceleration and the angular velocity data.
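A minimal sketch of this drop/weightlessness check follows; the two threshold constants are illustrative assumptions, since the text gives only the qualitative conditions (large angular velocity, acceleration close to 0).

    import numpy as np

    GYRO_LARGE = 5.0     # rad/s, assumed cutoff for "large" angular velocity
    ACC_NEAR_ZERO = 1.0  # m/s^2, assumed cutoff for "close to 0"

    def drop_detected(acc, gyro):
        """True when rotation is fast while total acceleration is near zero."""
        return (np.linalg.norm(gyro) > GYRO_LARGE
                and np.linalg.norm(acc) < ACC_NEAR_ZERO)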
Further, the data processing method provided in the embodiment of the application may also perform data anomaly detection. For example, when a sensor reading cannot be obtained, or when the gravitational acceleration reported by the acceleration sensor is too large, the vehicle-mounted device 100 may send an alarm signal to remind the driver.
To improve the accuracy of collision detection, the server 200 may also confirm a collision target of the vehicle to be detected. With reference to fig. 9, an embodiment of the present application further provides a data processing method, which may be applied to the server 200 shown in fig. 1, where the data processing method may include:
in step S910, the collision target and the real-time positioning data of the vehicle to be detected are obtained from the vehicle-mounted device 100.
The collision target is obtained by the vehicle-mounted device 100 performing target recognition on the image data of the vehicle to be detected according to a preset target recognition model.
And step S920, generating a motion track of the vehicle to be detected according to the real-time positioning data.
In detail, after the real-time positioning data of the vehicle to be detected is obtained through step S910, the motion trail may be generated according to the real-time positioning data.
And step S930, judging whether the vehicle to be detected collides with the collision target according to the motion trail.
In detail, the collision target of the vehicle to be detected is obtained through step S910, and after the motion track is obtained through step S920, whether a collision with the collision target has occurred may be determined according to the motion track.
According to the method, when collision occurs, the collision target is obtained by carrying out target recognition on the image data according to the target recognition model, recognition of most collision events is realized, the collision target is obtained while collision detection is carried out, and therefore the problem that the comprehensiveness of the collision detection is low in the prior art is solved. And, judging again whether a collision occurs or not according to the movement locus after the vehicle-mounted device 100 judges that a collision occurs improves the accuracy of collision detection.
For step S930, it should be noted that the specific step of determining whether a collision occurs is not limited, and may be set according to the actual application requirement. For example, in an alternative example, step S930 may include the step of determining whether a collision occurs according to the motion trajectory of the vehicle to be detected. Therefore, on the basis of fig. 9, fig. 10 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 10, step S930 may include:
in step S931, the collision rank of the vehicle to be detected is acquired from the vehicle-mounted device 100.
And step 932, obtaining the motion track of the vehicle to be detected for the preset duration according to the collision grade of the vehicle to be detected.
In step S933, it is determined whether the motion trajectory is smaller than a preset motion trajectory threshold.
In the embodiment of the application, when the motion track is not smaller than a preset motion track threshold value, it is determined that the vehicle to be detected does not collide with the collision target; when the motion track is smaller than the preset motion track threshold, step S934 is executed.
Step S934, it is determined that the vehicle to be detected collides with the collision target.
In detail, when the collision grade is less than 2, the motion track of the vehicle to be detected over 2 minutes can be obtained; when that track is shorter than a first motion track threshold, a collision is confirmed, a cloud backup of the accident scene is made, and the event enters the accident review system on the server 200. When the collision grade is greater than 3, the motion track over 1 minute can be obtained; when that track is shorter than a second motion track threshold, a collision is confirmed, a cloud backup of the accident scene is made, and the event enters the accident review system on the server 200. When the collision grade is greater than 5, a collision is confirmed and the event enters the accident review system directly.
Optionally, specific values of the first motion trajectory threshold and the second motion trajectory threshold are not limited, and may be set according to actual application requirements. For example, in one alternative example, the first motion profile threshold may be 50 meters and the second motion profile threshold may be 30 meters.
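The server-side confirmation logic of steps S931 to S934 can be sketched as follows. The durations and distance thresholds come from the text; treating the motion track as a travelled distance in meters, and the handling of the unspecified band 2 ≤ grade ≤ 3, are assumptions.

    def confirm_collision(grade, distance_m):
        """distance_m: distance travelled in the grade-dependent window (meters)."""
        if grade > 5:
            return True              # confirm directly, no track check
        if grade > 3:
            return distance_m < 30   # 1-minute track vs. second threshold
        if grade < 2:
            return distance_m < 50   # 2-minute track vs. first threshold
        return False                 # 2 <= grade <= 3 is not specified in the text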
It should be noted that, after step S934, when it is determined that the vehicle to be detected collides with the collision target according to the motion trajectory, the data processing method provided in the embodiment of the present application may further include a step of storing data. Therefore, on the basis of fig. 10, fig. 11 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 11, the data processing method may further include:
step S940 stores the vehicle to be detected and the collision target.
That is, the vehicle-mounted device 100 filters the data from the sensor module 180, feeds it into the gravity estimator for gravity estimation, calculates the horizontal acceleration and the integral value of the vehicle to be detected, and reports a collision event to the server 200 when both meet the conditions. After the collision event is reported, the accident review system on the server 200 performs trajectory analysis and, once the collision is confirmed, stores the relevant data of the vehicle to be detected (for example, the identification information of the vehicle and the type of the collision target).
It should be noted that, after step S940, when the user actively reports the collision event, the data processing method provided in the embodiment of the present application may further match the collision vehicle included in the collision event with the stored vehicle to be detected. Therefore, on the basis of fig. 11, fig. 12 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 12, the data processing method may further include:
step S950, the collision event reported by the user is obtained.
Wherein the crash event comprises a crashing vehicle.
And step S960, matching the collision vehicle with the stored vehicle to be detected, and judging whether the collision vehicle is the same vehicle.
It should be noted that collision events may be reported not only automatically by the vehicle-mounted device 100 but also by a user, such as a driver or an insurance company worker, to the accident review system. The vehicle to be detected in a collision event reported by the vehicle-mounted device 100 can be matched with the collision vehicle in a collision event reported by the user to determine whether they are the same vehicle; when they are, the two events can be merged and stored together to reduce storage space. For collision events entering the accident review system from platform driver hotline reports, insurance cases and the like, a data validity check can be performed, specifically determining whether the event is new data that has not been stored within the last 7 days. For collision events reported by the vehicle-mounted device 100, candidate events meeting the conditions can be screened; specifically, the collision event closest to the driver's reporting time can be selected.
For collision events reported by the vehicle-mounted device 100 and collision events reported by users, the accident review system can aggregate and match them using an aggregation matching algorithm, which mainly performs the following functions. First, information about the same accident arriving from different channels is aggregated together and stored in a structured form, for example: a collision event reported by the vehicle-mounted device 100, a collision event reported by a driver, a collision event from an insurance claim, and the like. Second, the aggregation binding relationship is maintained dynamically: when information from one source changes, the binding of the affected collision events is updated accordingly. For example, when the license plate included in a certain collision event is corrected, with plate A before the change and plate B after the change, the collision event should be unbound from the accident under plate A and bound to the accident under plate B.
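The aggregation binding behavior might be sketched as follows; keying accidents by license plate is a simplification of the real matching criteria, and all names here are assumptions:

```python
class AccidentAggregator:
    """Toy aggregation-matching store: events about the same accident coming
    from different channels (on-board device, driver report, insurance claim)
    are bound to one accident record, keyed here by license plate."""

    def __init__(self):
        self.accidents = {}  # plate -> list of bound events

    def bind(self, plate, event):
        self.accidents.setdefault(plate, []).append(event)

    def rebind(self, old_plate, new_plate, event):
        """Dynamic maintenance: when an event's plate is corrected from A to B,
        unbind it from accident A and bind it to accident B."""
        if event in self.accidents.get(old_plate, []):
            self.accidents[old_plate].remove(event)
        self.bind(new_plate, event)

agg = AccidentAggregator()
ev = {"source": "on-board device", "time": "2020-10-20 08:00"}
agg.bind("A", ev)
agg.rebind("A", "B", ev)  # plate corrected: the event now belongs to accident B
print(agg.accidents)      # {'A': [], 'B': [{...}]}
```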
It should be noted that, after the data validity check is performed on a collision event reported by a user, it is necessary to determine whether there is a corresponding collision event reported by the vehicle-mounted device 100. If there is, the collision event reported by the user is matched with the collision event reported by the vehicle-mounted device 100; if there is not, a mining engine is entered to mine the corresponding collision event for the vehicle-mounted device 100.
In detail, existing case information can be mined in combination with case clues to improve the recall rate of accidents. Some accidents are slight in real life and hard to catch with collision detection, for example: the vehicle rolls over a pedestrian's foot while driving, or a passenger opens the door and strikes a pedestrian or a rider. Using information such as driver reports and insurance claims, the mining engine can mine the accident scene in combination with the vehicle's driving behaviors (rapid acceleration, rapid deceleration, hard braking, sharp turning, sudden lane changing, stopping, vehicle position, and the like), determine the time and place of the accident, and at the same time extract the video from the vehicle-mounted device 100 and identify the objects at the accident scene.
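A rule-based sketch of such mining, with assumed event types, time window, and radius; the real engine and its thresholds are not specified in the disclosure:

```python
import math

def distance_m(p, q):
    """Rough planar distance in meters between (lat, lon) pairs; adequate for
    the few-hundred-meter radii used here."""
    dlat = (p[0] - q[0]) * 111_000.0
    dlon = (p[1] - q[1]) * 111_000.0 * math.cos(math.radians(p[0]))
    return math.hypot(dlat, dlon)

def mine_candidate_accidents(driving_events, clue_time, clue_location,
                             time_window_s=600, radius_m=200):
    """Around the time and place given by a case clue (driver report or
    insurance claim), collect abrupt driving behaviors that may mark a
    slight accident missed by collision detection."""
    suspicious = {"hard_brake", "rapid_decel", "sharp_turn", "stop"}
    return [ev for ev in driving_events
            if ev["type"] in suspicious
            and abs(ev["time"] - clue_time) <= time_window_s
            and distance_m(ev["location"], clue_location) <= radius_m]

events = [{"type": "hard_brake", "time": 1000.0, "location": (39.90, 116.40)}]
print(mine_candidate_accidents(events, clue_time=1200.0,
                               clue_location=(39.901, 116.401)))
```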
Further, the server 200 may also support accident review and accident tagging capabilities. After an accident occurs, the environmental information of the whole accident (time, road, weather), the accident scene (vehicle speed, vehicle trajectory, and the category of the third party involved), the report information, the driving behaviors, and the like can be displayed, and the accident can be labeled and classified.
With reference to fig. 13, an embodiment of the present application further provides a data processing apparatus 1300, where the functions implemented by the data processing apparatus 1300 correspond to the steps executed by the foregoing method. The data processing apparatus 1300 may be understood as a processor of the vehicle-mounted device 100, or as a component independent of the vehicle-mounted device 100 or its processor that implements the functions of the present application under the control of the vehicle-mounted device 100. The data processing apparatus 1300 may include a data obtaining module 1310, a judging module 1320, and a target recognition module 1330.
The data obtaining module 1310 is configured to obtain acceleration data and image data of the vehicle to be detected. In the embodiment of the present application, the data obtaining module 1310 may be configured to perform step S310 shown in fig. 3; for the relevant content of the data obtaining module 1310, reference may be made to the foregoing description of step S310.
The judging module 1320 is configured to judge whether the vehicle to be detected has collided according to the acceleration data. In the embodiment of the present application, the judging module 1320 may be configured to perform step S320 shown in fig. 3; for the relevant content of the judging module 1320, reference may be made to the foregoing description of step S320.
The target recognition module 1330 is configured to, when the vehicle to be detected has collided, perform target recognition on the image data according to a preset target recognition model to obtain a collision target. In the embodiment of the present application, the target recognition module 1330 may be configured to perform step S330 shown in fig. 3; for the relevant content of the target recognition module 1330, reference may be made to the foregoing description of step S330.
The acceleration data include a vehicle acceleration and a gravitational acceleration, and the judging module 1320 is specifically configured to: calculate the horizontal acceleration and the integral value of the vehicle to be detected according to the vehicle acceleration and the gravitational acceleration, where the integral value represents the speed change of the vehicle to be detected in the horizontal direction; judge whether the horizontal acceleration is greater than or equal to a preset horizontal acceleration threshold and whether the integral value is greater than or equal to a preset integral value threshold; and if both conditions hold, judge that the vehicle to be detected has collided.
The judging module 1320 is further specifically configured to: perform a vector outer product calculation on the vehicle acceleration and the gravitational acceleration to obtain the horizontal acceleration of the vehicle to be detected; and perform a window integral calculation on the vehicle acceleration and the gravitational acceleration to obtain the integral value of the vehicle to be detected.
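Reading the "vector outer product" as the 3-D cross product, the two calculations might be sketched as follows; the magnitude-based window integral is one possible interpretation, since the disclosure does not spell out the exact formulation:

```python
import numpy as np

def horizontal_acceleration(vehicle_accel: np.ndarray, gravity: np.ndarray) -> float:
    """|a x g| / |g| equals the magnitude of the component of the vehicle
    acceleration perpendicular to gravity, i.e. the horizontal acceleration."""
    return float(np.linalg.norm(np.cross(vehicle_accel, gravity))
                 / np.linalg.norm(gravity))

def window_integral(accel_window: np.ndarray, gravity: np.ndarray, dt: float) -> float:
    """Integrate the horizontal acceleration over a sliding window to
    approximate the horizontal speed change of the vehicle."""
    return sum(horizontal_acceleration(a, gravity) * dt for a in accel_window)

g = np.array([0.0, 0.0, 9.8])
a = np.array([5.0, 0.0, 9.8])         # 5 m/s^2 of purely horizontal push
print(horizontal_acceleration(a, g))  # -> 5.0
```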
The target recognition module 1330 is specifically configured to: recognize the image data according to a preset target recognition model to obtain at least one recognition target; calculate the collision grade of the vehicle to be detected according to the vehicle acceleration and the gravitational acceleration; and screen the at least one recognition target according to the collision grade of the vehicle to be detected to obtain the collision target.
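A toy version of this screening step; the grade-to-category mapping is purely illustrative and not taken from the disclosure:

```python
def screen_targets(recognized_targets, collision_grade):
    """Filter the recognition results by collision grade and keep the
    highest-confidence plausible target as the collision target."""
    plausible = {
        "high": {"car", "truck", "wall"},           # violent impact
        "low": {"pedestrian", "rider", "bicycle"},  # slight impact
    }.get(collision_grade, {"car"})
    candidates = [t for t in recognized_targets if t["label"] in plausible]
    return max(candidates, key=lambda t: t["score"]) if candidates else None

targets = [{"label": "pedestrian", "score": 0.91}, {"label": "car", "score": 0.85}]
print(screen_targets(targets, "low"))  # -> the pedestrian detection
```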
Referring to fig. 14, after the data obtaining module 1310, the data processing apparatus 1300 may further include: an angular velocity obtaining module 1340, configured to obtain angular velocity data of the vehicle to be detected; and a drop detection module 1350, configured to perform drop detection according to the acceleration data and the angular velocity data.
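Drop detection here can be read as detecting that the device itself is falling: near-zero total acceleration (free fall) together with rapid rotation. Both thresholds below are assumptions:

```python
import numpy as np

def is_device_drop(accel: np.ndarray, gyro: np.ndarray,
                   accel_thresh: float = 2.0, gyro_thresh: float = 3.0) -> bool:
    """Near-zero total acceleration plus fast rotation suggests the device
    fell off its mount rather than the vehicle colliding."""
    return (np.linalg.norm(accel) < accel_thresh      # m/s^2, assumed
            and np.linalg.norm(gyro) > gyro_thresh)   # rad/s, assumed

print(is_device_drop(np.array([0.1, 0.2, 0.3]), np.array([4.0, 1.0, 0.5])))  # True
```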
It should be noted that, for part or all of the functions of the data processing apparatus 1300 and the data processing system 10 provided in the embodiment of the present application, reference may be made to the foregoing detailed description of the data processing method, which is not described herein again.
In addition, an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the data processing method are performed.
The computer program product of the data processing method provided in the embodiment of the present application includes a computer-readable storage medium storing program code; the instructions included in the program code may be used to execute the steps of the data processing method in the above method embodiment. Reference may be made to the above method embodiment for details, which are not described here again.
In summary, the data processing method, apparatus, and system, the vehicle-mounted device, and the server provided by the embodiments of the present application judge whether a collision has occurred according to the acceleration data of the vehicle to be detected and, when a collision has occurred, perform target recognition on the image data according to the target recognition model to obtain the collision target. Most collision events can thus be recognized, and the collision target is obtained at the same time as collision detection is performed, thereby solving the problem of low comprehensiveness of collision detection in the prior art.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatuses, methods, and computer program products according to embodiments of the present application. In this regard, each block in the flowcharts or block diagrams may represent a module, segment, or portion of code that comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks therein, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A data processing method is characterized by being applied to vehicle-mounted equipment, and comprises the following steps:
acquiring acceleration data and image data of a vehicle to be detected;
judging whether the vehicle to be detected has collided or not according to the acceleration data;
and if so, carrying out target recognition on the image data according to a preset target recognition model to obtain a collision target.
2. The data processing method of claim 1, wherein the acceleration data includes vehicle acceleration and gravitational acceleration, and the step of determining whether the vehicle to be detected has a collision based on the acceleration data includes:
calculating to obtain a horizontal acceleration and an integral value of the vehicle to be detected according to the vehicle acceleration and the gravity acceleration, wherein the integral value represents the speed change of the vehicle to be detected in the horizontal direction;
judging whether the horizontal acceleration is greater than or equal to a preset horizontal acceleration threshold value or not and whether the integral value is greater than or equal to a preset integral value threshold value or not;
and if the horizontal acceleration is greater than or equal to a preset horizontal acceleration threshold value and the integral value is greater than or equal to a preset integral value threshold value, judging that the vehicle to be detected collides.
3. The data processing method according to claim 2, wherein the step of calculating the horizontal acceleration and the integral value of the vehicle to be detected according to the vehicle acceleration and the gravitational acceleration comprises:
carrying out vector outer product calculation on the vehicle acceleration and the gravity acceleration to obtain the horizontal acceleration of the vehicle to be detected;
and carrying out window integral calculation on the vehicle acceleration and the gravity acceleration to obtain an integral value of the vehicle to be detected.
4. The data processing method of claim 2, wherein the step of performing the target recognition on the image data according to a preset target recognition model to obtain the collision target comprises:
identifying the image data according to a preset target identification model to obtain at least one identification target;
calculating the collision grade of the vehicle to be detected according to the vehicle acceleration and the gravity acceleration;
and screening the at least one recognition target according to the collision grade of the vehicle to be detected to obtain the collision target.
5. The data processing method of claim 2, wherein the data processing method further comprises:
acquiring angular speed data of the vehicle to be detected;
and performing drop detection according to the vehicle acceleration and angular velocity data.
6. A data processing method is applied to a server, the server is in communication connection with vehicle-mounted equipment, and the data processing method comprises the following steps:
acquiring a collision target and real-time positioning data of a vehicle to be detected from the vehicle-mounted equipment, wherein the collision target is obtained by performing target recognition on image data of the vehicle to be detected by the vehicle-mounted equipment according to a preset target recognition model;
generating a motion track of the vehicle to be detected according to the real-time positioning data;
and judging whether the vehicle to be detected collides with the collision target or not according to the motion track.
7. The data processing method according to claim 6, wherein the step of determining whether the vehicle to be detected collides with the collision target according to the motion trajectory comprises:
acquiring the collision grade of the vehicle to be detected from the vehicle-mounted equipment;
acquiring a motion track of a preset duration of the vehicle to be detected according to the collision grade of the vehicle to be detected;
judging whether the motion track is smaller than a preset motion track threshold value or not;
and if so, judging that the vehicle to be detected collides with the collision target.
8. The data processing method of claim 7, wherein the data processing method further comprises:
and when the vehicle to be detected is judged to collide with the collision target, storing the vehicle to be detected and the collision target.
9. The data processing method of claim 8, wherein the data processing method further comprises:
acquiring a collision event reported by a user, wherein the collision event comprises a collision vehicle;
and matching the collision vehicle with the stored vehicle to be detected, and judging whether the collision vehicle is the same vehicle.
10. A data processing system is characterized by comprising a vehicle-mounted device and a server which are in communication connection;
the vehicle-mounted equipment is used for acquiring acceleration data and image data of a vehicle to be detected, judging whether the vehicle to be detected collides or not according to the acceleration data, and performing target recognition on the image data according to a preset target recognition model when the vehicle to be detected collides to obtain a collision target;
the server is used for obtaining a collision target and real-time positioning data of a vehicle to be detected from the vehicle-mounted equipment, generating a motion track of the vehicle to be detected according to the real-time positioning data, and judging whether the vehicle to be detected collides with the collision target or not according to the motion track.
CN202011126305.9A 2020-10-20 2020-10-20 Data processing method, device and system, vehicle-mounted equipment and server Pending CN112132995A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011126305.9A CN112132995A (en) 2020-10-20 2020-10-20 Data processing method, device and system, vehicle-mounted equipment and server


Publications (1)

Publication Number Publication Date
CN112132995A true CN112132995A (en) 2020-12-25

Family

ID=73852739

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011126305.9A Pending CN112132995A (en) 2020-10-20 2020-10-20 Data processing method, device and system, vehicle-mounted equipment and server

Country Status (1)

Country Link
CN (1) CN112132995A (en)



Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201225