CN111382774B - Data processing method and device - Google Patents


Info

Publication number
CN111382774B
CN111382774B (granted publication of application CN201811652064.4A)
Authority
CN
China
Prior art keywords
data
identification information
abstract
sensor
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811652064.4A
Other languages
Chinese (zh)
Other versions
CN111382774A (en)
Inventor
周铮
倪慧
万蕾
高永强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201811652064.4A priority Critical patent/CN111382774B/en
Priority to EP19907076.4A priority patent/EP3896915A4/en
Priority to PCT/CN2019/130403 priority patent/WO2020140890A1/en
Publication of CN111382774A publication Critical patent/CN111382774A/en
Application granted granted Critical
Publication of CN111382774B publication Critical patent/CN111382774B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/28Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/40Bus networks
    • H04L12/40169Flexible bus arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/251Fusion techniques of input or preprocessed data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/28Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/40Bus networks
    • H04L2012/40267Bus for use in transportation systems
    • H04L2012/40273Bus for use in transportation systems the transportation system being a vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Arrangements For Transmission Of Measured Signals (AREA)

Abstract

A data processing method and apparatus, applicable to the field of automated driving, are used to provide more inputs for data fusion and achieve a better data fusion effect. The method comprises the following steps: a first device acquires identification information of a second device; the first device receives a first message sent by the second device, where the first message contains original data and indicates the identification information. Through the identification information indicated in the first message, the first device can determine the data source of the original data, which provides more input for data fusion and achieves a better data fusion effect. The scheme provided by the embodiments can be applied to communication systems such as V2X, LTE-V, V2V, Internet of Vehicles, MTC, IoT, LTE-M, M2M, and the Internet of Things.

Description

Data processing method and device
Technical Field
The embodiment of the application relates to the technical field of communication, in particular to a data processing method and device.
Background
With the development of intelligent interconnection, automated driving, electric vehicles, and shared mobility, electronic systems built from software, computing power, and advanced sensors are becoming increasingly important. As the degree of driving automation increases, the complexity of these electronic systems keeps growing, and the volume of data generated by the vehicle itself becomes ever larger. A new on-board computing architecture is therefore needed so that existing high-performance processors can handle this massive amount of data.
A typical autopilot-oriented on-board computing architecture in the prior art is a centralized computing architecture, i.e., a central control unit containing a high-performance processor, one or more distributed sensors, multiple vehicle control units, and so on. The overall automated driving data processing framework comprises four parts: perception, fusion, planning and decision, and control. The sensor data processing flow is roughly as follows: a single sensor completes its own data conversion and sends the conversion result to a central processing unit for data fusion, which provides the most effective input for subsequent behavior arbitration and path planning decisions; the decision result is then output to an execution control unit that controls the vehicle. The processing of sensor data fusion comprises sensor abstraction processing and data fusion processing, i.e., perception and fusion processing. The data fusion function may be implemented in the central control unit, also called a mobile data center (MDC). The sensor data abstraction processing may be performed on a distributed sensor or at the central control unit.
In a simple autopilot scenario, such as adaptive cruise control, the relevant function may be accomplished using only one sensor: the sensor processes its own data and directly outputs a control command to control the vehicle. In advanced automated driving scenarios, however, the vehicle's control decisions depend on perception of the real environment, and it is particularly critical to build an environment model from the input data of the sensors. Depending on weather, road conditions, and the like, the automated driving vehicle integrates the data inputs of different types of sensors to build this environment model, so that a model close to the real environment can be reconstructed in the digital world.
With the continuous development of sensor technology, the types of sensors in vehicles are becoming ever richer. The various sensors communicate with the central control unit via a high-speed bus. Sensor data fusion, however, may take different data as input, such as data from different sensors, data from different instances of the same sensor, or different types of data from the same sensor. For large inputs of different data types, how to establish the correlation between sensor data and provide more effective input for data fusion becomes a problem to be solved.
Disclosure of Invention
The embodiment of the application provides a data processing method and device, which are used for solving the problem of how to realize sensor data fusion for different data inputs.
The specific technical scheme provided by the embodiment of the application is as follows:
In a first aspect, a data processing method is provided, where the method is performed by a first device and may be implemented through the following steps: the first device acquires identification information of a second device; the first device receives a first message sent by the second device, where the first message contains original data and indicates the identification information. Through the identification information indicated in the first message, the first device can determine the data source of the original data, which provides more input for data fusion and achieves a better data fusion effect.
In one possible design, the first device receives a second message sent by a third device, the second message containing first abstract data; and the second message indicates the identification information. Through the identification information indicated in the second message, the first device can obtain the data source of the first abstract data, can provide more input for data fusion, and achieves a better data fusion effect.
In one possible design, the second device and the third device are the same device. When the second device and the third device are the same device, the first device can also obtain the association relationship between the original data and the first abstract data according to the identification information indicated by the first message and the identification information indicated by the second message, so that more inputs are provided for data fusion, and a better data fusion effect is achieved.
In one possible design, the first device establishes an association between the original data and the first abstract data according to the identification information. By establishing the association relationship, the first device can better distinguish different data and better perform data fusion.
In one possible design, the first device determines second abstract data from the raw data.
In one possible design, the first device updates abstract data based on the original data; wherein the abstract data comprises at least one of the first abstract data and the second abstract data. Therefore, when the abstract data is inaccurate, the original data corresponding to the abstract data can be found, and the abstract data is updated according to the original data so as to obtain the accurate abstract data.
In one possible design, the first device establishes a data path of the original data according to the identification information. By binding the identification information and the data path of the original data, the first device is able to determine the source of the received data when it receives the data of the data path of the original data.
In one possible design, the first device establishes a data path for the first abstract data according to the identification information. By binding the identification information and the data path of the first abstract data, the first device is able to determine the source of the received data upon receiving the data of the data path of the first abstract data.
In one possible design, the identification information includes any one or more of the following: an inherent identifier (ID) of the second device, an application layer ID assigned by the first device to the second device, an Internet Protocol (IP) address, a media access control (MAC) address, a port number, or a timestamp.
In a second aspect, a data processing apparatus is provided and applied to a first device, where the apparatus includes a data fusion module and a receiving module, where the data fusion module is configured to obtain identification information of a second device; the receiving module is used for receiving a first message sent by the second equipment, wherein the first message contains original data; and the first message indicates the identification information. Through the identification information indicated in the first message, the first device can obtain the data source of the original data, can provide more input for data fusion, and achieves a better data fusion effect.
In one possible design, the receiving module is further configured to: receiving a second message sent by third equipment, wherein the second message contains first abstract data; and the second message indicates the identification information. Through the identification information indicated in the second message, the first device can obtain the data source of the first abstract data, can provide more input for data fusion, and achieves a better data fusion effect.
In one possible design, the second device and the third device are the same device. When the second device and the third device are the same device, the first device can also obtain the association relationship between the original data and the first abstract data according to the identification information indicated by the first message and the identification information indicated by the second message, so that more inputs are provided for data fusion, and a better data fusion effect is achieved.
In one possible design, the data fusion module is further configured to: and establishing an association relation between the original data and the first abstract data according to the identification information. By establishing the association relationship, the first device can better distinguish different data and better perform data fusion.
In one possible design, the apparatus further includes a data abstraction module to determine second abstract data from the raw data.
In one possible design, the data fusion module is further configured to: updating the abstract data according to the original data; wherein the abstract data comprises at least one of the first abstract data and the second abstract data. Therefore, when the abstract data is inaccurate, the original data corresponding to the abstract data can be found, and the abstract data is updated according to the original data so as to obtain the accurate abstract data.
In one possible design, the data fusion module is further configured to: and establishing a data path of the original data according to the identification information. By binding the identification information and the data path of the original data, the first device is able to determine the source of the received data when it receives the data of the data path of the original data.
In one possible design, the data fusion module is further configured to: and establishing a data path of the first abstract data according to the identification information. By binding the identification information and the data path of the first abstract data, the first device is able to determine the source of the received data upon receiving the data of the data path of the first abstract data.
In one possible design, the identification information includes any one or more of the following: an inherent identifier (ID) of the second device, an application layer ID assigned by the first device to the second device, an Internet Protocol (IP) address, a media access control (MAC) address, a port number, or a timestamp.
In one possible design, the device may be a chip or an integrated circuit. The receiving module is an input/output circuit or interface of the chip or the integrated circuit.
In one possible design, the device is an MDC.
In a third aspect, a chip is provided and is denoted as a first chip, where the first chip includes a data fusion module and a receiving module, where the data fusion module is configured to obtain identification information of a second device; the receiving module is used for receiving a first message sent by the second equipment, wherein the first message contains original data; and the first message indicates the identification information. Through the identification information indicated in the first message, the first device can obtain the data source of the original data, can provide more input for data fusion, and achieves a better data fusion effect.
In one possible design, the receiving module is further configured to: receiving a second message sent by third equipment, wherein the second message contains first abstract data; and the second message indicates the identification information. Through the identification information indicated in the second message, the first device can obtain the data source of the first abstract data, can provide more input for data fusion, and achieves a better data fusion effect.
In one possible design, the second device and the third device are the same device. When the second device and the third device are the same device, the first device can also obtain the association relationship between the original data and the first abstract data according to the identification information indicated by the first message and the identification information indicated by the second message, so that more inputs are provided for data fusion, and a better data fusion effect is achieved.
In one possible design, the data fusion module is further configured to: and establishing an association relation between the original data and the first abstract data according to the identification information. By establishing the association relationship, the first device can better distinguish different data and better perform data fusion.
In one possible design, the data fusion module is further configured to: second abstract data is received from the second chip.
In one possible design, the data fusion module is further configured to: updating the abstract data according to the original data; wherein the abstract data comprises at least one of the first abstract data and the second abstract data. Therefore, when the abstract data is inaccurate, the original data corresponding to the abstract data can be found, and the abstract data is updated according to the original data so as to obtain the accurate abstract data.
In one possible design, the data fusion module is further configured to: and establishing a data path of the original data according to the identification information. By binding the identification information and the data path of the original data, the first device is able to determine the source of the received data when it receives the data of the data path of the original data.
In one possible design, the data fusion module is further configured to: and establishing a data path of the first abstract data according to the identification information. By binding the identification information and the data path of the first abstract data, the first device is able to determine the source of the received data upon receiving the data of the data path of the first abstract data.
In one possible design, the identification information includes any one or more of the following: an inherent identifier (ID) of the second device, an application layer ID assigned by the first device to the second device, an Internet Protocol (IP) address, a media access control (MAC) address, a port number, or a timestamp.
In a fourth aspect, there is provided a data processing apparatus having the functionality to implement any one of the possible designs of the first aspect and the second aspect. The functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
Alternatively, the data processing device may be a chip or an integrated circuit.
In one possible design, when part or all of the functions are implemented by software, the data processing apparatus includes: a processor configured to execute a program, and a transceiver configured to communicate with other devices, for example, to receive a first message sent by a second device. The data processing apparatus is capable of implementing the method described in the first aspect or any one of the possible designs of the first aspect. Optionally, the apparatus further comprises a memory for storing the program executed by the processor.
Alternatively, the memory may be a physically separate unit or may be integrated with the processor.
In one possible design, the data processing device includes a processor when part or all of the functions are implemented in software. The memory for storing the program is located outside the data processing device and the processor is connected to the memory via a circuit/wire for reading and executing the program stored in the memory.
In a fifth aspect, a computer storage medium is provided, storing a computer program comprising instructions for performing the methods of the above aspects.
In a sixth aspect, there is provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the method of the above aspects.
In a seventh aspect, a communication system is provided, comprising an MDC configured to perform the method in the first aspect or any one of the possible designs of the first aspect, and one or more sensors.
Drawings
FIG. 1 is a schematic diagram of a computing architecture for automated driving data processing in an embodiment of the present application;
FIG. 2 is a schematic diagram of a system architecture according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a data processing model in an embodiment of the present application;
FIG. 4 is a flow chart of a data processing method according to an embodiment of the application;
FIG. 5 is a schematic diagram of a method for acquiring sensor identification information according to an embodiment of the present application;
FIG. 6 is a first flowchart of a data processing method in application scenario one according to an embodiment of the application;
FIG. 7 is a second flowchart of a data processing method in application scenario one according to an embodiment of the application;
FIG. 8 is a flowchart of a data processing method in application scenario two according to an embodiment of the application;
FIG. 9 is a flowchart of a data processing method in application scenario three according to an embodiment of the application;
FIG. 10 is a schematic diagram of a data processing apparatus according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a second embodiment of a data processing apparatus.
Detailed Description
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The data processing method provided by the embodiments of this application can be used in an intelligent transportation system and is applied on a terminal. In an intelligent transportation system, the terminal may be a vehicle, for example an intelligent vehicle. The method provided by this application can be applied to intelligent transportation services such as automated driving, electric vehicle driving, and shared mobility. An electronic system is arranged in the terminal; this electronic system realizes the intelligent transportation service through intelligent interconnection, software, artificial intelligence elements, advanced analysis tools, operating systems, and the like, while the user interaction interface and the user experience are improved through software and electronic technology. Of course, the method provided by this application is not limited to the above scenarios.
Taking automated driving data processing as an example, as shown in fig. 1, a computing architecture for automated driving data processing includes: a sensor group 101, an MDC 102, and an actuator group 103.
The sensor group 101 includes one or more sensors. Different sensors have different functions; for example, sensor functions may include global positioning system (GPS), sonar, camera, radar, and lidar. Existing in-vehicle sensors come in a wide variety of types and are broadly divided into three categories: internal sensors, external sensors, and metadata sensors.
Internal sensors are typically mounted on or inside the vehicle and focus on the vehicle itself; they are used to measure the dynamic state of the vehicle and other internal data. Typical internal sensors include gyroscopes, accelerometers, steering angle sensors, wiper activity sensors, steering indicators, and the like.
External sensors are typically mounted on or in the vehicle, focusing on the vehicle surroundings and measuring the vehicle environment. Typical external sensors include radar, lasers, ultrasonic sensors, and cameras.
Metadata sensors are typically used to obtain data sources beyond the vehicle's own measurements. Typical metadata sensors include cloud data, navigation maps, and vehicle-to-everything (V2X).
The MDC 102 is configured to process the data of the sensor group 101 and generate execution commands for controlling the actuator group 103. The MDC 102 may be configured with different management modules for performing different functions. For example, the MDC 102 includes a sensor data fusion module 1021, a contextual behavior arbitration module 1022, a motion management module 1023, a human-machine interface (HMI) management module 1024, and a security management module 1025.
The sensor data fusion module 1021 is used to compute the environment model as well as a model of the vehicle's state and capabilities. The overall environment model is built from the abstract data input of the sensors and is a key input for subsequent planning decisions. The closer the constructed environment model is to the real world, the more it helps the planning decision module make correct decisions. Construction of the environment model involves positioning, vehicle position, vehicle dynamics, objects in the environment, fleet tracks, grid fusion, occupancy grids, road and lane fusion, roads and lanes, vehicle databases, and the like.
The actuator group 103 includes one or more actuators. An actuator is used to obtain an execution command from the MDC and carry out the corresponding action, such as braking or steering.
In one possible implementation, the method provided by the embodiments of this application involves communication between the sensor group 101 and the MDC 102. Based on some of the modules in the architecture above, as shown in fig. 2, a system architecture to which the method provided by the embodiments of this application is applicable includes one or more sensors 201 and an MDC 202.
A sensor 201 is used to acquire raw data corresponding to its function; for example, a camera sensor acquires video or still-image data. Optionally, the sensor 201 may also have a data processing function, in which the collected raw data is processed, for example by feature extraction or object extraction, to obtain abstract data. The sensor sends the raw data and/or the abstract data to the MDC. The sensor 201 may use an electronic control unit (ECU) located on the sensor to perform the above operations.
The MDC 202 is the central high-performance computing platform on the terminal. It is used to obtain the raw data and/or abstract data of the sensors 201, fuse the different data, and build the environment model, reconstructing the real environment from the raw data and the abstract data. The MDC 202 may also process the raw data, for example by feature extraction or object extraction, to obtain abstract data.
Optionally, the system architecture may also include a control ECU 203. The control ECU 203 refers to an electronic control unit responsible for controlling the vehicle body and power system.
The architecture described above is one environment in which the data processing method provided by this application may be applied; it is given to aid understanding of the method and does not limit the method to this environment.
More generally, the execution body of the data processing method provided by the embodiment of the present application may be described by a first device, and the first device may communicate with other devices such as a second device and a third device. For example, the first device may be the MDC above and the second and third devices may be the sensors above. The following describes in detail a data processing method provided by an embodiment of the present application with reference to the accompanying drawings.
As shown in fig. 3, the raw data (also referred to as original data) and the abstract data are understood in this application as follows. After the sensor captures a physical signal, an original electrical signal is obtained. The raw data is obtained after signal processing of this electrical signal, for example analog-to-digital conversion. The raw data is then processed, for example by feature extraction or object extraction, to obtain the abstract data.
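To make this distinction concrete, the following sketch (illustrative names only, not part of the patent) models the chain from electrical signal to raw data to abstract data, with a trivial feature summary standing in for real feature or object extraction:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RawData:
    """Digitized samples obtained after signal processing (e.g. A/D conversion)."""
    sensor_id: str
    timestamp: float
    samples: List[float]

@dataclass
class AbstractData:
    """Features or objects extracted from raw data."""
    sensor_id: str
    timestamp: float
    features: List[float]

def signal_processing(sensor_id: str, timestamp: float, electrical_signal: List[float]) -> RawData:
    # Stand-in for analog-to-digital conversion: here the signal is already numeric.
    return RawData(sensor_id, timestamp, samples=list(electrical_signal))

def data_abstraction(raw: RawData) -> AbstractData:
    # Stand-in for feature or object extraction: a trivial summary of the samples.
    mean = sum(raw.samples) / len(raw.samples) if raw.samples else 0.0
    peak = max(raw.samples, default=0.0)
    return AbstractData(raw.sensor_id, raw.timestamp, features=[mean, peak])
```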
As shown in fig. 4, the flow of the data processing method provided by the embodiment of the present application is as follows.
S401, the first device acquires identification information of the second device.
The identification information is any information that can identify the second device. The identification information includes any one or more of the following: an inherent Identification (ID) of a second device, an application layer ID assigned by the first device for the second device, an internet protocol (internet protocol, IP) address, a Media Access Control (MAC) address, a port number, or a timestamp.
The identification information is described in detail below taking the second device being a sensor as an example; the identification information of any other type of second device can be interpreted in the same way.
1) The identification information of the sensor may be a transport layer or a data link layer address. The following is an example.
For example, the ID inherent to the sensor device. The sensor's factory device ID may include a sensor type in a bits, manufacturer information in b bits, product model information in c bits, and a product serial number in d bits (a packing sketch of such an ID is given after this list).
For another example, an IP address, including IPv4/IPv6 transport layer addresses.
For another example, a MAC address.
For another example, a random number generated from a key.
For another example, a temporary ID that the sensor assigns to itself.
For another example, a sensor ID assigned to the sensor by the system through a configuration profile, for example a sensor type plus a serial number, which may be generated based on the sensor mounting location, registration order, capability, or the like.
2) The identification information of the sensor may be an application layer ID, specifically the application layer ID allocated to the sensor by the central processing unit during the sensor registration process and used to identify the sensor during communication. Here the central processing unit is the MDC. The method of acquiring this second type of sensor identification information is described in detail below with reference to fig. 5.
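As an aside, the factory device ID layout described under 1) can be illustrated with a small packing/unpacking sketch. The bit widths a, b, c, and d are not fixed by the text, so the example assumes 8/8/8/16 bits purely for illustration:

```python
# Assumed example bit widths: 8-bit sensor type, 8-bit manufacturer,
# 8-bit product model, 16-bit serial number (a = b = c = 8, d = 16).
TYPE_BITS, MFR_BITS, MODEL_BITS, SERIAL_BITS = 8, 8, 8, 16

def pack_factory_id(sensor_type: int, manufacturer: int, model: int, serial: int) -> int:
    """Pack the four fields into a single integer device ID."""
    device_id = sensor_type
    device_id = (device_id << MFR_BITS) | manufacturer
    device_id = (device_id << MODEL_BITS) | model
    device_id = (device_id << SERIAL_BITS) | serial
    return device_id

def unpack_factory_id(device_id: int):
    """Recover (sensor_type, manufacturer, model, serial) from a packed ID."""
    serial = device_id & ((1 << SERIAL_BITS) - 1)
    device_id >>= SERIAL_BITS
    model = device_id & ((1 << MODEL_BITS) - 1)
    device_id >>= MODEL_BITS
    manufacturer = device_id & ((1 << MFR_BITS) - 1)
    device_id >>= MFR_BITS
    return device_id, manufacturer, model, serial

# Example: a camera (type 0x01) from manufacturer 0x2A, model 0x03, serial 1234.
fid = pack_factory_id(0x01, 0x2A, 0x03, 1234)
assert unpack_factory_id(fid) == (0x01, 0x2A, 0x03, 1234)
```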
S402, the first device receives a first message sent by the second device.
The first message includes original data, and the first message indicates the identification information.
The manner in which the first message indicates the identification information may include, but is not limited to, two types.
First, the first message carries identification information.
Second, the identification information is indicated through a channel through which the first message is transmitted.
Based on the second manner, in this embodiment of the application, before the first device receives the first message sent by the second device, a data path for the original data is established according to the identification information of the second device. The data path for the original data is used to transmit the original data. In the process of establishing this data path, a first path identifier is configured for it, where the first path identifier is determined according to the identification information of the second device, or the first path identifier is the identification information of the second device; for example, the first path identifier is the ID of the second device. When the first device receives the first message through the data path for the original data, it can determine from the first path identifier that the original data carried in the first message comes from the second device. Taking the second device being a sensor as an example, the first device may receive original data sent by different types of sensors, or original data sent by different instances of the same sensor. If each data path corresponds to one first path identifier, the first device can distinguish the data source of the original data according to the first path identifier.
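A minimal sketch of how this second indication manner could be realized, assuming the first device keeps a table that maps each configured path identifier to the identification information bound to it (all names here are illustrative):

```python
class PathRegistry:
    """Maps a path identifier to the identification information of the sending device."""

    def __init__(self):
        self._paths = {}  # path identifier -> device identification information

    def bind(self, path_id: str, device_id: str) -> None:
        # Called when the data path is established or configured.
        self._paths[path_id] = device_id

    def resolve(self, path_id: str) -> str:
        # Called when a message arrives on a data path: returns the data source.
        return self._paths[path_id]

registry = PathRegistry()
registry.bind(path_id="raw-path-7", device_id="camera-front-left")

# A first message carrying original (raw) data arrives on "raw-path-7"; its source is
# recovered from the path identifier rather than from a field inside the message itself.
assert registry.resolve("raw-path-7") == "camera-front-left"
```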
In this application, the second device (or another device such as the third device) may determine abstract data and send it to the first device; the first device receives this abstract data from the second device (or the other device) and records it as first abstract data. Abstract data may also be determined by the first device itself and is then recorded as second abstract data. It is also possible that the first device both receives the first abstract data from the second device (or another device such as the third device) and determines the second abstract data from the original data, using the first and second abstract data in combination or alternatively. If the second device (or another device such as the third device) determines the abstract data and sends it to the first device, the order in which the first device receives the abstract data and the original data from the second device is not limited: the original data may be received first, or the abstract data may be received first. On this basis, in this embodiment of the application, the first device acquiring the identification information of the second device and the first device receiving a second message sent by the third device, where the second message contains the first abstract data and indicates the identification information, can independently form a claimed scheme of this application, with the first device receiving the first message sent by the second device, where the first message contains the original data and indicates the identification information, as a subordinate scheme.
Based on the scheme in which the second device (or another device such as the third device) determines the first abstract data, in this embodiment of the application the first device may also receive a second message sent by another device (which may be denoted as the third device), where the second message contains the first abstract data and indicates the identification information. The identification information indicated by the second message may be the same as the identification information indicated by the first message, for example both are the ID of the second device. According to the identification information indicated by the first message and the identification information indicated by the second message, the first device determines that the original data in the first message and the first abstract data in the second message come from the same device, i.e., the second device and the third device are the same device. It may also determine that the original data in the first message and the first abstract data in the second message are related, and thereby establish an association between them.
Similarly, in the scheme in which the second device (or another device such as the third device) determines the first abstract data, a data path for the first abstract data is established according to the identification information of the second device before the first device receives the second message. The data path for the first abstract data is used to transmit the first abstract data. When this data path is established, a second path identifier is configured for it, where the second path identifier is determined according to the identification information of the second device, or is the identification information of the second device; for example, the second path identifier is the ID of the second device. When the first device receives the second message through the data path for the first abstract data, it can determine from the second path identifier that the first abstract data carried in the second message comes from the second device. Taking the second device being a sensor as an example, the first device may receive first abstract data sent by different types of sensors or by different instances of the same sensor; if each abstract-data path corresponds to one second path identifier, the first device can distinguish the data source of the first abstract data according to the second path identifier. Here, the identification information indicated by the second message may also be the second path identifier, and the identification information indicated by the first message may also be the first path identifier. In this way, the first device can determine, from the first and second path identifiers, that the original data in the first message and the first abstract data in the second message come from the same device and are related.
If the first device determines the abstract data (denoted as second abstract data), the first device obtains the original data from the second device, processes the original data, and obtains the second abstract data.
Both the first abstract data and the second abstract data may be referred to collectively as abstract data. The first device may update the abstract data based on the original data. For example, according to the association between the first abstract data and the original data, the first device finds the original data associated with the first abstract data and updates or refines the first abstract data, making the abstract data more accurate.
When the first device both obtains the first abstract data from the second device and determines the second abstract data from the original data, it holds two pieces of abstract data. The first device may then verify the second abstract data against the first abstract data, or verify the first abstract data against the second abstract data.
In summary, through the identification information, the first device can obtain the association between at least two of the original data, the first abstract data, and the second abstract data, and the source of at least one of them, thereby providing more input information for data fusion and achieving a better data fusion effect.
When the identification information is a timestamp, the first device may also obtain data of different types of sensors at the same moment, data of different sensor instances at the same moment, or original data and abstract data at the same moment. According to the timestamp, it obtains the association between at least two of the original data, the first abstract data, and the second abstract data at the same moment. For example, when two sensors are the left and right cameras of the terminal, the data of the two cameras at the same moment can be associated, providing more input information for data fusion and achieving a better data fusion effect.
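A sketch of timestamp-based association, assuming records from different sources are grouped when their timestamps fall into the same short window (the window size is an illustrative choice, not specified by the patent):

```python
from collections import defaultdict

def associate_by_timestamp(records, window=0.005):
    """Group (source_id, timestamp, payload) records whose timestamps fall into the
    same window, e.g. frames from a left and a right camera taken at the same moment.
    The 5 ms window is an illustrative assumption."""
    buckets = defaultdict(list)
    for source_id, timestamp, payload in records:
        key = round(timestamp / window)  # quantize time into windows
        buckets[key].append((source_id, payload))
    return list(buckets.values())

records = [
    ("camera-left", 10.0010, "frame-L"),
    ("camera-right", 10.0012, "frame-R"),
    ("camera-left", 10.0510, "frame-L2"),
]
groups = associate_by_timestamp(records)
# The left and right frames captured at ~10.001 s end up in the same group.
```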
The method for acquiring the second type of sensor identification information is described below with reference to fig. 5.
S501, the sensor sends a sensor registration request to the MDC.
The sensor registration request carries one or more of the examples of the first type of sensor identification information above. The registration request may also include information such as the mounting location and the sensor's capabilities.
S502, the MDC may allocate a unique sensor identifier in a system, i.e. an application layer ID, to the sensor according to the registration request of the sensor.
The application layer ID may include a sensor type plus a serial number, which may be generated based on the sensor mounting location, registration order, capability, or the like. If the sensor sent a temporary ID assigned by itself in S501, the MDC may also generate a corresponding temporary ID, and the sensor and the MDC subsequently identify each other by this pair of temporary IDs.
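A minimal sketch of the registration exchange in S501/S502, assuming the application layer ID is formed from the sensor type plus a per-type registration counter; the class and field names are illustrative:

```python
import itertools
from collections import defaultdict

class Mdc:
    """Assigns a system-unique application layer ID during sensor registration."""

    def __init__(self):
        self._counters = defaultdict(itertools.count)  # per-type registration order
        self._registry = {}                            # app-layer ID -> registration info

    def register(self, factory_id: str, sensor_type: str, mount_location: str) -> str:
        serial = next(self._counters[sensor_type])
        app_layer_id = f"{sensor_type}-{serial:03d}"   # e.g. "camera-000"
        self._registry[app_layer_id] = {
            "factory_id": factory_id,
            "mount_location": mount_location,
        }
        return app_layer_id  # returned to the sensor in the confirmation response

mdc = Mdc()
print(mdc.register("0x012A03-1234", "camera", "front-left"))   # camera-000
print(mdc.register("0x012A03-1235", "camera", "front-right"))  # camera-001
```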
Based on the description of the above scheme, the data processing method according to the embodiments of this application is further described in detail below in conjunction with specific application scenarios. Taking the first device being an MDC and the second device being a sensor as an example, one or more sensors may each transmit data by establishing a data path. There are three application scenarios, described in turn below.
For an increasing number of sensor types, sensor data fusion involves fusion of different data. For example, data from different sensors, or data from different instances of the same sensor, or different data from the same sensor. Therefore, different data are required to be associated, so that more information input is provided for sensor data fusion, and the sensor data fusion is better realized.
Application scenario one: the sensor processes the original data to obtain abstract data and sends the abstract data to the MDC.
As shown in fig. 6, a specific procedure of the data processing method in the application scenario is as follows.
S601a, the sensor sends a registration request to the MDC, and the MDC receives the registration request sent by the sensor.
The description of the registration request may refer to the description of the registration request in S501, and in addition, the registration request may further include information such as the type of the sensor, the capability of the sensor, or the installation location of the sensor.
Optionally, before S601a, the sensor is powered up for initialization.
S601b, MDC sends a confirmation response to the sensor.
The confirmation response is used for responding to the registration request and notifying a registration result.
Optionally, if the sensor identification information is acquired according to the second method above, the MDC may also perform an operation as in S502. That is, the MDC may assign a system-unique sensor identifier to the sensor according to the registration request acquired in S601a. For the specific method of allocating identification information to the sensor, refer to S502; details are not repeated here.
S602, the MDC establishes or configures a data path of the raw data between the sensor and the MDC.
This step may be understood as including two steps S601a and S601b, or S601a and S601b may be understood as steps independent of S602, corresponding to the preparation steps before S602. Optionally, S602 may further include other steps, which are not specifically described in the present application.
The MDC establishes or configures a data path of the original data according to the identification information of the sensor included in the registration request.
The data path of the original data may be a specific connection of the data plane, a specific memory block, or a proprietary data path. And establishing or configuring a data path of the original data, namely binding the identification information of the sensor with the data path of the original data. Thus, when the MDC receives the original data through the data path of the original data, the source of the original data can be determined according to the identification information bound with the data path of the original data.
S603, the MDC establishes or configures a data path for abstract data between the sensor and the MDC.
Similarly, the step S603 may be understood as including two steps S601a and S601b, and may be understood as steps S601a and S601b being independent of S603, corresponding to the preparation steps before S603. Optionally, S603 may further include other steps, which are not specifically described in the present application.
The MDC establishes or configures a data path of the abstract data according to the identification information of the sensor included in the registration request.
The data path for abstracting data may be a specific connection of the data plane, a specific memory block, or a proprietary data path. And establishing or configuring a data path of the abstract data, namely binding the identification information of the sensor with the data path of the abstract data. Thus, when the MDC receives the abstract data through the data path of the abstract data, the source of the abstract data can be determined according to the identification information bound with the data path of the abstract data.
So far, the data path of the original data and the data path of the abstract data between the sensor and the MDC are already established, and the method can further comprise steps S604 to S606 in the first application scenario.
S604, the sensor sends the original data to the MDC, and the MDC receives the original data sent by the sensor.
Specifically, the sensor sends the original data to the MDC through the data path of the original data, that is, the MDC receives the original data sent by the sensor through the data path of the original data. The MDC may receive the data of multiple sensors, and thus, the MDC determines the source of the received raw data as the sensor according to the identification information of the data path of the raw data.
S605, the sensor sends abstract data to the MDC, and the MDC receives the abstract data sent by the sensor.
Specifically, the sensor sends the abstract data to the MDC through the data path of the abstract data, that is, the MDC receives the abstract data sent by the sensor through the data path of the abstract data. The MDC may receive the data of multiple sensors, and thus, the MDC determines the source of the received abstract data as the sensor according to the identification information of the data path of the abstract data.
S606, the MDC establishes an association relation between the received original data and the abstract data according to the identification information of the data path of the original data and the identification information of the data path of the abstract data.
The original data and the abstract data may each carry timestamp information, or the identification information may itself be a timestamp. The MDC may also associate the original data with the abstract data based on the timestamp information. Data from sensors of different types can be associated according to the timestamp, as can data from different instances of the same type of sensor. Here, the type refers to the function or model of the sensor.
Using this association, the MDC can update or refine the associated abstract data according to the original data.
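A sketch of how such an update might look on the fusion side, assuming original and abstract data are stored per identification information so that abstract data can be re-derived from its associated original data (illustrative names and logic):

```python
class FusionStore:
    """Keeps original and abstract data per source so that associated abstract data
    can be updated from its original data when it proves inaccurate."""

    def __init__(self):
        self._raw = {}       # identification information -> latest original data
        self._abstract = {}  # identification information -> latest abstract data

    def put_raw(self, ident, raw):
        self._raw[ident] = raw

    def put_abstract(self, ident, abstract):
        self._abstract[ident] = abstract

    def refresh_abstract(self, ident, extractor):
        """Re-derive the abstract data from the associated original data."""
        raw = self._raw.get(ident)
        if raw is not None:
            self._abstract[ident] = extractor(raw)
        return self._abstract.get(ident)

store = FusionStore()
store.put_raw("camera-000", [0.2, 0.9, 0.4])
store.put_abstract("camera-000", {"peak": 0.7})  # possibly inaccurate
store.refresh_abstract("camera-000", lambda raw: {"peak": max(raw)})  # -> {"peak": 0.9}
```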
In one possible implementation, the sensor includes two functional modules, a signal processing module and a data processing module. The signal processing module is used for outputting original data, and the data processing module is used for outputting abstract data.
Based on this, as shown in fig. 7, the specific procedure of the data processing method in this application scenario is refined as follows. The method of fig. 7 differs from the method described in fig. 6 only in that the sensor is divided into these two modules, so that the functions of the sensor in fig. 6 are implemented by two functional modules; for example, the descriptions of the various messages such as the registration request and the acknowledgement response, and the detailed description of each step, are the same as in the method shown in fig. 6 and are not repeated here.
S701a, the signal processing module of the sensor sends a registration request to the MDC, and the MDC receives the registration request sent by the signal processing module of the sensor.
S701b, the MDC sends a confirmation response to the signal processing module of the sensor.
S702, the MDC establishes or configures a data path of the raw data between the signal processing module of the sensor.
S703a, the data processing module of the sensor sends a registration request to the MDC, and the MDC receives the registration request sent by the data processing module of the sensor.
S703b, the MDC sends a confirmation response to the data processing module of the sensor.
S704, the MDC establishes or configures a data path of abstract data with the data processing module of the sensor.
Optionally, S705 to S707 may also be performed.
S705, the signal processing module of the sensor sends the original data to the MDC, and the MDC receives the original data sent by the signal processing module of the sensor.
S706, the data processing module of the sensor sends abstract data to the MDC, and the MDC receives the abstract data sent by the data processing module of the sensor.
S707, the MDC establishes an association relation between the received original data and the abstract data according to the identification information of the data path of the original data and the identification information of the data path of the abstract data.
Application scenario two: the sensor sends only the original data to the MDC, and the MDC processes the original data to obtain abstract data.
The overall process of the data processing method in application scenario two is the same as that of the application scenario shown in fig. 6. When the functions of the sensor are implemented by two functional modules, namely a signal processing module and a data processing module, and the functions of the MDC are implemented by two modules, namely a data abstraction module and a data fusion module, the specific method in application scenario two is as shown in fig. 8. Similarly, the method shown in fig. 8 differs only in that the actions of the sensor and the MDC are carried out by different functional modules; for example, the descriptions of the various messages such as the registration request and the acknowledgement response, and the detailed description of each step, are the same as in the method shown in fig. 6 and are not repeated here.
S801a, a signal processing module of the sensor sends a registration request to a data fusion module of the MDC, and the data fusion module of the MDC receives the registration request sent by the signal processing module of the sensor.
S801b, a data fusion module of the MDC sends a confirmation response to a signal processing module of the sensor.
S802, a data fusion module of the MDC establishes or configures a data path of original data between the data fusion module and a signal processing module of the sensor.
In this way, the signal processing module of the sensor can transmit the raw data to the data fusion module of the MDC through this data path.
S803a, a signal processing module of the sensor sends a registration request to a data abstraction module of the MDC, and the data abstraction module of the MDC receives the registration request sent by the signal processing module of the sensor.
S803b, the data abstraction module of the MDC sends a confirmation response to the signal processing module of the sensor.
S804, the data abstraction module of the MDC establishes or configures a data path for the raw data with the signal processing module of the sensor.
In this way, the signal processing module of the sensor can transmit the raw data to the data abstraction module of the MDC through this data path.
The data abstraction module of the MDC may process the raw data to obtain abstract data.
S805a, a data abstraction module of the MDC sends a registration request to a data fusion module of the MDC, and the data fusion module of the MDC receives the registration request sent by the data abstraction module of the MDC.
S805b, the data fusion module of the MDC sends a confirmation response to the data abstraction module of the MDC.
S806, the data fusion module of the MDC establishes or configures a data path for abstract data with the data abstraction module of the MDC.
In this way, the data abstraction module of the MDC can transmit the abstract data it obtains by processing to the data fusion module of the MDC through this data path.
Optionally, S807 to S809 may also be performed.
S807, a signal processing module of the sensor sends original data to a data fusion module of the MDC, and the data fusion module of the MDC receives the original data sent by the signal processing module of the sensor.
S808, the signal processing module of the sensor sends the original data to the data abstraction module of the MDC, and the data abstraction module of the MDC receives the original data sent by the signal processing module of the sensor.
The data abstraction module of the MDC processes the received original data to obtain abstract data.
S809, the data abstraction module of the MDC sends the obtained abstract data to the data fusion module of the MDC, and the data fusion module of the MDC receives the abstract data from the data abstraction module of the MDC.
S810, a data fusion module of the MDC establishes an association relation between the original data and the abstract data according to the identification information.
Application scenario three: combining application scenario one and application scenario two, the sensor sends both the original data and the abstract data to the MDC, and the MDC also processes the original data to obtain abstract data.
The overall process of the data processing method in application scenario three is the same as that of the application scenario shown in fig. 6. When the functions of the sensor are implemented by two functional modules, namely a signal processing module and a data processing module, and the functions of the MDC are implemented by two modules, namely a data abstraction module and a data fusion module, the specific method in application scenario three is as shown in fig. 9. Similarly, the method shown in fig. 9 differs only in that the actions of the sensor and the MDC are carried out by different functional modules; for example, the descriptions of the various messages such as the registration request and the acknowledgement response, and the detailed description of each step, are the same as in the method shown in fig. 6 and are not repeated here.
S901a, a signal processing module of a sensor sends a registration request to a data fusion module of an MDC, and the data fusion module of the MDC receives the registration request sent by the signal processing module of the sensor.
And S901b, the data fusion module of the MDC sends a confirmation response to the signal processing module of the sensor.
S902, the data fusion module of the MDC establishes or configures a data path for the raw data with the signal processing module of the sensor.
In this way, the signal processing module of the sensor can transmit the raw data to the data fusion module of the MDC through this data path.
S903a, a signal processing module of the sensor sends a registration request to a data abstraction module of the MDC, and the data abstraction module of the MDC receives the registration request sent by the signal processing module of the sensor.
S903b, the data abstraction module of the MDC sends a confirmation response to the signal processing module of the sensor.
S904, the data abstraction module of the MDC establishes or configures a data path of the raw data with the signal processing module of the sensor.
In this way, the signal processing module of the sensor may transmit the raw data to the data abstraction module of the MDC via the data path.
The data abstraction module of the MDC may process the raw data to obtain abstract data.
S905a, a data abstraction module of the MDC sends a registration request to a data fusion module of the MDC, and the data fusion module of the MDC receives the registration request sent by the data abstraction module of the MDC.
S905b, the data fusion module of the MDC sends a confirmation response to the data abstraction module of the MDC.
S906, the data fusion module of the MDC establishes or configures a data path of abstract data with the data abstraction module of the MDC.
In this way, the data abstraction module of the MDC may transmit the abstract data obtained by processing to the data fusion module of the MDC via the data path.
S907a, the data processing module of the sensor sends a registration request to the data fusion module of the MDC, and the data fusion module of the MDC receives the registration request sent by the data processing module of the sensor.
S907b, the data fusion module of the MDC sends a confirmation response to the data processing module of the sensor.
S908, the data fusion module of the MDC establishes or configures a data path of abstract data with the data processing module of the sensor.
Optionally, S909 to S912 may also be performed.
S909, the data processing module of the sensor sends abstract data to the data fusion module of the MDC, and the data fusion module of the MDC receives the abstract data sent by the data processing module of the sensor. This abstract data is denoted as first abstract data.
S910, a signal processing module of the sensor sends original data to a data fusion module of the MDC, and the data fusion module of the MDC receives the original data sent by the signal processing module of the sensor.
S911, a signal processing module of the sensor sends original data to a data abstraction module of the MDC, and the data abstraction module of the MDC receives the original data sent by the signal processing module of the sensor.
The data abstraction module of the MDC processes the received original data to obtain abstract data, which is denoted as second abstract data.
S912, the data abstraction module of the MDC sends the obtained second abstract data to the data fusion module of the MDC, and the data fusion module of the MDC receives the second abstract data from the data abstraction module of the MDC.
S913, the data fusion module of the MDC establishes an association relation among at least two of the original data, the first abstract data, and the second abstract data according to the identification information.
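A compact sketch of S912 and S913 follows. The abstract_from_raw helper and the record layout are hypothetical stand-ins for the data abstraction processing, which this application does not specify.

```python
def abstract_from_raw(raw_frame):
    # Stand-in for the data abstraction module of the MDC: derive object-level
    # (abstract) data from a raw frame. The actual processing is sensor-specific
    # and is not specified here.
    return [{"object": "A", "confidence": 0.9}]

# Hypothetical association table keyed by identification information.
records = {
    ("camera-02", 200): {
        "raw": "raw-frame-200",                                   # S910/S911
        "first_abstract": [{"object": "A", "confidence": 0.8}],   # S909, from the sensor
    },
}

# S912/S913: derive the second abstract data from the original data and associate
# the original data, first abstract data, and second abstract data under one key.
for key, entry in records.items():
    entry["second_abstract"] = abstract_from_raw(entry["raw"])
```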
The order in which the data paths are established in application scenarios one to three is not strictly limited; orders other than the one described in the above embodiments may also be used.
Based on the same concept as the above-described method embodiment, as shown in fig. 10, an embodiment of the present application further provides a data processing apparatus 1000, where the data processing apparatus 1000 is configured to perform an operation performed by the first device in the above-described data processing method. The data processing apparatus 1000 comprises a data fusion module 1001 and a receiving module 1002. Wherein:
A data fusion module 1001, configured to obtain identification information of a second device;
A receiving module 1002, configured to receive a first message sent by the second device, where the first message includes original data and the first message indicates the identification information.
Optionally, the receiving module 1002 is further configured to: receive a second message sent by a third device, wherein the second message contains first abstract data and the second message indicates the identification information.
Optionally, the second device and the third device are the same device.
Optionally, the data fusion module 1001 is further configured to: establish an association relation between the original data and the first abstract data according to the identification information.
Optionally, the data processing apparatus 1000 further comprises a data abstraction module 1003.
The data abstraction module 1003 is configured to determine second abstract data according to the original data.
Optionally, the data fusion module 1001 is further configured to: update the abstract data according to the original data, wherein the abstract data comprises at least one of the first abstract data and the second abstract data.
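One way to picture updating the abstract data according to the original data is sketched below. The update_abstract function and its confidence-scaling rule are purely illustrative assumptions and are not the update criterion defined by this application.

```python
def update_abstract(abstract_objects, raw_frame):
    # Toy update rule for illustration only: raise the confidence of an abstract
    # object when the associated raw data contains supporting evidence, and lower
    # it otherwise.
    updated = []
    for obj in abstract_objects:
        supported = obj["object"] in raw_frame
        factor = 1.1 if supported else 0.9
        updated.append({**obj, "confidence": min(1.0, obj["confidence"] * factor)})
    return updated

first_abstract = [{"object": "A", "confidence": 0.8}]
updated = update_abstract(first_abstract, raw_frame="raw frame containing A")
```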
Optionally, the data fusion module 1001 is further configured to: establish a data path of the original data according to the identification information.
Optionally, the data fusion module 1001 is further configured to: establish a data path of the first abstract data according to the identification information.
Optionally, the identification information includes any one or more of: the inherent identification ID of the second device, the application layer ID assigned by the first device to the second device, an internet protocol IP address, a medium access control layer MAC address, a port number or a timestamp.
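For illustration, the identification information listed above could be carried in a structure such as the following sketch. The IdentificationInfo type and its field names are assumptions made for this example and are not part of this application.

```python
from dataclasses import dataclass
from typing import Optional

# Every field is optional because the identification information may include
# any one or more of the items listed above.
@dataclass
class IdentificationInfo:
    intrinsic_id: Optional[str] = None   # inherent ID of the second device
    app_layer_id: Optional[int] = None   # ID assigned by the first device
    ip_address: Optional[str] = None
    mac_address: Optional[str] = None
    port: Optional[int] = None
    timestamp: Optional[int] = None

ident = IdentificationInfo(intrinsic_id="lidar-01", ip_address="192.168.1.10", timestamp=100)
```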
Optionally, the receiving module 1002 is further configured to receive the identification information sent by the second device; or
The data fusion module 1001 is further configured to receive a registration message sent by the second device, and allocate the identification information to the second device according to the registration message. Of course, another module, such as a configuration module, may also be used to allocate the identification information to the second device based on the registration message. The division of the modules is only illustrative; in practical applications, the modules may be divided in other manners or by other functions.
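A minimal sketch of allocating identification information upon registration is shown below. The ConfigurationModule class and its behavior are assumptions made for illustration, not a definitive implementation of this application.

```python
class ConfigurationModule:
    """Illustrative allocator: assigns an application layer ID when a device registers."""

    def __init__(self):
        self._next_id = 1
        self._assigned = {}

    def handle_registration(self, intrinsic_id: str) -> int:
        # Allocate identification information (here an application layer ID) to the
        # second device according to its registration message, reusing any ID that
        # was already assigned to the same device.
        if intrinsic_id not in self._assigned:
            self._assigned[intrinsic_id] = self._next_id
            self._next_id += 1
        return self._assigned[intrinsic_id]

allocator = ConfigurationModule()
app_layer_id = allocator.handle_registration("lidar-01")   # returns 1 on first registration
```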
The data processing apparatus 1000 may be a chip or an integrated circuit.
Alternatively, the data processing apparatus 1000 may be an MDC.
Based on the same concept as the above-described method embodiments, as shown in fig. 11, an embodiment of the present application further provides a data processing apparatus 1100, where the data processing apparatus 1100 is configured to perform the operations performed by the first device in the above-described data processing method. The data processing apparatus 1100 comprises a transceiver 1101, a processor 1102, and a memory 1103. The memory 1103 is optional and is used to store programs executed by the processor 1102. When the data processing apparatus 1100 is used to implement the operations performed by the first device in the above method embodiments, the processor 1102 invokes the programs, and when the programs are executed, the processor 1102 performs the operations performed by the first device in the above method embodiments. The receiving module 1002 in fig. 10 may be implemented by the transceiver 1101, and the data fusion module 1001 and the data abstraction module 1003 may be implemented by the processor 1102.
The processor 1102 may be a central processing unit (central processing unit, CPU), a network processor (network processor, NP) or a combination of CPU and NP, among others.
The processor 1102 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a generic array logic (GAL), or any combination thereof.
The memory 1103 may include a volatile memory, such as a random-access memory (RAM); the memory 1103 may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 1103 may also include a combination of the above types of memory.
In the data processing method provided in the above embodiment of the present application, some or all of the operations and functions performed by the described first device may be implemented by a chip or an integrated circuit.
The embodiment of the present application further provides a chip, which includes a processor, configured to support the data processing apparatus 1000 and the data processing apparatus 1100 to implement the functions related to the first device in the method provided in the foregoing embodiment. In one possible design, the chip is connected to a memory or the chip includes a memory for holding the necessary program instructions and data for the device.
In one possible implementation, the chip may perform the operations performed by the data fusion module 1001 and the data abstraction module 1003 in the data processing apparatus 1000.
In another possible implementation, the chip performs the operations performed by the data fusion module 1001 in the data processing apparatus 1000, while the operations of the data abstraction module 1003 are performed by another chip. In this case, the other chip is used to obtain the second abstract data, and the operations performed by the chip, corresponding to the data fusion module 1001 in the data processing apparatus 1000, include receiving the second abstract data from the other chip.
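The two-chip split described above can be pictured with a minimal sketch. Both classes below and their method names are assumptions used only to show that the second abstract data crosses the chip boundary; they do not reflect an actual chip interface defined by this application.

```python
class AbstractionChip:
    """Stand-in for the other chip that runs the data abstraction module."""

    def produce_second_abstract(self, raw_frame):
        return [{"object": "A", "confidence": 0.9}]   # illustrative result only

class FusionChip:
    """Stand-in for the chip that runs the data fusion module."""

    def receive_second_abstract(self, other_chip, raw_frame):
        # The fusion chip obtains the second abstract data from the other chip
        # instead of computing it locally.
        return other_chip.produce_second_abstract(raw_frame)

fusion_chip = FusionChip()
second_abstract = fusion_chip.receive_second_abstract(AbstractionChip(), "raw-frame-300")
```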
An embodiment of the present application provides a computer storage medium storing a computer program, where the computer program includes instructions for performing the data processing method provided in the above embodiments.
An embodiment of the present application provides a computer program product containing instructions which, when run on a computer, cause the computer to perform the data processing method provided by the above embodiment.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made to the embodiments of the present application without departing from the spirit or scope of the embodiments of the application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims and the equivalents thereof, the present application is also intended to include such modifications and variations.

Claims (16)

1. A method of data processing, comprising:
a first device acquires identification information of a second device;
the first device receives a first message sent by the second device, wherein the first message contains original data and the first message indicates the identification information;
the method further comprises:
the first device receives a second message sent by the second device, wherein the second message comprises first abstract data obtained by processing the original data and the second message indicates the identification information;
the first device establishes an association relation between the original data and the first abstract data according to the identification information;
and the first device performs data fusion according to the association relation.
2. The method of claim 1, wherein the method further comprises:
the first device determines second abstract data according to the original data.
3. The method of claim 2, wherein the method further comprises:
the first device updates abstract data according to the original data; wherein the abstract data comprises at least one of the first abstract data and the second abstract data.
4. A method according to any one of claims 1 to 3, further comprising:
the first device establishes a data path of the original data according to the identification information.
5. A method according to any one of claims 1 to 3, further comprising:
the first device establishes a data path of the first abstract data according to the identification information.
6. A method as claimed in any one of claims 1 to 3, wherein the identification information comprises any one or more of: an inherent identification ID of the second device, an application layer ID assigned by the first device to the second device, an internet protocol IP address, a medium access control layer MAC address, a port number, or a timestamp.
7. A method according to any one of claims 1 to 3, further comprising:
the first device receives the identification information sent by the second device; or
the first device receives a registration message sent by the second device, and the first device allocates the identification information to the second device according to the registration message.
8. A data processing apparatus, comprising:
a data fusion module, configured to acquire identification information of a second device;
a receiving module, configured to receive a first message sent by the second device, wherein the first message contains original data and the first message indicates the identification information;
the receiving module is further configured to: receive a second message sent by the second device, wherein the second message contains first abstract data obtained by processing the original data and the second message indicates the identification information;
the data fusion module is further configured to: establish an association relation between the original data and the first abstract data according to the identification information, and perform data fusion according to the association relation.
9. The apparatus of claim 8, wherein the apparatus further comprises a data abstraction module,
The data abstraction module is used for determining second abstract data according to the original data.
10. The apparatus of claim 9, wherein the data fusion module is further configured to:
update the abstract data according to the original data, wherein the abstract data comprises at least one of the first abstract data and the second abstract data.
11. The apparatus of any one of claims 8 to 10, wherein the data fusion module is further configured to:
establish a data path of the original data according to the identification information.
12. The apparatus of any one of claims 8 to 10, wherein the data fusion module is further configured to:
establish a data path of the first abstract data according to the identification information.
13. The apparatus of any one of claims 8-10, wherein the identification information includes any one or more of: the inherent identification ID of the second device, the application layer ID assigned by the first device to the second device, an internet protocol IP address, a medium access control layer MAC address, a port number or a timestamp.
14. The apparatus of any one of claims 8-10, wherein the receiving module is further configured to:
receive the identification information sent by the second device; or
receive a registration message sent by the second device, wherein the data fusion module is further configured to allocate the identification information to the second device according to the registration message.
15. The apparatus according to any one of claims 8 to 10, wherein the apparatus is a chip or an integrated circuit.
16. A computer readable storage medium having computer readable instructions stored therein, which when read and executed by a computer, cause the computer to perform the method of any of claims 1-7.
CN201811652064.4A 2018-12-31 2018-12-31 Data processing method and device Active CN111382774B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201811652064.4A CN111382774B (en) 2018-12-31 2018-12-31 Data processing method and device
EP19907076.4A EP3896915A4 (en) 2018-12-31 2019-12-31 Data processing method and apparatus
PCT/CN2019/130403 WO2020140890A1 (en) 2018-12-31 2019-12-31 Data processing method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811652064.4A CN111382774B (en) 2018-12-31 2018-12-31 Data processing method and device

Publications (2)

Publication Number Publication Date
CN111382774A CN111382774A (en) 2020-07-07
CN111382774B true CN111382774B (en) 2024-06-04

Family

ID=71214957

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811652064.4A Active CN111382774B (en) 2018-12-31 2018-12-31 Data processing method and device

Country Status (3)

Country Link
EP (1) EP3896915A4 (en)
CN (1) CN111382774B (en)
WO (1) WO2020140890A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114760330B (en) * 2020-12-28 2024-04-12 华为技术有限公司 Data transmission method, device, storage medium and system for Internet of vehicles

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001081124A1 (en) * 2000-04-25 2001-11-01 Siemens Automotive Corporation Method and system for communicating between sensors and a supplemental restraint system controller
US7911497B2 (en) * 2003-04-25 2011-03-22 Lockheed Martin Corporation Method and apparatus for video on demand
US7646336B2 (en) * 2006-03-24 2010-01-12 Containertrac, Inc. Automated asset positioning for location and inventory tracking using multiple positioning techniques
US20160086391A1 (en) * 2012-03-14 2016-03-24 Autoconnect Holdings Llc Fleetwide vehicle telematics systems and methods
CN103684950B (en) * 2013-12-17 2016-08-03 唐山轨道客车有限责任公司 Multisensor multiplex bus system and method
US9499185B2 (en) * 2013-12-20 2016-11-22 Thales Canada Inc Wayside guideway vehicle detection and switch deadlocking system with a multimodal guideway vehicle sensor
US9903946B2 (en) * 2016-05-26 2018-02-27 RFNAV, Inc. Low cost apparatus and method for multi-modal sensor fusion with single look ghost-free 3D target association from geographically diverse sensors

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101166129A (en) * 2006-10-20 2008-04-23 华为技术有限公司 Method, terminal, device and system for obtaining application server identifier information
EP2849341A1 (en) * 2013-09-16 2015-03-18 STMicroelectronics International N.V. Loudness control at audio rendering of an audio signal
CN105319482A (en) * 2015-09-29 2016-02-10 科大智能科技股份有限公司 Power distribution network fault diagnosis system and method based on multi-source information fusion
CN106056163A (en) * 2016-06-08 2016-10-26 重庆邮电大学 Multi-sensor information fusion object identification method
CN107391571A (en) * 2017-06-16 2017-11-24 深圳市盛路物联通讯技术有限公司 The processing method and processing device of sensing data
CN107578066A (en) * 2017-09-07 2018-01-12 南京莱斯信息技术股份有限公司 Civil defence comprehensive situation based on Multi-source Information Fusion shows system and method
CN108663677A (en) * 2018-03-29 2018-10-16 上海智瞳通科技有限公司 A kind of method that multisensor depth integration improves target detection capabilities

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on the application of multi-sensor information fusion technology in machining process monitoring; Wu Hongqi; Modular Machine Tool & Automatic Manufacturing Technique (Issue 09) *

Also Published As

Publication number Publication date
WO2020140890A1 (en) 2020-07-09
CN111382774A (en) 2020-07-07
EP3896915A4 (en) 2022-01-19
EP3896915A1 (en) 2021-10-20

Similar Documents

Publication Publication Date Title
JP7031642B2 (en) Managing computational tasks in the vehicle context
RU2677970C2 (en) Remote data collection system
US11210023B2 (en) Technologies for data management in vehicle-based computing platforms
CN109873652B (en) Vehicle-mounted relay device, external device, information processing device, method, and system
US20210354708A1 (en) Online perception performance evaluation for autonomous and semi-autonomous vehicles
EP3404639A1 (en) Vehicle operation
US20200213820A1 (en) Message routing system and method thereof
CN114564209A (en) Intelligent automobile data processing method, device and equipment and storage medium
CN110717436A (en) Data analysis method and device, electronic equipment and computer storage medium
CN111382774B (en) Data processing method and device
CN113129382B (en) Method and device for determining coordinate conversion parameters
US10120715B2 (en) Distributed network management system and method for a vehicle
CN115186732A (en) Intelligent driving target fusion method, device and equipment and readable storage medium
CN116990776A (en) Laser radar point cloud compensation method and device, electronic equipment and storage medium
JP7150938B2 (en) Vehicle, method, computer program and apparatus for merging object information about one or more objects in vehicle surroundings
US10834553B2 (en) Vehicle communication system
US11308003B2 (en) Communication device and method for communication between two control devices in a vehicle
US20230093668A1 (en) Object Location Information Provisioning for Autonomous Vehicle Maneuvering
US20210377580A1 (en) Live or local environmental awareness
Ahmed et al. A Joint Perception Scheme For Connected Vehicles
US20210092091A1 (en) Communication device for vehicle, communication system for vehicle, and communication method
CN116601938B (en) Method and apparatus for reassigning addresses to network devices
US20230379188A1 (en) Vehicle data protection
CN118139018A (en) Sensor data transmission method and device for automatic driving, storage medium and terminal
JP2022532906A (en) Methods and devices for processing sensor data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant