CN111382296B - Data processing method, device, terminal and storage medium - Google Patents

Data processing method, device, terminal and storage medium

Info

Publication number
CN111382296B
CN111382296B CN201811629084.XA
Authority
CN
China
Prior art keywords
target
image data
image
determining
micro
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811629084.XA
Other languages
Chinese (zh)
Other versions
CN111382296A (en)
Inventor
刘希
邓裕琳
尹鹏
王成
邓志伟
麦继升
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Intellifusion Technologies Co Ltd
Original Assignee
Shenzhen Intellifusion Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Intellifusion Technologies Co Ltd filed Critical Shenzhen Intellifusion Technologies Co Ltd
Priority to CN201811629084.XA priority Critical patent/CN111382296B/en
Publication of CN111382296A publication Critical patent/CN111382296A/en
Application granted granted Critical
Publication of CN111382296B publication Critical patent/CN111382296B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G06V40/172 Classification, e.g. identification
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction
    • G06V40/197 Matching; Classification
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ophthalmology & Optometry (AREA)
  • Image Analysis (AREA)
  • Storing Facsimile Image Data (AREA)

Abstract

The embodiment of the application provides a data processing method, a device, a terminal and a storage medium. The method includes: acquiring attribute information of target image data; performing blocking processing on the target image data by using the attribute information to obtain N image data blocks, where N is a positive integer; determining the micro-service corresponding to each of the N image data blocks to obtain M target micro-services, where each target micro-service corresponds to at least one image data block and M is a positive integer less than or equal to N; and sending the N image data blocks to the corresponding target micro-services among the M target micro-services, so that the efficiency of data processing can be improved.

Description

Data processing method, device, terminal and storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a data processing method, a data processing device, a terminal, and a storage medium.
Background
With the continuous development of the Internet, the volume of data keeps growing, and the era of big data has gradually come into view. In many application scenarios, a large amount of data needs to be analyzed to derive certain data patterns. In existing schemes, a large amount of data is usually analyzed and processed directly, and this direct processing easily leads to low efficiency in data processing.
Disclosure of Invention
The embodiment of the application provides a data processing method, a data processing device, a terminal and a storage medium, which can improve the efficiency of data processing.
A first aspect of an embodiment of the present application provides a data processing method, including:
acquiring attribute information of target image data;
the attribute information is adopted to carry out blocking processing on the target image data to obtain N image data blocks, wherein N is a positive integer;
determining micro services corresponding to each image data block in the N image data blocks to obtain M target micro services, wherein the target micro services correspond to at least one image data block, and M is a positive integer less than or equal to N;
and transmitting the N image data blocks to corresponding target micro-services in the M target micro-services.
With reference to the first aspect of the embodiments of the present application, in a first possible implementation manner of the first aspect, the target image data includes a plurality of first target images, the attribute information includes a shooting location, and the performing, by using the attribute information, a blocking process on the target image data to obtain N image data blocks includes:
performing face recognition on the plurality of first target images to obtain the number of users in each first target image in the plurality of first target images;
dividing the plurality of first target images into A first image types according to the number of users in each first target image, wherein A is a positive integer;
determining second image types of each first target image in the A first image types according to the shooting place of each first target image in the A first image types, wherein the number of the second image types is N;
the target image data is divided into N image data blocks according to the second image type.
With reference to the first possible implementation manner of the first aspect of the embodiments of the present application, in a second possible implementation manner of the first aspect, the determining a micro service corresponding to each image data block of the N image data blocks, to obtain M target micro services includes:
acquiring a second image type of the first target image in each of the N image data blocks;
determining a reference authority level corresponding to each image data block according to the second image type;
acquiring an image memory value of each image data block;
determining an authority level correction factor of each image data block according to the image memory value of each image data block;
determining a target authority level corresponding to each image data block according to the reference authority level and the authority level correction factor;
and determining the micro-service corresponding to the target authority level according to the mapping relation between the preset authority level and the micro-service to obtain M target micro-services.
A second aspect of the embodiments of the present application provides a data processing apparatus, which includes an acquisition unit, a blocking unit, a determination unit, and a transmission unit, wherein,
the acquisition unit is used for acquiring attribute information of the target image data;
the blocking unit is used for blocking the target image data by adopting the attribute information to obtain N image data blocks, wherein N is a positive integer;
the determining unit is configured to determine a microservice corresponding to each of the N image data blocks, to obtain M target microservices, where the target microservices correspond to at least one image data block, and M is a positive integer less than or equal to N;
the sending unit is configured to send the N image data blocks to corresponding target micro-services in the M target micro-services.
With reference to the second aspect of the embodiments of the present application, in a first possible implementation manner of the second aspect, the target image data includes a plurality of first target images, the attribute information includes a shooting location, and in performing the blocking processing on the target image data by using the attribute information to obtain N image data blocks, the blocking unit is specifically configured to:
performing face recognition on the plurality of first target images to obtain the number of users in each first target image in the plurality of first target images;
dividing the plurality of first target images into A first image types according to the number of users in each first target image, wherein A is a positive integer;
determining second image types of each first target image in the A first image types according to the shooting place of each first target image in the A first image types, wherein the number of the second image types is N;
the target image data is divided into N image data blocks according to the second image type.
With reference to the first possible implementation manner of the second aspect of the embodiments of the present application, in a second possible implementation manner of the second aspect, in the determining a micro service corresponding to each image data block of the N image data blocks, obtaining M target micro services, the determining unit is specifically configured to:
acquiring a second image type of the first target image in each of the N image data blocks;
determining a reference authority level corresponding to each image data block according to the second image type;
acquiring an image memory value of each image data block;
determining an authority level correction factor of each image data block according to the image memory value of each image data block;
determining a target authority level corresponding to each image data block according to the reference authority level and the authority level correction factor;
and determining the micro-service corresponding to the target authority level according to the mapping relation between the preset authority level and the micro-service to obtain M target micro-services.
A third aspect of the embodiments of the present application provides a terminal, comprising a processor, an input device, an output device and a memory, the processor, the input device, the output device and the memory being interconnected, wherein the memory is configured to store a computer program, the computer program comprising program instructions, the processor being configured to invoke the program instructions to execute the step instructions as in the first aspect of the embodiments of the present application.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform part or all of the steps as described in the first aspect of the embodiments of the present application.
A fifth aspect of the embodiments of the present application provides a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
The implementation of the embodiment of the application has at least the following beneficial effects:
According to the embodiments of the present application, attribute information of target image data is acquired, and the target image data is divided into N image data blocks by using the attribute information, where N is a positive integer. The micro-service corresponding to each of the N image data blocks is determined to obtain M target micro-services, where each target micro-service corresponds to at least one image data block and M is a positive integer less than or equal to N. The N image data blocks are then sent to the corresponding target micro-services among the M target micro-services. Compared with the prior-art approach of directly processing the target image data, the target image data can be divided into a plurality of image data blocks according to its attribute information and the blocks can be sent to the corresponding target micro-services for processing, thereby improving the efficiency of data processing.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a data processing system according to an embodiment of the present application;
fig. 2A is a schematic flow chart of an image processing method according to an embodiment of the present application;
FIG. 2B is a schematic view of a vertical centerline of a face image according to an embodiment of the present application;
FIG. 3 is a flowchart of another data processing method according to an embodiment of the present application;
FIG. 4 is a flowchart of another data processing method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms first, second and the like in the description and in the claims of the present application and in the above-described figures, are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly understand that the embodiments described herein may be combined with other embodiments.
The electronic apparatus according to the embodiments of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices, or other processing devices connected to a wireless modem, and various forms of User Equipment (UE), Mobile Stations (MSs), terminal devices, and so on. For convenience of description, the above-mentioned apparatuses are collectively referred to as electronic devices.
A micro-service may be understood as an application or system that can independently provide a service.
For a better understanding of the data processing method provided in the embodiments of the present application, a brief description of the data processing system to which it applies is given below. Referring to FIG. 1, FIG. 1 is a schematic diagram of a data processing system according to an embodiment of the present application. As shown in FIG. 1, the data processing system 101 includes a data processing device 1011. The data processing system 101 receives target image data, which may include a plurality of first target images or a plurality of second target images. The data processing device 1011 then acquires attribute information of the target image data; the attribute information may be a shooting location and a camera identifier of the camera that shot the target image data. The data processing device 1011 uses the attribute information to perform blocking processing on the target image data to obtain N image data blocks, and determines the micro-service corresponding to each of the N image data blocks to obtain M target micro-services, where each target micro-service corresponds to at least one image data block, M is a positive integer less than or equal to N, and N is a positive integer. The data processing system 101 sends the N image data blocks to the corresponding target micro-services. Compared with the prior-art approach of directly processing the target image data, the system can divide the target image data into a plurality of image data blocks according to its attribute information and send them to the corresponding target micro-services for processing, thereby improving the efficiency of processing the target image data.
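The end-to-end flow just described (acquire attribute information, divide into blocks, map blocks to micro-services, dispatch) can be sketched as follows. All function, field, and service names here are illustrative assumptions for exposition, not from the patent itself:

```python
from dataclasses import dataclass

@dataclass
class ImageRecord:
    shooting_location: tuple  # (x, y) coordinates of the shooting site (assumed form)
    camera_id: str            # camera identifier of the shooting camera

def block_by_attribute(images, key):
    """Group images into N data blocks by one attribute (e.g. camera identifier)."""
    blocks = {}
    for img in images:
        blocks.setdefault(key(img), []).append(img)
    return list(blocks.values())

def dispatch(blocks, choose_service):
    """Assign each of the N blocks to one of M target micro-services (M <= N)."""
    assignments = {}
    for i, block in enumerate(blocks):
        assignments.setdefault(choose_service(block), []).append(i)
    return assignments

images = [ImageRecord((0, 0), "cam-1"), ImageRecord((0, 1), "cam-1"),
          ImageRecord((9, 9), "cam-2")]
blocks = block_by_attribute(images, key=lambda r: r.camera_id)
routes = dispatch(blocks, choose_service=lambda b: f"svc-{len(b) % 2}")
```

The `choose_service` policy shown is a placeholder; the patent's later sections describe concrete policies based on image types and authority levels.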
Referring to fig. 2A, fig. 2A is a schematic flow chart of an image processing method according to an embodiment of the present application. As shown in fig. 2A, the image processing method includes steps 201 to 204, specifically as follows:
201. attribute information of the target image data is acquired.
Alternatively, the attribute information of the target image data may include a shooting location, a camera identification of a camera that shoots the target image data, shooting time, the number of users in the target image data, traffic information carried in the target image data, and the like.
Optionally, before acquiring the attribute information of the target image data, the method may further include acquiring a plurality of target images. The plurality of target images may be images shot by a camera at a preset time interval while a first user and a plurality of second users travel together, for example, walking together in a community or on a road. The preset time interval may be set based on an empirical value or historical data.
202. And carrying out blocking processing on the target image data by adopting the attribute information to obtain N image data blocks, wherein N is a positive integer.
In one possible example, the target image data includes a plurality of first target images, the attribute information of the target image data may be a shooting location, the shooting location may be understood as a location where the plurality of first target images are shot, and the method for obtaining N image data blocks by performing a blocking process on the target image data using the attribute information includes the following steps A1-A4:
a1, carrying out face recognition on the plurality of first target images to obtain the number of users in each first target image in the plurality of first target images;
Optionally, the target image may be recognized through a recognition algorithm, for example, a local feature analysis method, a Gabor wavelet transform and pattern matching method, a multiple template matching method, and the like.
Optionally, when the target face image is partially occluded, the following method may be adopted for recognition, specifically including steps A100-A109, as follows:
A100, repairing a target face image according to the symmetry principle of the face to obtain a first face image and a target repair coefficient, wherein the target repair coefficient is used for representing the completeness of the repaired face image;
The target face image is a face image, extracted from the acquired image, that includes only part of a face.
A101, extracting features of the first face image to obtain a first face feature set;
a102, extracting features of the target face image to obtain a second face feature set;
a103, searching in the database according to the first face feature set to obtain face images of a plurality of objects successfully matched with the first face feature set;
a104, matching the second face feature set with feature sets of face images of the plurality of objects to obtain a plurality of first matching values;
a105, acquiring human body characteristic data of each object in the plurality of objects to obtain a plurality of human body characteristic data;
A106, matching the human body characteristic data corresponding to the target face with each of the plurality of human body characteristic data to obtain a plurality of second matching values;
a107, determining a first weight corresponding to the target repair coefficient according to a mapping relation between a preset repair coefficient and the weight, and determining a second weight according to the first weight;
a108, carrying out weighting operation according to the first weight, the second weight, the plurality of first matching values and the plurality of second matching values to obtain a plurality of target matching values;
A109, selecting the maximum value from the plurality of target matching values, and taking the face image of the object corresponding to the maximum value as the complete face image corresponding to the target face image.
Optionally, mirror transformation processing may be performed on the target face image according to the symmetry principle of the face. After the mirror transformation, face restoration may be performed on the processed target face image based on a generative adversarial network (GAN) model to obtain the first face image and the target repair coefficient. The target repair coefficient may be the ratio of the number of repaired face pixels to the total number of pixels of the whole face. The GAN model may include components such as a discriminator and a semantic regularization network, which are not limited herein.
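The mirror-transformation step (though not the GAN-based restoration itself) can be illustrated with a minimal sketch. The function name, array layout, and mask convention are assumptions for illustration; the repair coefficient follows the text's definition as the proportion of repaired pixels to the total:

```python
import numpy as np

def mirror_repair(face, mask):
    """Fill missing pixels by reflecting the face about its vertical midline.

    face: HxW grayscale array; mask: HxW bool, True where pixels are missing.
    Returns the repaired image and the repair coefficient (repaired / total pixels).
    """
    mirrored = face[:, ::-1]          # reflect about the vertical midline
    mirrored_mask = mask[:, ::-1]
    repaired = face.copy()
    fill = mask & ~mirrored_mask      # missing here, but visible on the other side
    repaired[fill] = mirrored[fill]
    repair_coeff = fill.sum() / face.size
    return repaired, repair_coeff

# Toy example: the right half of a 2x2 "face" is missing.
face = np.array([[1., 2.], [3., 4.]])
mask = np.array([[False, True], [False, True]])
repaired, repair_coeff = mirror_repair(face, mask)
```

In practice this mirrored result would then be passed to the restoration model; the sketch only shows the symmetry-based fill and how a completeness ratio could be derived from it.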
Optionally, the method for extracting features from the first face image may include at least one of: an LBP (Local Binary Patterns) feature extraction algorithm, a HOG (Histogram of Oriented Gradients) feature extraction algorithm, a LoG (Laplacian of Gaussian) feature extraction algorithm, and the like, without limitation.
In the mapping relationship between preset repair coefficients and weights, each preset repair coefficient corresponds to one weight, and the weights may be set by the user or by system default. Specifically, a first weight corresponding to the target repair coefficient is determined according to this mapping relationship, and a second weight is determined from the first weight; the second weight is the weight corresponding to the second matching values, and the sum of the first weight and the second weight is 1. The first weight is applied to each of the plurality of first matching values and the second weight to each of the plurality of second matching values, yielding a plurality of target matching values corresponding to the plurality of objects. The object corresponding to the largest of these matching values is selected, and its face image is taken as the complete face image corresponding to the target face image.
In this example, the incomplete face image is repaired, the repaired face image is matched to obtain the face images of a plurality of objects, and the complete face image corresponding to the target face image is determined by comparing human body characteristics. By repairing the face and screening the matched images to obtain the final complete face image, the number of users can be determined more accurately.
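Steps A107-A108 amount to a weighted combination of the two sets of matching values. Below is a hedged sketch: the shape of the repair-coefficient-to-weight mapping and all names are assumptions, since the text only states that each coefficient maps to a weight and that the first and second weights sum to 1:

```python
def weight_for_repair(coeff, mapping):
    """Look up the first weight for a repair coefficient.

    mapping is assumed to map coefficient thresholds to weights; the weight of
    the smallest threshold at or above coeff is returned (an illustrative choice).
    """
    for threshold, weight in sorted(mapping.items()):
        if coeff <= threshold:
            return weight
    return min(mapping.values())

def combined_scores(first_matches, second_matches, w1):
    """Weight face matches by w1 and body matches by w2 = 1 - w1 (step A108)."""
    w2 = 1.0 - w1
    return [w1 * a + w2 * b for a, b in zip(first_matches, second_matches)]

# Hypothetical mapping and matching values, purely for illustration.
mapping = {0.3: 0.9, 0.6: 0.7, 1.0: 0.5}
w1 = weight_for_repair(0.5, mapping)            # -> 0.7 under this mapping
scores = combined_scores([0.8, 0.6], [0.4, 0.9], w1)
best = max(range(len(scores)), key=scores.__getitem__)  # index of best object
```

The object at index `best` would then supply the complete face image, per step A109.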
Alternatively, another method for determining the number of users in each of the plurality of first target images may be to perform eyeball identification on the first target image to obtain the number of eyeballs, and then determine the number of users according to the number of eyeballs and the coordinates of each eyeball.
Optionally, since eyeballs appear darker in a photo, an eyeball may be identified according to a preset gray level; for example, the preset gray level may be a gray level above 80%. Meanwhile, because moles or stains may appear on the face, more than two regions above the preset gray level may be detected. In that case, the eyeballs of the person in the face image may be identified according to the positions of reference eyeballs, as follows: determine a vertical midline of the face image, i.e., the midline passing through the chin and the forehead; if, after reflecting the first reference eyeball about the vertical midline, it completely or partially overlaps the second reference eyeball, as shown in fig. 2B, where the partial overlap ratio is 80% or more, the first reference eyeball and the second reference eyeball are taken as the eyeballs of one person in the face image.
Optionally, the coordinates of an eyeball may be understood in a coordinate system established by taking the lower left corner of each first target image as the origin, the straight line on which the long side of the image lies as the x-axis, and the straight line on which the short side lies as the y-axis. The number of users is determined from the number of eyeballs and their coordinates: if no eyeball symmetric to a target eyeball exists in the image, the target eyeball is counted as one user; if a symmetric eyeball exists, the two eyeballs together are counted as one user. For example, if the number of eyeballs is 5 and two of them form a symmetric pair while the remaining three do not, the number of users is 4.
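The counting rule above can be sketched in a few lines: a symmetric pair of eyeballs counts as one user, and each unpaired eyeball also counts as one user. The data representation is an assumption for illustration:

```python
def count_users(eyeballs, pairs):
    """Count users from detected eyeballs.

    eyeballs: list of eyeball ids; pairs: set of frozensets, each holding the
    two ids of a mirror-symmetric pair (one person's two eyes).
    """
    paired = set().union(*pairs) if pairs else set()
    unpaired = [e for e in eyeballs if e not in paired]
    return len(pairs) + len(unpaired)

# The example from the text: 5 eyeballs, one symmetric pair, three unpaired.
users = count_users([1, 2, 3, 4, 5], {frozenset({1, 2})})
```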
A2, dividing the plurality of first target images into A first image types according to the number of users in each first target image, wherein A is a positive integer;
Optionally, one possible method for dividing the plurality of first target images into A first image types according to the number of users in each first target image is to divide by the number of users in each image type; for example, the first image type contains face images with 3 users, the second image type contains face images with 7 users, and so on. The division may also be based on the computational cost of face recognition: the larger the number of users, the larger the amount of data required for feature extraction and the greater the system cost; the smaller the number of users, the smaller the amount of data required for feature extraction and the lower the cost.
A3, determining second image types of each first target image in the A first image types according to the shooting place of each first target image in the A first image types, wherein the number of the second image types is N;
Optionally, the second image type may be determined according to the shooting location as follows: select a first reference image from any one of the A first image types, where the first reference image is any first target image in that type; compare the shooting location of each remaining first target image with that of the first reference image, and if the distance between the two shooting locations is smaller than a preset distance threshold, classify that first target image into the same category as the first reference image; if the distance between the shooting location of a second reference image and that of the first reference image is larger than the preset distance threshold, take the second reference image as a new category and apply the same method to it to determine the images belonging to its category, until the second image types of all first target images are determined, thereby obtaining N second image types. The preset distance threshold is set based on empirical values or historical data.
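The location-grouping procedure of step A3 resembles a greedy single-pass threshold clustering. A minimal sketch follows, with the function names and the Euclidean distance metric assumed for illustration:

```python
import math

def cluster_by_location(locations, threshold):
    """Group shooting sites: join the first category whose reference point is
    within the preset distance threshold, else start a new category."""
    centers, labels = [], []
    for loc in locations:
        for i, c in enumerate(centers):
            if math.dist(loc, c) < threshold:
                labels.append(i)
                break
        else:                          # farther than the threshold from every center
            centers.append(loc)
            labels.append(len(centers) - 1)
    return labels, len(centers)        # len(centers) plays the role of N

# Three sites near the origin and one far away, with a threshold of 3.
labels, n = cluster_by_location([(0, 0), (1, 0), (10, 0), (0.5, 0.5)], threshold=3)
```

Each resulting category corresponds to one second image type, and hence to one image data block in step A4.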
A4, dividing the target image data into N image data blocks according to the second image type.
Wherein the first target image in each of the second image types is taken as one image data block, thereby dividing the target image data into N image data blocks.
In another possible example, the target image data includes a plurality of second target images, the attribute information includes a camera identifier of a camera capturing the plurality of second target images, and one possible way to use the attribute information to perform a blocking process on the target image data to obtain N image data blocks includes steps B1-B2, specifically as follows:
b1, determining categories corresponding to the camera identifications of the cameras for shooting the plurality of second target images by adopting a preset algorithm to obtain N camera categories;
the preset algorithm comprises a load balancing algorithm that evenly distributes the camera identifiers across different categories, and may also be a hash algorithm, by which the camera identifiers can be divided into N categories. Cameras whose identifiers correspond to a large number of captured second target images may be grouped into one category, and cameras whose identifiers correspond to a small number into another. Classification may also be performed by count ranges; for example, counts between 0 and 10 form one category, counts between 11 and 15 form another category, and so on.
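A minimal sketch of the hash-based variant (the use of md5 as the hash function and string identifiers are assumptions for illustration):

```python
import hashlib

def camera_categories(camera_ids, n):
    """Assign each camera identifier to one of n categories using a
    stable hash (md5), so the split is deterministic across runs."""
    cats = [[] for _ in range(n)]
    for cam in camera_ids:
        bucket = int(hashlib.md5(cam.encode()).hexdigest(), 16) % n
        cats[bucket].append(cam)
    return cats

cats = camera_categories(["cam-001", "cam-002", "cam-003", "cam-004"], n=3)
# every identifier lands in exactly one category
assert sum(len(c) for c in cats) == 4
```

A fixed digest is used because Python's built-in `hash()` is salted per process for strings, which would make the category assignment unstable between runs.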
And B2, performing blocking processing on the target image data through the N camera categories to obtain the N image data blocks.
Optionally, one possible method for performing blocking processing on the target image data through N camera categories to obtain N image data blocks includes steps B21-B23, specifically as follows:
b21, extracting camera identifications in each of the N camera categories;
b22, taking a second target image shot by a camera corresponding to the camera identifier in each camera category as a category to obtain N image categories;
the method can be understood that one camera category includes a plurality of camera identifications, the cameras corresponding to the plurality of camera identifications can shoot a plurality of images, and the plurality of images shot by the cameras corresponding to all the camera identifications are taken as one image category, so that N image categories are obtained.
And B23, performing blocking processing on the target image data according to the N image categories to obtain N image data blocks.
Dividing the target image data according to the N image categories yields N image sets, each containing a plurality of second target images; the second target images in each image set are taken as one image data block, thereby obtaining N image data blocks.
203. Determining micro services corresponding to each image data block in the N image data blocks to obtain M target micro services, wherein the target micro services correspond to at least one image data block, and M is a positive integer smaller than or equal to N.
Optionally, one possible method for determining the micro-service corresponding to each of the N image data blocks to obtain M target micro-services includes steps C1-C6, specifically as follows:
c1, acquiring a second image type of a first target image in each image data block of the N image data blocks;
wherein each image data block is determined by the second image type, so that the second image type of the first target image in it can be directly acquired.
C2, determining a reference authority level corresponding to each image data block according to the second image type;
optionally, a preset mapping relationship between image type and authority level determines the authority level corresponding to each second image type, and the authority level corresponding to the second image type is taken as the reference authority level of the image data block. The mapping relationship between image type and authority level can be obtained through a neural network model, and one possible method for training the neural network model is as follows: training can comprise forward training and reverse training, and the model can comprise an N-layer neural network. During training, sample data is input into the first layer of the N-layer network, and a first operation result is obtained after the forward operation of the first layer; the first result is then input into the second layer for forward operation to obtain a second result; proceeding in this way, the (N-1)-th result is input into the N-th layer for forward operation to obtain the N-th operation result, on which reverse training is performed. Forward training and reverse training are repeated until training of the neural network model is completed; a sign of completion can be that the loss value converges to a fixed interval. The sample data consist of image types and authority levels.
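The forward/reverse training loop described above can be sketched with a toy one-layer model fitted by gradient descent (the linear model, learning rate, and sample values are illustrative assumptions; the embodiment does not fix a network architecture):

```python
def train_mapping(samples, lr=0.01, epochs=2000):
    """Fit authority_level ~= w * image_type + b by gradient descent:
    the forward pass computes predictions; the reverse pass updates
    the parameters from the loss gradient, repeated until the mean
    squared loss settles into a small interval."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        grad_w = grad_b = loss = 0.0
        for x, y in samples:         # forward operation
            err = w * x + b - y
            loss += err * err
            grad_w += 2 * err * x    # reverse (backward) operation
            grad_b += 2 * err
        w -= lr * grad_w / len(samples)
        b -= lr * grad_b / len(samples)
    return w, b, loss / len(samples)

# sample data: (image type index, authority level)
samples = [(1, 2.0), (2, 4.0), (3, 6.0)]
w, b, final_loss = train_mapping(samples)
assert final_loss < 0.01  # the loss has converged to a small interval
```

The loss-convergence check plays the role of the "loss value converges to a certain fixed interval" completion sign in the text.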
C3, acquiring an image memory value of each image data block;
the image memory value is understood to be the memory space required for storing each image data block.
C4, determining the authority level correction factor of each image data block according to the memory value of each image;
alternatively, the authority level correction factor may be any value between 0 and 2, such as 0.1, 0.5, or 1.6. Specifically, the higher the image memory value, the larger the corresponding authority level correction factor; the lower the memory value, the smaller the correction factor. The memory value and the correction factor may be directly proportional, or may follow another monotonic relationship.
C5, determining a target authority level corresponding to each image data block according to the reference authority level and the authority level correction factor;
optionally, multiplying the reference authority level by an authority level correction factor to obtain a product result, taking the product result as a target authority level corresponding to each image data block, and if the product result is a decimal, performing rounding operation on the decimal, thereby obtaining the target authority level.
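A minimal sketch of steps C4-C5 (the linear mapping from memory value to correction factor and the memory bound are assumptions; the embodiment only requires a factor between 0 and 2 that grows with the memory value):

```python
def target_authority_level(reference_level, memory_mb, max_mem=1024.0):
    """Scale the reference authority level by a correction factor in
    (0, 2) that grows linearly with the image memory value, then
    round the product to obtain the target authority level."""
    factor = 2.0 * memory_mb / max_mem   # assumed proportional mapping
    return round(reference_level * factor)

assert target_authority_level(reference_level=2, memory_mb=512) == 2  # 2 * 1.0
assert target_authority_level(reference_level=3, memory_mb=768) == 4  # 3 * 1.5 = 4.5
```

Note that Python's built-in `round` applies banker's rounding to exact halves (4.5 becomes 4); any other rounding rule would equally satisfy the "rounding operation" in the text.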
And C6, determining the micro-service corresponding to the target authority level according to the mapping relation between the preset authority level and the micro-service, and obtaining M target micro-services.
Optionally, the higher the authority level, the higher the computing power of the corresponding micro-service; the lower the authority level, the lower the computing power. The preset mapping relationship between the authority level and the micro-service may be obtained with a neural network model; for the training process of the neural network model, refer to the step shown in step C2.
204. And transmitting the N image data blocks to corresponding target micro-services in the M target micro-services.
Optionally, the N image data blocks are sent to corresponding target micro services in the M target micro services, and a secure communication channel may also be established between the data processing system and the target micro services. One possible method of establishing a secure communication channel involves a data processing system, a target micro-service, and a proxy device, the proxy device being a trusted third party device, comprising the steps of:
d1, initialization: the initialization stage mainly completes the registration of the data processing system and the target micro-service with the proxy device, the subscription of topics, and the generation of system parameters. The data processing system and the target micro-service register with the proxy device, and only a registered data processing system and a registered target micro-service can publish and subscribe to related topics with the proxy device. The proxy device generates the system public parameters (PK) and a master key (MSK), and sends PK to the registered data processing system and target micro-service.
D2, encryption and publication: in the encryption-and-publication stage, the data processing system mainly encrypts the load corresponding to the topic to be published and sends it to the proxy device. Firstly, the data processing system encrypts the load with a symmetric encryption algorithm to generate a ciphertext (CT), then establishes an access structure A, encrypts the symmetric key with PK and the access structure A, and finally sends the encrypted key and the encrypted load to the proxy device. The proxy device filters the encrypted key and CT sent by the data processing system and forwards them to the target micro-service.
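A toy illustration of this hybrid encrypt-and-publish flow (the SHA-256 counter-mode XOR keystream is a stand-in for a real symmetric cipher, and the attribute-based encryption of the key under PK and the access structure is stubbed out; this is not a secure or claimed implementation):

```python
import hashlib
import os

def keystream(key, length):
    """SHA-256 counter-mode keystream (a toy stand-in, not a vetted cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def sym_encrypt(key, payload):
    return bytes(a ^ b for a, b in zip(payload, keystream(key, len(payload))))

sym_decrypt = sym_encrypt  # XOR keystream: encryption and decryption coincide

# Publisher side: the symmetric key encrypts the load to produce CT; the
# key itself would then be encrypted under PK and the access structure
# (that attribute-based step is omitted here).
sym_key = os.urandom(32)
payload = b"topic payload"
ct = sym_encrypt(sym_key, payload)

# Subscriber side: after recovering sym_key with its private key SK,
# the target micro-service restores the plaintext from CT.
assert sym_decrypt(sym_key, ct) == payload
```

The split mirrors the text: a fast symmetric cipher protects the bulk load, while the (stubbed) attribute-based layer controls who can recover the symmetric key.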
Optionally, the access structure A is an access tree structure. Each non-leaf node x of the access tree is a threshold gate, described by a threshold value K_x with 0 < K_x <= num(x), where num(x) denotes the number of its child nodes. When K_x = num(x), the non-leaf node represents an AND gate; when K_x = 1, the non-leaf node represents an OR gate. Each leaf node of the access tree represents an attribute. Satisfaction of an access tree structure by a set of attributes can be defined as follows: let T be the access tree with root node r, and let T_x be the subtree of T rooted at node x. If T_x(S) = 1, the attribute set S is said to satisfy the access structure T_x. If node x is a leaf node, then T_x(S) = 1 if and only if the attribute att(x) associated with leaf node x is an element of the attribute set S. If node x is a non-leaf node, then T_x(S) = 1 when at least K_x child nodes z satisfy T_z(S) = 1.
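The recursive satisfaction check T_x(S) can be sketched directly from this definition (the dict-based tree encoding and the example attributes are assumptions for illustration):

```python
def satisfies(node, attrs):
    """Recursively evaluate T_x(S). A leaf is {"att": name}; a
    non-leaf is {"k": threshold, "children": [...]} with
    k = len(children) acting as an AND gate and k = 1 as an OR gate."""
    if "att" in node:                # leaf: attribute membership in S
        return node["att"] in attrs
    satisfied = sum(satisfies(child, attrs) for child in node["children"])
    return satisfied >= node["k"]    # at least k_x children satisfied

# (role:subscriber) AND (region:east OR region:west)
tree = {"k": 2, "children": [
    {"att": "role:subscriber"},
    {"k": 1, "children": [{"att": "region:east"}, {"att": "region:west"}]},
]}
assert satisfies(tree, {"role:subscriber", "region:west"})
assert not satisfies(tree, {"region:west"})
```

The two assertions show an attribute set that satisfies the tree and one that fails the AND threshold at the root.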
D3, private key generation: in the private key generation stage, the proxy device mainly generates a corresponding key for the target micro-service, which is used to decrypt the CT received thereafter. The target micro-service provides an attribute set A_i to the proxy device (the attributes can be information such as the features and role of the subscribing end), and the proxy device generates a private key SK from PK, the attribute set A_i, and the master key MSK, and then sends the generated private key to the target micro-service.
Optionally, the attribute set A_i is a subset of the global set U = {A_1, A_2, ..., A_n}. The attribute set A_i represents the attribute information of target micro-service i (the i-th target micro-service), which may be features, roles, etc. of the target micro-service and serves as its default attributes; the global set U represents the set of attribute information of all target micro-services.
And D4, decryption: the decryption stage is mainly the process in which the target micro-service decrypts the encrypted load to extract the plaintext. After receiving the encrypted key and CT sent by the proxy device, the target micro-service decrypts the encrypted key according to PK and SK to obtain the symmetric key. If the attribute set A_i satisfies the access structure A of the ciphertext, the ciphertext can be successfully decrypted, thereby ensuring the security of the communication process.
By constructing the secure communication channel, the security of communication between the target micro-service and the data processing system can be ensured to a certain extent, reducing the possibility that an illegal target micro-service steals the data transmitted between a legal target micro-service and the data processing system, and also reducing the risk that an illegal target micro-service steals important data in the system by intruding into or tampering with the system.
Referring to fig. 3, fig. 3 is a flowchart of another data processing method according to an embodiment of the present application. As shown in fig. 3, the data processing method may include steps 301-307, specifically as follows:
301. acquiring attribute information of target image data;
wherein the target image data includes a plurality of first target images, and the attribute information includes a shooting location.
302. Performing face recognition on the plurality of first target images to obtain the number of users in each first target image in the plurality of first target images;
303. dividing the plurality of first target images into A first image types according to the number of users in each first target image, wherein A is a positive integer;
304. Determining second image types of each first target image in the A first image types according to the shooting place of each first target image in the A first image types, wherein the number of the second image types is N, and N is a positive integer;
305. dividing the target image data into N image data blocks according to the second image type;
306. determining micro services corresponding to each image data block in the N image data blocks to obtain M target micro services, wherein the target micro services correspond to at least one image data block, and M is a positive integer less than or equal to N;
307. and transmitting the N image data blocks to corresponding target micro-services in the M target micro-services.
In this example, the first target images in the target image data are first classified according to the number of users in each first target image to obtain the first image types, then classified again according to the shooting locations of the plurality of first target images to obtain the second image types, and finally the target image data is divided according to the second image types to obtain N image data blocks.
Referring to fig. 4, fig. 4 is a flowchart of another data processing method according to an embodiment of the present application. As shown in fig. 4, the data processing method may include steps 401-409, which are specifically as follows:
401. acquiring attribute information of target image data;
wherein the target image data includes a plurality of first target images, and the attribute information includes a shooting location.
402. The attribute information is adopted to carry out blocking processing on the target image data to obtain N image data blocks, wherein N is a positive integer;
the step of performing the blocking processing on the target image data by using the attribute information to obtain N image data blocks includes: performing face recognition on the plurality of first target images to obtain the number of users in each first target image in the plurality of first target images; dividing the plurality of first target images into A first image types according to the number of users in each first target image, wherein A is a positive integer; determining second image types of each first target image in the A first image types according to the shooting place of each first target image in the A first image types, wherein the number of the second image types is N; the target image data is divided into N image data blocks according to the second image type.
403. Acquiring a second image type of the first target image in each of the N image data blocks;
404. determining a reference authority level corresponding to each image data block according to the second image type;
405. acquiring an image memory value of each image data block;
the image memory value is understood to be the memory space required for storing each image data block. The image memory value of each image data block may be obtained directly from the memory address where each image data block is stored.
406. Determining a permission level correction factor of each image data block according to the memory value of each image;
the authority level correction factor may be any value between 0 and 2, for example 0.1,0.5,1.6. Specifically, the higher the memory value of the image, the larger the corresponding authority level correction factor, and the lower the memory value, the smaller the corresponding authority level correction factor. The memory value and the correction factor may be in a proportional relationship, or may be in other proportional relationships.
407. Determining a target authority level corresponding to each image data block according to the reference authority level and the authority level correction factor;
And multiplying the reference authority level by an authority level correction factor to obtain a product result, taking the product result as a target authority level corresponding to each image data block, and if the product result is decimal, carrying out rounding operation on the decimal, thereby obtaining the target authority level.
408. Determining the micro-service corresponding to the target authority level according to the mapping relation between the preset authority level and the micro-service to obtain M target micro-services;
wherein the target micro-service corresponds to at least one image data block, and M is a positive integer less than or equal to N.
The higher the authority level, the higher the computing power of the corresponding micro-service; the lower the authority level, the lower the computing power. The preset mapping relationship between the authority level and the micro-service can be obtained with a neural network model; for the training process of the neural network model, refer to the step shown in step C2.
409. And transmitting the N image data blocks to corresponding target micro-services in the M target micro-services.
In this example, the reference authority level of each data block is first determined according to the second image type, the correction factor is determined according to the memory value of the data block, the target authority level is obtained from the correction factor and the reference authority level, and the target micro-service is finally determined, so the accuracy of determining the target micro-service can be improved to a certain extent.
In accordance with the foregoing embodiments, referring to fig. 5, fig. 5 is a schematic structural diagram of a terminal provided in an embodiment of the present application, where the terminal includes a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, and the memory is configured to store a computer program, where the computer program includes program instructions, where the processor is configured to invoke the program instructions, and where the program includes instructions for performing the following steps;
acquiring attribute information of target image data;
the attribute information is adopted to carry out blocking processing on the target image data to obtain N image data blocks, wherein N is a positive integer;
determining micro services corresponding to each image data block in the N image data blocks to obtain M target micro services, wherein the target micro services correspond to at least one image data block, and M is a positive integer less than or equal to N;
and transmitting the N image data blocks to corresponding target micro-services in the M target micro-services.
In this example, attribute information of the target image data is acquired; the target image data is divided into N image data blocks using the attribute information, where N is a positive integer; the micro-service corresponding to each of the N image data blocks is determined to obtain M target micro-services, where each target micro-service corresponds to at least one image data block and M is a positive integer less than or equal to N; and the N image data blocks are sent to the corresponding target micro-services among the M target micro-services. Compared with the prior art, which processes the target image data directly, the target data can be divided into a plurality of image data blocks according to its attribute information and processed after being sent to the corresponding target micro-services, so performing blocking processing on the target data can improve the efficiency of processing it to a certain extent.
The foregoing description of the embodiments of the present application has been presented primarily in terms of a method-side implementation. It will be appreciated that, in order to achieve the above-mentioned functions, the terminal includes corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied as hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application may divide the functional units of the terminal according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
In line with the above, referring to fig. 6, fig. 6 provides a schematic structural diagram of a data processing apparatus according to an embodiment of the present application, where the apparatus includes an obtaining unit 601, a partitioning unit 602, a determining unit 603, and a sending unit 604, where,
the acquiring unit 601 is configured to acquire attribute information of target image data;
the partitioning unit 602 is configured to perform partitioning processing on the target image data by using the attribute information to obtain N image data blocks, where N is a positive integer;
the determining unit 603 is configured to determine a microservice corresponding to each of the N image data blocks, to obtain M target microservices, where the target microservices correspond to at least one image data block, and M is a positive integer less than or equal to N;
the sending unit 604 is configured to send the N image data blocks to a corresponding target micro service in the M target micro services.
Optionally, the target image data includes a plurality of first target images, the attribute information includes a shooting location, and in the aspect of performing blocking processing on the target image data by using the attribute information to obtain N image data blocks, the blocking unit 602 is specifically configured to:
Performing face recognition on the plurality of first target images to obtain the number of users in each first target image in the plurality of first target images;
dividing the plurality of first target images into A first image types according to the number of users in each first target image, wherein A is a positive integer;
determining second image types of each first target image in the A first image types according to the shooting place of each first target image in the A first image types, wherein the number of the second image types is N;
the target image data is divided into N image data blocks according to the second image type.
Optionally, the target image data includes a plurality of second target images, the attribute information includes a camera identifier of a camera that captures the plurality of second target images, and in the aspect of performing a block processing on the target image data by using the attribute information to obtain N image data blocks, the block unit 602 is specifically configured to:
determining categories corresponding to the camera identifications of the cameras for shooting the plurality of second target images by adopting a preset algorithm to obtain N camera categories;
And carrying out blocking processing on the target image data through the N camera categories to obtain N image data blocks.
Optionally, in the aspect that the target image data is subjected to blocking processing through the N camera categories to obtain the N image data blocks, the blocking unit 602 is specifically configured to:
extracting camera identifications in each of the N camera categories;
taking a second target image shot by a camera corresponding to the camera identifier in each camera category as a category to obtain N image categories;
and performing block processing on the target image data according to the N image categories to obtain N image data blocks.
Optionally, in the aspect of determining the micro service corresponding to each of the N image data blocks to obtain M target micro services, the determining unit 603 is specifically configured to:
acquiring a second image type of the first target image in each of the N image data blocks;
determining a reference authority level corresponding to each image data block according to the second image type;
acquiring an image memory value of each image data block;
Determining a permission level correction factor of each image data block according to the memory value of each image;
determining a target authority level corresponding to each image data block according to the reference authority level and the authority level correction factor;
and determining the micro-service corresponding to the target authority level according to the mapping relation between the preset authority level and the micro-service to obtain M target micro-services.
The present application also provides a computer storage medium storing a computer program for electronic data exchange, the computer program causing a computer to execute some or all of the steps of any one of the data processing methods described in the above method embodiments.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program that causes a computer to perform some or all of the steps of any one of the data processing methods described in the method embodiments above.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required in the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, such as the division of the units, merely a logical function division, and there may be additional manners of dividing the actual implementation, such as multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, or may be in electrical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present invention may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units described above may be implemented either in hardware or in software program modules.
The integrated units, if implemented in the form of software program modules, may be stored in a computer-readable memory for sale or use as a stand-alone product. Based on such understanding, the technical solution of the present application, in essence or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present application. The aforementioned memory includes: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or various other media capable of storing program code.
Those of ordinary skill in the art will appreciate that all or a portion of the steps in the various methods of the above embodiments may be implemented by a program that instructs associated hardware, and the program may be stored in a computer readable memory, which may include: flash disk, read-only memory, random access memory, magnetic or optical disk, etc.
The foregoing has outlined rather broadly the more detailed description of embodiments of the present application, wherein specific examples are provided herein to illustrate the principles and embodiments of the present application, the above examples being provided solely to assist in the understanding of the methods of the present application and the core ideas thereof; meanwhile, as those skilled in the art will have modifications in the specific embodiments and application scope in accordance with the ideas of the present application, the present description should not be construed as limiting the present application in view of the above.

Claims (9)

1. A method of data processing, the method comprising:
acquiring attribute information of target image data, wherein the attribute information comprises shooting places;
performing block processing on the target image data by adopting the attribute information to obtain N image data blocks, wherein N is a positive integer, and the target image data comprises a plurality of first target images;
determining the micro-service corresponding to each of the N image data blocks to obtain M target micro-services, which comprises: acquiring a second image type of the first target image in each of the N image data blocks, wherein the second image type of the first target image is determined according to the shooting place of the first target image; determining a reference permission level corresponding to each image data block according to the second image type; acquiring an image memory value of each image data block; determining a permission-level correction factor for each image data block according to the image memory value of that block; determining a target permission level corresponding to each image data block according to the reference permission level and the permission-level correction factor; and determining the micro-service corresponding to each target permission level according to a preset mapping relationship between permission levels and micro-services, to obtain the M target micro-services, wherein each target micro-service corresponds to at least one image data block, and M is a positive integer less than or equal to N; and
transmitting the N image data blocks to the corresponding target micro-services among the M target micro-services.
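The routing flow of claim 1 can be sketched in a few lines of Python. This is an illustrative assumption, not the patent's implementation: the reference levels, the 10 MB correction threshold, and the level-to-service mapping are all hypothetical names chosen for the example.

```python
# Hypothetical sketch of the claim-1 flow: reference permission level from the
# second image type, a correction factor from the block's memory value, and a
# preset mapping from permission level to micro-service. All concrete values
# below are illustrative assumptions.

# Reference permission level per second image type (assumed values).
REFERENCE_LEVELS = {"station": 3, "school": 2, "street": 1}

# Preset mapping between permission levels and micro-services (assumed).
LEVEL_TO_SERVICE = {1: "svc-low", 2: "svc-mid", 3: "svc-high", 4: "svc-high"}

def correction_factor(memory_bytes: int) -> int:
    """Assumed rule: blocks larger than 10 MB get a +1 permission bump."""
    return 1 if memory_bytes > 10 * 1024 * 1024 else 0

def route_blocks(blocks):
    """blocks: list of dicts with 'image_type' and 'memory_bytes'.
    Returns the set of M target micro-services and a per-block assignment."""
    assignment = {}
    for i, block in enumerate(blocks):
        ref = REFERENCE_LEVELS[block["image_type"]]
        target = ref + correction_factor(block["memory_bytes"])
        assignment[i] = LEVEL_TO_SERVICE[target]
    # M distinct target micro-services, with M <= N blocks
    return set(assignment.values()), assignment

services, assignment = route_blocks([
    {"image_type": "street", "memory_bytes": 1024},
    {"image_type": "station", "memory_bytes": 20 * 1024 * 1024},
])
```

Note how M (here 2 services) can be smaller than N when several blocks map to the same permission level.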
2. The method according to claim 1, wherein performing block processing on the target image data using the attribute information to obtain the N image data blocks comprises:
performing face recognition on the plurality of first target images to obtain the number of users in each first target image in the plurality of first target images;
dividing the plurality of first target images into A first image types according to the number of users in each first target image, wherein A is a positive integer;
determining a second image type of each first target image in the A first image types according to the shooting place of that first target image, wherein the number of second image types is N;
the target image data is divided into N image data blocks according to the second image type.
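The two-stage partitioning of claim 2 amounts to grouping images by a composite key. In the minimal sketch below, the `face_count` field stands in for an actual face-recognition step, and the "single"/"group" labels for the A first image types are assumed for illustration.

```python
# Illustrative sketch of claim 2: first group by the number of recognized
# users (A first image types), then refine by shooting place (N second image
# types), which yields the N image data blocks.
from collections import defaultdict

def first_image_type(face_count: int) -> str:
    # Assumed grouping into A = 2 first image types.
    return "single" if face_count <= 1 else "group"

def partition(images):
    """images: list of dicts with 'face_count' and 'place'.
    Returns the N image data blocks keyed by second image type."""
    blocks = defaultdict(list)
    for img in images:
        second_type = (first_image_type(img["face_count"]), img["place"])
        blocks[second_type].append(img)
    return dict(blocks)

blocks = partition([
    {"face_count": 1, "place": "station"},
    {"face_count": 3, "place": "station"},
    {"face_count": 1, "place": "station"},
])
```

With this input, N = 2: one block of two single-person station images and one block of one group station image.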
3. The method according to claim 1, wherein the target image data comprises a plurality of second target images, the attribute information comprises a camera identifier of each camera capturing the plurality of second target images, and performing block processing on the target image data using the attribute information to obtain the N image data blocks comprises:
determining, by a preset algorithm, the categories corresponding to the camera identifiers of the cameras capturing the plurality of second target images, to obtain N camera categories; and
performing block processing on the target image data according to the N camera categories to obtain the N image data blocks.
4. The method according to claim 3, wherein performing block processing on the target image data according to the N camera categories to obtain the N image data blocks comprises:
extracting the camera identifiers in each of the N camera categories;
grouping, for each camera category, the second target images shot by the cameras corresponding to the camera identifiers in that category into one image category, to obtain N image categories; and
performing block processing on the target image data according to the N image categories to obtain the N image data blocks.
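Claims 3-4 can be sketched as clustering camera identifiers into categories and blocking the images accordingly. The prefix-based "preset algorithm" below is an assumption standing in for whatever clustering the patent intends; the identifier format is likewise hypothetical.

```python
# Sketch of claims 3-4: derive a camera category from each camera identifier,
# then collect the second target images of each category into one block.
from collections import defaultdict

def camera_category(camera_id: str) -> str:
    # Assumed preset algorithm: cameras sharing an area prefix form a category.
    return camera_id.split("-")[0]

def block_by_camera(images):
    """images: list of dicts with 'camera_id'.
    Returns the N image data blocks, one per camera category."""
    blocks = defaultdict(list)
    for img in images:
        blocks[camera_category(img["camera_id"])].append(img)
    return dict(blocks)

blocks = block_by_camera([
    {"camera_id": "east-01"},
    {"camera_id": "east-02"},
    {"camera_id": "west-01"},
])
```

Here three cameras collapse into N = 2 categories ("east" and "west"), so the images from both east cameras land in the same block.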
5. A data processing apparatus, characterized in that the apparatus comprises an acquisition unit, a blocking unit, a determining unit and a sending unit, wherein
the acquisition unit is configured to acquire attribute information of target image data, wherein the attribute information comprises shooting places;
the blocking unit is configured to perform block processing on the target image data using the attribute information to obtain N image data blocks, wherein N is a positive integer, and the target image data comprises a plurality of first target images;
the determining unit is configured to determine the micro-service corresponding to each of the N image data blocks to obtain M target micro-services, and is specifically configured to: acquire a second image type of the first target image in each of the N image data blocks, wherein the second image type of the first target image is determined according to the shooting place of the first target image; determine a reference permission level corresponding to each image data block according to the second image type; acquire an image memory value of each image data block; determine a permission-level correction factor for each image data block according to the image memory value of that block; determine a target permission level corresponding to each image data block according to the reference permission level and the permission-level correction factor; and determine the micro-service corresponding to each target permission level according to a preset mapping relationship between permission levels and micro-services, to obtain the M target micro-services, wherein each target micro-service corresponds to at least one image data block, and M is a positive integer less than or equal to N; and
the sending unit is configured to send the N image data blocks to the corresponding target micro-services among the M target micro-services.
6. The apparatus according to claim 5, wherein, in performing block processing on the target image data using the attribute information to obtain the N image data blocks, the blocking unit is specifically configured to:
perform face recognition on the plurality of first target images to obtain the number of users in each of the plurality of first target images;
divide the plurality of first target images into A first image types according to the number of users in each first target image, wherein A is a positive integer;
determine a second image type of each first target image in the A first image types according to the shooting place of that first target image, wherein the number of second image types is N; and
divide the target image data into N image data blocks according to the second image types.
7. The apparatus according to claim 5, wherein the target image data comprises a plurality of second target images, the attribute information comprises a camera identifier of each camera capturing the plurality of second target images, and, in performing block processing on the target image data using the attribute information to obtain the N image data blocks, the blocking unit is specifically configured to:
determine, by a preset algorithm, the categories corresponding to the camera identifiers of the cameras capturing the plurality of second target images, to obtain N camera categories; and
perform block processing on the target image data according to the N camera categories to obtain the N image data blocks.
8. A terminal for data processing, comprising a processor, an input device, an output device and a memory that are interconnected, wherein the memory is configured to store a computer program comprising program instructions, and the processor is configured to invoke the program instructions to perform the method of any one of claims 1-4.
9. A computer-readable storage medium, characterized in that the storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1-4.
CN201811629084.XA 2018-12-28 2018-12-28 Data processing method, device, terminal and storage medium Active CN111382296B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811629084.XA CN111382296B (en) 2018-12-28 2018-12-28 Data processing method, device, terminal and storage medium


Publications (2)

Publication Number Publication Date
CN111382296A CN111382296A (en) 2020-07-07
CN111382296B true CN111382296B (en) 2023-05-12

Family

ID=71220940



Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113159091B (en) * 2021-01-20 2023-06-20 北京百度网讯科技有限公司 Data processing method, device, electronic equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4772839B2 (en) * 2008-08-13 2011-09-14 株式会社エヌ・ティ・ティ・ドコモ Image identification method and imaging apparatus
US9008438B2 (en) * 2011-04-25 2015-04-14 Panasonic Intellectual Property Corporation Of America Image processing device that associates photographed images that contain a specified object with the specified object
JP6197659B2 (en) * 2014-01-20 2017-09-20 富士ゼロックス株式会社 Detection control device, program, and detection system
CN108229515A (en) * 2016-12-29 2018-06-29 北京市商汤科技开发有限公司 Object classification method and device, the electronic equipment of high spectrum image
CN108898171B (en) * 2018-06-20 2022-07-22 深圳市易成自动驾驶技术有限公司 Image recognition processing method, system and computer readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant