CN111566639A - Image classification method and device

Info

Publication number
CN111566639A
Authority
CN
China
Prior art keywords: image, information, classification, image file, image data
Prior art date
Legal status
Pending
Application number
CN201880085333.5A
Other languages
Chinese (zh)
Inventor
孙伟 (Sun Wei)
谭利文 (Tan Liwen)
杜明亮 (Du Mingliang)
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN111566639A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content

Abstract

An image classification method and device relate to the field of image processing and can reduce the amount of computation in the image classification process and improve image classification efficiency. The method includes the following steps: a first device captures image data, acquires image generation information when the image data is captured, and generates a first image file including the image data and the image generation information (S501); the first device performs an image classification operation on the first image file using the image generation information (S502); and, in response to an operation for viewing the gallery, the first device displays the first image file in the classification directory of the gallery (S503).

Description

Image classification method and device

Technical Field
Embodiments of this application relate to the field of image processing, and in particular to an image classification method and device.
Background
Image classification is a manner in which a device automatically divides a plurality of images into at least two categories of images for classified management according to feature information of each of the plurality of images. In the image classification process, the amount of computation required for the device to analyze the plurality of images to obtain their feature information is large.
Moreover, after one device classifies an image A and transmits the image A to another device, the other device needs to perform image analysis on the image A again to acquire its feature information before it can classify the image A. Repeatedly acquiring the feature information of the same image in this way generates a large amount of redundant computation.
Disclosure of Invention
The embodiments of this application provide an image classification method and device, which can reduce the amount of computation in the image classification process and improve image classification efficiency.
In a first aspect, an embodiment of this application provides an image classification method. The method includes: a device captures image data, acquires image generation information when the image data is captured, and generates a first image file including the image data and the image generation information; the device then performs an image classification operation on the first image file using the image generation information; and, in response to an operation for viewing the gallery, the device displays the first image file in a classification directory of the gallery.
According to the image classification method provided by this embodiment, when the device shoots image data, it can acquire the image generation information at the time the image data is captured and then generate a first image file including both the image data and the image generation information. In this way, when the device performs the image classification operation on the first image file, it can perform the operation directly using the image generation information; that is, the device can skip recognizing the image data to obtain the image generation information. This reduces the amount of computation in the image classification process and improves image classification efficiency.
In a possible design manner of the first aspect, the image generation information may include: information of shooting parameters, information of the shooting mode, information of the shooting scene, and information of the camera type.
In another possible design manner of the first aspect, the shooting parameters may include parameters such as an exposure value; the shooting modes include a panoramic mode, a normal mode, and the like; and the shooting scenes may include a person shooting scene, a building shooting scene, a natural scenery shooting scene, an indoor shooting scene, an outdoor shooting scene, and the like. The classification feature information is feature information obtained by performing an image recognition operation on the image data. The camera type indicates whether the image data was captured using a front camera or a rear camera.
In another possible design manner of the first aspect, when the shooting scene is a person shooting scene, the image generation information further includes person feature information. Illustratively, the person feature information includes information such as the number of faces, face indication information, face position information, indication information of other objects (such as animals), and the number of other objects. The face indication information indicates whether the image data of the first image file includes a face; the indication information of other objects indicates whether other objects are included in the image data; and the face position information indicates the position of each face in the image data.
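The sketch below illustrates, in Python, one possible in-memory shape for this image generation information; every field name and value here is an illustrative assumption rather than part of the embodiments.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ImageGenerationInfo:
    """Hypothetical container for the image generation information
    described above; all field names are illustrative assumptions."""
    exposure_value: Optional[float] = None   # a shooting parameter
    shooting_mode: str = "normal"            # e.g. "normal", "panorama"
    shooting_scene: str = "outdoor"          # e.g. "person", "building"
    camera_type: str = "rear"                # "front" or "rear"
    # Person feature information, populated only for person scenes:
    face_count: int = 0                      # number of faces
    has_face: bool = False                   # face indication information
    face_positions: List[Tuple[int, int, int, int]] = field(default_factory=list)
    has_other_objects: bool = False          # e.g. animals present
    other_object_count: int = 0

# Example: information captured for a two-person photo, rear camera.
info = ImageGenerationInfo(shooting_scene="person", face_count=2,
                           has_face=True,
                           face_positions=[(40, 60, 120, 160),
                                           (300, 80, 110, 150)])
```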
In another possible design manner of the first aspect, when the device performs the image classification operation, it does not need to perform an image recognition operation on the image data to analyze the image data for the image generation information. In other words, the image generation information is not obtained by performing an image recognition operation, and the device may skip that operation entirely. This reduces the amount of computation in the image classification process and improves image classification efficiency.
In another possible design manner of the first aspect, the feature information required for the device to perform the image classification operation includes not only the image generation information described above but also classification feature information. Specifically, performing the image classification operation on the first image file using the image generation information includes: the device performs an image recognition operation on the image data to analyze the image data for the classification feature information, and then performs the image classification operation on the first image file using both the image generation information and the classification feature information.
In another possible design manner of the first aspect, after the device analyzes the image data to obtain the classification feature information, the device may further store the classification feature information in the first image file to obtain an updated first image file. In this way, when this device or another device performs the image classification operation on the first image file again, the feature information (image generation information and classification feature information) stored in the first image file can be used directly, without re-recognizing the image data of the first image file to obtain the feature information.
In another possible design manner of the first aspect, the device uses different image classification algorithms to analyze the image data for different feature information. The image classification algorithm used to analyze the image data for the image generation information includes a first algorithm. Accordingly, that the device does not perform an image recognition operation on the image data to obtain the image generation information means that the device does not employ the first algorithm to perform an image recognition operation on the image data to analyze the image data for the image generation information corresponding to the first algorithm.
It should be noted that the first algorithm in the embodiment of the present application may include one or more image classification algorithms. The image generation information in the embodiment of the present application may include feature information of one or more attributes, where the feature information of each attribute corresponds to an image classification algorithm.
In another possible design manner of the first aspect, the device may employ a second algorithm to perform an image recognition operation on the image data to analyze the image data for the classification feature information corresponding to the second algorithm; it then performs the image classification operation on the first image file using the image generation information and the classification feature information. The first algorithm is different from the second algorithm.
In another possible design manner of the first aspect, when the device performs the image classification operation on the first image file again, the device may read the feature information (image generation information and classification feature information) saved in the first image file. When the device determines that first feature information (image generation information and/or classification feature information) is stored in the first image file, the device does not need to employ a third algorithm to perform an image recognition operation on the image data to analyze the image data for the first feature information corresponding to the third algorithm; instead, it performs the image classification operation on the first image file directly using the stored first feature information. That is, the device can skip recognizing the image data with the third algorithm, which reduces the amount of computation required to perform the image classification operation.
In another possible design manner of the first aspect, the first image file may not include the first feature information. When the device determines that the first image file does not include the first feature information, the device may recognize the image data using the third algorithm to obtain the first feature information and perform the image classification operation using it. Moreover, the device may store the first feature information in the first image file to obtain an updated first image file, so that when this device or another device performs the image classification operation on the first image file again, it can directly use the first feature information stored in the file without employing the third algorithm again to recognize the image data.
In another possible design manner of the first aspect, the versions of the algorithms used by the device to perform the image classification operation are continuously updated over time, and different versions of the same algorithm may yield different feature information when recognizing the image data of the first image file. For this reason, the classification feature information also records the version of the image classification algorithm that produced it. Even if the first feature information is stored in the first image file, the version of the algorithm that produced it is not necessarily the same as the current version of the third algorithm. In view of this, performing the image classification operation using the first feature information may include: the device determines that the version of the algorithm that produced the first feature information is the same as the version of the third algorithm; the device then performs the image classification operation directly using the first feature information, skipping recognition of the first image file with the third algorithm. This reduces the amount of computation in the image classification process and improves image classification efficiency.
In another possible design manner of the first aspect, the method further includes: the device determines that the version of the algorithm that produced the first feature information is different from the version of the third algorithm; the device then employs the third algorithm to recognize the image data to obtain new first feature information, performs the image classification operation using it, and updates the first feature information stored in the first image file with the newly obtained first feature information to obtain an updated first image file. In this way, when this device or another device performs the image classification operation on the first image file again, the new first feature information stored in the file can be used directly, without employing the third algorithm again to recognize the image data.
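The following Python sketch condenses the reuse-or-recompute logic of the last few design manners, under the assumption that each saved feature entry records the version of the algorithm that produced it; the `run_algorithm` helper and the dictionary layout are illustrative stand-ins, not the actual implementation.

```python
# In-memory stand-in for the feature information saved in the first
# image file; keys are algorithm names, values hold version and result.
saved_features = {}

def run_algorithm(name, image_data):
    # Placeholder for a real recognition algorithm (assumption).
    return {"face_count": 2} if name == "face" else {}

def get_feature_info(image_data, algorithm, version):
    entry = saved_features.get(algorithm)
    if entry is not None and entry["version"] == version:
        # Same algorithm and same version already stored:
        # skip recognition and reuse the saved feature information.
        return entry["features"]
    # Missing or produced by a different version: recompute once and
    # write the result (with its version) back for later reuse.
    features = run_algorithm(algorithm, image_data)
    saved_features[algorithm] = {"version": version, "features": features}
    return features

get_feature_info(b"...", "face", "2.0")  # first call runs the algorithm
get_feature_info(b"...", "face", "2.0")  # second call reuses saved result
```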
In another possible design manner of the first aspect, the format of the first image file is the Exchangeable Image File Format (EXIF).
In another possible design of the first aspect, the image generation information is stored in a Maker Note (Maker Note) field of the first image file. Of course, the classification feature information is also stored in the Maker Note field of the first image file.
In another possible design manner of the first aspect, the image generation information is stored in the Maker Note field of the first image file in the Tagged Image File Format (TIFF). Of course, the classification feature information is also stored in the Maker Note field of the first image file using TIFF.
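As a rough illustration of a TIFF layout inside the Maker Note field, the Python sketch below packs two values into a minimal little-endian TIFF structure (an 8-byte header followed by one IFD); the private tag numbers 0x0001 and 0x0002 are invented for illustration and are not taken from the embodiments.

```python
import struct

def pack_maker_note(scene_code: int, face_count: int) -> bytes:
    # TIFF header: "II" = little-endian, magic 42, first IFD at offset 8.
    header = b"II" + struct.pack("<H", 42) + struct.pack("<I", 8)
    entries = [
        (0x0001, 3, 1, scene_code),   # (tag, type=SHORT, count, value)
        (0x0002, 3, 1, face_count),
    ]
    ifd = struct.pack("<H", len(entries))      # number of IFD entries
    for tag, typ, count, value in entries:
        # A SHORT fits inside the 4-byte value field, padded to 4 bytes.
        ifd += struct.pack("<HHI", tag, typ, count)
        ifd += struct.pack("<HH", value, 0)
    ifd += struct.pack("<I", 0)                # offset of next IFD: none
    return header + ifd

note = pack_maker_note(scene_code=1, face_count=2)  # e.g. 1 = person scene
```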
In a second aspect, an embodiment of this application provides an image classification method, including: a device captures image data through a camera; the device acquires image generation information when capturing the image data and generates a first image file including the image data and the image generation information, where the format of the first image file is EXIF and the image generation information is stored in the Maker Note field of the first image file; the image generation information includes at least one of information of shooting parameters, information of the shooting mode, information of the shooting scene, and information of the camera type; the device performs an image classification operation on the first image file directly using the image generation information, instead of first performing an image recognition operation on the image data to analyze it for the image generation information and then classifying with the analyzed result; finally, in response to an operation for viewing the gallery, the device displays the first image file in the classification directory of the gallery.
According to the image classification method provided by this embodiment, when the device shoots image data through the camera, it can obtain the image generation information at the time of capture and then generate a first image file including the image data and the image generation information. In this way, when the device performs the image classification operation on the first image file, it can use the image generation information directly; that is, the device can skip recognizing the image data to obtain the image generation information. This reduces the amount of computation in the image classification process and improves image classification efficiency.
In a possible design manner of the second aspect, performing the image classification operation on the first image file using the image generation information includes: the device performs an image recognition operation on the image data to analyze the image data for classification feature information, and then performs the image classification operation on the first image file using both the image generation information and the classification feature information.
In another possible design manner of the second aspect, after the device obtains the classification feature information through analysis, the device may save the classification feature information in the first image file to obtain an updated first image file.
In another possible design manner of the second aspect, the image generation information is stored in a Maker Note field using TIFF.
In a third aspect, an embodiment of this application provides an apparatus that includes an acquisition unit, a classification unit, and a display unit. The acquisition unit is configured to capture image data, acquire image generation information when the image data is captured, and generate a first image file including the image data and the image generation information. The classification unit is configured to perform an image classification operation on the first image file using the image generation information acquired by the acquisition unit. The display unit is configured to display the first image file in the classification directory of the gallery in response to an operation for viewing the gallery.
In a possible design manner of the third aspect, the classification unit is further configured to perform an image recognition operation on the image data; however, the image generation information is not obtained by the classification unit recognizing the image data.
In another possible design manner of the third aspect, the classification unit is specifically configured to perform an image recognition operation on the image data to analyze the image data for classification feature information, and to perform an image classification operation on the first image file using the image generation information and the classification feature information.
In another possible design manner of the third aspect, the apparatus further includes an updating unit. The updating unit is configured to store the classification feature information in the first image file to obtain an updated first image file after the classification unit performs the image recognition operation on the image data to obtain the classification feature information.
In a fourth aspect, an embodiment of this application provides an apparatus, including a processor, a memory, a camera, and a display. The memory and the display are coupled to the processor; the display is configured to display image files; the memory includes a non-volatile storage medium and is configured to store computer program code, where the computer program code includes computer instructions. When the processor executes the computer instructions, the camera captures image data; the processor acquires image generation information when the camera captures the image data, generates a first image file including the image data and the image generation information, and performs an image classification operation on the first image file using the image generation information; and the display displays the first image file in the classification directory of the gallery in response to an operation for viewing the gallery.
In a possible design manner of the fourth aspect, the image generation information is not obtained by the processor performing an image recognition operation on the image data.
In another possible design manner of the fourth aspect, the processor is specifically configured to perform an image recognition operation on the image data to analyze the image data for classification feature information, and to perform an image classification operation on the first image file using the image generation information and the classification feature information.
In another possible design manner of the fourth aspect, the processor is further configured to, after analyzing the image data to obtain the classification feature information, store the classification feature information in the first image file to obtain an updated first image file.
It should be noted that for the image generation information, the format of the first image file, the positions of the image generation information and the classification feature information in the first image file, and the format in which they are stored in the Maker Note field, as described in the second, third, and fourth aspects and any of their possible design manners, reference may be made to the relevant descriptions of the possible design manners of the first aspect; details are not repeated here.
In a fifth aspect, an embodiment of this application provides a control device including a processor and a memory, where the memory is configured to store computer program code including computer instructions that, when executed by the processor, cause the control device to perform the method according to the first aspect or the second aspect and any of their possible design manners.
In a sixth aspect, embodiments of the present application provide a computer storage medium, which includes computer instructions that, when executed on a device, cause the device to perform the method according to the first aspect and any possible design manner of the first aspect of the embodiments of the present application.
In a seventh aspect, embodiments of the present application provide a computer program product, which when run on a computer, causes the computer to execute the method according to the first and second aspects and any possible design thereof.
In addition, for the technical effects brought by the second aspect through the seventh aspect and any of their possible design manners, reference may be made to the technical effects brought by the corresponding design manners of the first aspect; details are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of a mobile phone according to an embodiment of this application;
fig. 2 is a schematic diagram of a data structure of a Joint Photographic Experts Group (JPEG) image file according to an embodiment of this application;
fig. 3 is a schematic diagram of a data structure of an EXIF image file according to an embodiment of this application;
fig. 4 is a first schematic diagram of a system principle framework of an image classification method according to an embodiment of this application;
fig. 5 is a first flowchart of an image classification method according to an embodiment of this application;
fig. 6 is a second schematic diagram of a system principle framework of an image classification method according to an embodiment of this application;
fig. 7A is a schematic diagram of an example of a mobile phone interface according to an embodiment of this application;
fig. 7B is a second flowchart of an image classification method according to an embodiment of this application;
FIG. 8 is a diagram illustrating a data structure of the Maker Note field of the EXIF image shown in FIG. 3;
FIG. 9 is a first diagram illustrating a data structure of the TIFF field in the Maker Note field shown in FIG. 8;
FIG. 10 is a second diagram illustrating a data structure of the TIFF field in the Maker Note field shown in FIG. 8;
fig. 11 is a third schematic diagram of a system principle framework of an image classification method according to an embodiment of this application;
fig. 12 is a third flowchart of an image classification method according to an embodiment of this application;
fig. 13 is a first schematic structural composition diagram of an apparatus according to an embodiment of this application;
fig. 14 is a second schematic structural composition diagram of an apparatus according to an embodiment of this application.
Detailed Description
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. For example, the first feature information and the second feature information refer to different feature information in the first image file; it does not mean that the first image file contains only two pieces of feature information.
The embodiment of the application provides an image classification method which can be applied to equipment for carrying out image classification on a first image file. The image file (e.g., the first image file) in the embodiment of the present application refers to an image file obtained by performing processing such as encoding and compressing on an image acquired by a camera, such as a JPEG image file. The images described in the embodiments of the present application may be understood as electronic pictures (hereinafter referred to as pictures).
The image classification in this application refers to that the device divides a plurality of image files into at least two types of image files (i.e., clusters the plurality of image files) according to feature information of image data in each of the plurality of image files, such as shooting mode (e.g., panoramic mode) information, shooting scene (e.g., person scene) information, and number of faces (e.g., 3 faces).
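As a toy illustration of this clustering, the Python sketch below groups image files into albums by a scene label drawn from their feature information; the file names and the feature layout are assumptions made for the example.

```python
from collections import defaultdict

# (file name, feature information) pairs; the layout is an assumption.
image_files = [
    ("photo1.jpg", {"scene": "person", "faces": 2}),
    ("photo2.jpg", {"scene": "animal"}),
    ("photo4.jpg", {"scene": "landscape"}),
    ("photo5.jpg", {"scene": "person", "faces": 1}),
]

albums = defaultdict(list)
for name, features in image_files:
    albums[features["scene"]].append(name)   # cluster by scene label

# albums -> {"person": ["photo1.jpg", "photo5.jpg"],
#            "animal": ["photo2.jpg"], "landscape": ["photo4.jpg"]}
```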
The devices (e.g., the first device and the second device) in the embodiments of this application may be terminal devices such as a mobile phone (e.g., the mobile phone 100 shown in fig. 1), a tablet computer, a Personal Computer (PC), a Personal Digital Assistant (PDA), a netbook, a wearable electronic device, an Augmented Reality (AR)/Virtual Reality (VR) device, or an in-vehicle computer.
For example, the device may manage pictures stored in the device and perform the method provided by the embodiments of this application to classify those pictures. Alternatively, a client (or an application) for managing pictures may be installed in the device; after logging in to a picture management account, the client may manage pictures stored in a cloud server and may further execute the method provided by the embodiments of this application to classify the images in the cloud server.
Alternatively, the device in the embodiments of this application may be a cloud server for storing and managing pictures; the cloud server may receive pictures uploaded by a terminal and then execute the method provided in the embodiments of this application to classify those pictures. The embodiments of this application do not specifically limit the specific form of the device.
As shown in fig. 1, taking the mobile phone 100 as the above-mentioned device as an example, the mobile phone 100 may specifically include: a processor 101, a Radio Frequency (RF) circuit 102, a memory 103, a touch screen 104, a bluetooth device 105, one or more sensors 106, a Wireless Fidelity (WiFi) device 107, a positioning device 108, an audio circuit 109, a peripheral interface 110, and a power supply 111. These components may communicate over one or more communication buses or signal lines (not shown in fig. 1). Those skilled in the art will appreciate that the hardware configuration shown in fig. 1 is not intended to be limiting, and that the handset 100 may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes the components of the handset 100 in detail with reference to fig. 1:
the processor 101 is a control center of the cellular phone 100, connects various parts of the cellular phone 100 using various interfaces and lines, and performs various functions of the cellular phone 100 and processes data by running or executing an application program stored in the memory 103 and calling data stored in the memory 103. In some embodiments, processor 101 may include one or more processing units. In some embodiments of the present application, the processor 101 may further include a fingerprint verification chip, configured to verify the collected fingerprint.
The RF circuit 102 may be used for receiving and transmitting wireless signals during the transmission and reception of information or during a call. In particular, the RF circuit 102 may receive downlink data from a base station and deliver it to the processor 101 for processing, and it transmits uplink data to the base station. Typically, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the RF circuit 102 may also communicate with other devices via wireless communication. The wireless communication may use any communication standard or protocol, including, but not limited to, the Global System for Mobile Communications, General Packet Radio Service, Code Division Multiple Access, Wideband Code Division Multiple Access, Long Term Evolution, email, and the Short Message Service.
The memory 103 is used for storing application programs and data, and the processor 101 executes various functions and data processing of the mobile phone 100 by running the application programs and data stored in the memory 103. The memory 103 mainly includes a program storage area and a data storage area: the program storage area can store an operating system and the application programs (such as a sound playing function and an image playing function) required by at least one function, and the data storage area can store data (e.g., audio data, a phonebook, etc.) created by the use of the mobile phone 100. In addition, the memory 103 may include a high-speed Random Access Memory (RAM), and may further include a non-volatile memory, such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. The memory 103 may store various operating systems. The memory 103 may be independent and connected to the processor 101 through the communication bus, or the memory 103 may be integrated with the processor 101.
The touch screen 104 may specifically include a touch pad 104-1 and a display 104-2.
The touch pad 104-1 can capture touch events performed on or near it by a user of the mobile phone 100 (e.g., an operation performed by the user on or near the touch pad 104-1 using a finger, a stylus, or any other suitable object) and transmit the captured touch information to another component (e.g., the processor 101). A touch event performed by a user near the touch pad 104-1 may be called a hover touch; hover touch means that the user does not have to directly contact the touch pad to select, move, or drag a target (e.g., an icon), but only needs to be near the device to perform the desired function. In addition, the touch pad 104-1 can be implemented using various types of technology, such as resistive, capacitive, infrared, and surface acoustic wave.
The display (also referred to as a display screen) 104-2 may be used to display information entered by or provided to the user, as well as the various menus of the mobile phone 100. The display 104-2 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The touch pad 104-1 may be overlaid on the display 104-2; when the touch pad 104-1 detects a touch event on or near it, the event is communicated to the processor 101 to determine its type, and the processor 101 can then provide a corresponding visual output on the display 104-2 based on the type of touch event. Although in fig. 1 the touch pad 104-1 and the display screen 104-2 are shown as two separate components implementing the input and output functions of the mobile phone 100, in some embodiments the touch pad 104-1 and the display screen 104-2 may be integrated to implement these functions. It is understood that the touch screen 104 is formed by stacking multiple layers of materials; only the touch pad (layer) and the display screen (layer) are shown in this embodiment, and the other layers are not described here. In addition, the touch pad 104-1 may be disposed on the front surface of the mobile phone 100 in a full-panel manner, and the display screen 104-2 may also be disposed on the front surface of the mobile phone 100 in a full-panel manner, so that a frameless structure can be implemented on the front of the phone.
In addition, the mobile phone 100 may also have a fingerprint recognition function. For example, the fingerprint acquisition device 112 may be disposed on the back side of the mobile phone 100 (e.g., below the rear camera), or on the front side (e.g., below the touch screen 104). As another example, the fingerprint acquisition device 112 may be configured in the touch screen 104 to realize the fingerprint recognition function; that is, the fingerprint acquisition device 112 may be integrated with the touch screen 104 to implement the fingerprint recognition function of the mobile phone 100. In this case, the fingerprint acquisition device 112 disposed in the touch screen 104 may be a part of the touch screen 104 or may be disposed in the touch screen 104 in another manner. The main component of the fingerprint acquisition device 112 in this embodiment is a fingerprint sensor, which may employ any type of sensing technology, including but not limited to optical, capacitive, piezoelectric, or ultrasonic sensing technologies.
The mobile phone 100 may also include a Bluetooth device 105 for enabling data exchange between the mobile phone 100 and other short-range devices (e.g., mobile phones, smart watches, etc.). The Bluetooth device in this embodiment may be an integrated circuit or a Bluetooth chip.
The handset 100 may also include at least one sensor 106, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display of the touch screen 104 according to the brightness of ambient light, and a proximity sensor that turns off the power of the display when the mobile phone 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone 100, further description is omitted here.
The WiFi device 107 is used for providing the mobile phone 100 with network access conforming to WiFi related standard protocols, and the mobile phone 100 can access to a WiFi access point through the WiFi device 107, thereby helping a user to send and receive e-mails, browse web pages, access streaming media and the like, and providing the user with wireless broadband internet access. In other embodiments, the WiFi device 107 may also be a WiFi wireless access point, which may provide WiFi network access for other devices.
The positioning device 108 is configured to provide a geographical position for the mobile phone 100. It can be understood that the positioning device 108 may specifically be a receiver of a positioning system such as the Global Positioning System (GPS), the BeiDou satellite navigation system, or the Russian GLONASS. After receiving the geographical position sent by the positioning system, the positioning device 108 sends the information to the processor 101 for processing or to the memory 103 for storage. In some other embodiments, the positioning device 108 may be a receiver of an Assisted Global Positioning System (AGPS); the AGPS system acts as an assistance server to help the positioning device 108 complete ranging and positioning services, in which case the assistance server provides positioning assistance by communicating over a wireless communication network with the positioning device 108 (i.e., the GPS receiver) of an apparatus such as the mobile phone 100. In some other embodiments, the positioning device 108 may also use WiFi access point based positioning technology. Because each WiFi access point has a globally unique Media Access Control (MAC) address, the device can scan and collect the broadcast signals of surrounding WiFi access points when WiFi is turned on and thereby obtain the MAC addresses broadcast by those access points; the device sends data capable of identifying the WiFi access points (such as the MAC addresses) to a location server through the wireless communication network, and the location server retrieves the geographical location of each WiFi access point, calculates the geographical location of the device according to the strength of the WiFi broadcast signals, and sends it to the positioning device 108 of the device.
The audio circuit 109, the speaker 113, and the microphone 114 can provide an audio interface between the user and the mobile phone 100. The audio circuit 109 may transmit an electrical signal converted from received audio data to the speaker 113, which converts the electrical signal into a sound signal for output; conversely, the microphone 114 converts a collected sound signal into an electrical signal, which the audio circuit 109 receives and converts into audio data, and then outputs the audio data to the RF circuit 102 for transmission to, for example, another mobile phone, or to the memory 103 for further processing.
The peripheral interface 110 is used to provide various interfaces for external input/output devices (e.g., a keyboard, a mouse, an external display, an external memory, and a SIM card). For example, a mouse is connected through a Universal Serial Bus (USB) interface, and a Subscriber Identity Module (SIM) card provided by a telecom operator is connected through metal contacts in a SIM card slot. The peripheral interface 110 may be used to couple the aforementioned external input/output peripherals to the processor 101 and the memory 103.
In this embodiment of this application, the mobile phone 100 may communicate with other devices in a device group through the peripheral interface 110; for example, the peripheral interface 110 may receive display data sent by another device for displaying, and so on.
The mobile phone 100 may further include a power supply device 111 (such as a battery and a power management chip) for supplying power to each component, and the battery may be logically connected to the processor 101 through the power management chip, so as to implement functions of managing charging, discharging, and power consumption through the power supply device 111.
Although not shown in fig. 1, the mobile phone 100 may further include a camera (front camera and/or rear camera), a flash, a micro-projector, a Near Field Communication (NFC) device, etc., which will not be described in detail herein.
An execution subject of the image classification method provided in the embodiments of this application may be an image processing apparatus, where the image processing apparatus is a device that can be used to manage images (such as the mobile phone 100 shown in fig. 1), the Central Processing Unit (CPU) of the device, a control module in the device used to perform image processing, or a client in the device used to manage images. In the embodiments of this application, the image classification method is described taking the above-mentioned device as the execution subject.
The terms referred to in this application are described below:
(1) JPEG is an international standard for image compression.
(2) The JPEG File Interchange Format (JFIF) is an image file format standard; JFIF is a format for JPEG-encoded files conforming to the JPEG interchange format standard. The image data in a JFIF file is compressed using JPEG, so JFIF is also referred to as "JPEG/JFIF".
JPEG/JFIF is the most commonly used format for storing and transmitting pictures on the World Wide Web (Web).
As shown in FIG. 2, a JPEG image file (i.e., an image file in JPEG format) begins with the string "0xFFD8" and ends with the string "0xFFD9". The file header of a JPEG image file contains a series of strings in the format "0xFF*", called "JPEG marks" or "JPEG segments", which mark the information segments of the JPEG image file. "0xFFD8" marks the beginning of the image information and "0xFFD9" marks its end; these two JPEG marks are not followed by any information, while the other JPEG marks are followed by information characters.
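The Python sketch below walks the JPEG marks just described, assuming a well-formed file: it checks the 0xFFD8 start-of-image mark and then steps over each segment using the 2-byte big-endian length field that follows each mark.

```python
import struct

def list_jpeg_segments(path):
    with open(path, "rb") as f:
        data = f.read()
    assert data[:2] == b"\xFF\xD8", "not a JPEG file"  # start-of-image mark
    segments = []
    pos = 2
    while pos < len(data) and data[pos] == 0xFF:
        marker = data[pos + 1]
        if marker == 0xD9:      # end-of-image mark: no length follows
            segments.append(("FFD9", 0))
            break
        if marker == 0xDA:      # start-of-scan: entropy-coded data follows
            segments.append(("FFDA", None))
            break
        (length,) = struct.unpack(">H", data[pos + 2:pos + 4])
        segments.append((f"FF{marker:02X}", length))
        pos += 2 + length       # the length covers itself, not the mark
    return segments

# list_jpeg_segments("photo1.jpg") -> e.g. [("FFE1", 9321), ..., ("FFDA", None)]
```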
(3) The EXIF image file (i.e., the image file in the EXIF format) is one of the JPEG image files described above, and conforms to the JPEG standard.
An EXIF image adds the photographing parameters of the camera that captured the image (referred to as first photographing parameters) to the JPEG image file. For example, the first photographing parameters may include: the shooting date, shooting device parameters (such as the brand and model of the camera, lens parameters, and flash parameters), shooting parameters (such as shutter speed, f-stop value, ISO speed, focal length, and metering mode), image processing parameters (such as sharpening, contrast, saturation, and white balance), and GPS positioning data of the shot image.
As shown in fig. 3, an EXIF image file may include EXIF information, where the EXIF information includes an EXIF field 301 (the field in the EXIF image file for storing the first photographing parameters) and a Maker Note field 302; the first photographing parameters are stored in the EXIF field 301. The Maker Note field 302 is a field reserved for vendors to hold vendor-specific annotation data. In the EXIF image, the EXIF information is carried in an application segment marked by a string in the range 0xFFE0 to 0xFFEF, and its maximum size is 64 KB (kilobytes).
The format of the first image file in this embodiment may be EXIF (that is, the first image file may be an EXIF image file), and the feature information in the first image file may be stored in the Maker Note field of the EXIF image file. The image data field 303 of the EXIF image file is used to save the image data. In this way, when the image classification operation is performed on the first image file (by this device or by another device), the feature information in the first image file can be read and used directly to classify the file, instead of performing an image recognition operation on the image data of the first image file to obtain, through a large amount of computation, the feature information needed for the classification operation. This scheme reduces the amount of computation in the image classification process and improves image classification efficiency.
Of course, the format of the first image file in this embodiment includes, but is not limited to, EXIF; the first image file may also be a file in another format that includes a reserved field, where the reserved field can be used to store the feature information for performing the image classification operation. The reserved field includes, but is not limited to, the Maker Note field; it may be any field in the first image file that can store feature information for the image classification operation, which is not limited in this embodiment.
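For a concrete feel of reading and writing such a reserved field, here is a sketch using the piexif library for Python; encoding the feature information as JSON is an assumption made for brevity (the embodiments themselves describe a TIFF layout for this field).

```python
import json
import piexif

def save_feature_info(jpeg_path, feature_info: dict):
    exif_dict = piexif.load(jpeg_path)
    # Place the serialized feature information into the Maker Note tag.
    exif_dict["Exif"][piexif.ExifIFD.MakerNote] = json.dumps(feature_info).encode()
    piexif.insert(piexif.dump(exif_dict), jpeg_path)  # rewrite EXIF in place

def load_feature_info(jpeg_path) -> dict:
    exif_dict = piexif.load(jpeg_path)
    raw = exif_dict["Exif"].get(piexif.ExifIFD.MakerNote, b"")
    return json.loads(raw) if raw else {}

# save_feature_info("photo1.jpg", {"scene": "person", "faces": 2})
# load_feature_info("photo1.jpg") -> {"scene": "person", "faces": 2}
```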
Please refer to fig. 4, which illustrates a system schematic framework diagram of an image classification method according to an embodiment of the present application. In the embodiment of the present application, the first device may save the feature information of the image data in the first image file during the process of capturing the image and performing the image classification operation, and directly use the feature information saved in the image file when performing the image classification operation.
Specifically, as shown in fig. 4, the device may acquire feature information (the image generation information in this embodiment) when the camera acquires the image data and save that feature information (i.e., execute 401) to generate a first image file 403. Subsequently, when the image classification engine 402 of the device performs an image classification operation, it can directly read the feature information (i.e., the image generation information) in the first image file 403 and recognize the image data of the first image file to obtain new feature information (e.g., the classification feature information in this embodiment); it then performs the image classification operation on the first image file using both the read feature information and the new feature information; finally, the feature information in the first image file may be updated with the newly identified feature information (e.g., the new feature information is added).
An image classification method provided by the embodiment of the present application is described in detail below with reference to specific embodiments.
An image classification method provided by the embodiment of the present application is described herein by taking an example in which a format of a first image file is EXIF (that is, the first image file is an EXIF image file), and feature information is stored in a Maker Note field of the EXIF image file. As shown in fig. 5, the method of the embodiment of the present application includes S501-S503:
S501, the first device captures image data, acquires image generation information when the image data is captured, and generates a first image file including the image data and the image generation information.
In this embodiment, a preset field (e.g., the Maker Note field) of the first image file (e.g., an EXIF image file) may be used to store feature information of the image data of the first image file. The feature information may include the image generation information, which is feature information acquired by the first device when its camera captures the image data. For detailed examples of the image generation information, reference may be made to the subsequent description in this embodiment; details are not repeated here.
In general, upon capturing image data, a first device may generate a first image file including the image data. In this embodiment, however, the way the first device obtains an image file by shooting differs from the conventional scheme: the first device acquires not only the image data captured by the camera but also the image generation information at the time of capture, and then generates an image file including both the image data and the image generation information.
Illustratively, the image generation information may include the shooting parameters (referred to as second shooting parameters) in effect when the camera captures the image data. For example, the second shooting parameters may include: information of the shooting mode (such as a panoramic mode or a normal mode), information of the shooting scene (such as a person shooting scene, a building shooting scene, a natural scenery shooting scene, an indoor shooting scene, or an outdoor shooting scene), the camera type (indicating whether the image data was captured using the front camera or the rear camera), and the like. The first device can determine the shooting mode, shooting scene, camera type, and similar information in response to the user's selection of the shooting mode, the shooting scene, and the front or rear camera when the camera captures the image data. The normal mode in this embodiment refers to a mode in which a picture is taken using the rear camera.
When the shooting scene is a person shooting scene, the image generation information further includes person feature information, which includes information such as the number of faces, face indication information, face position information, indication information of other objects (such as animals), and the number of other objects. The face indication information indicates whether the image data of the first image file includes a face; the indication information of other objects indicates whether other objects are included in the image data; and the face position information indicates the position of each face in the image data.
Note that the second shooting parameters in this embodiment are different from the first photographing parameters held in the EXIF field 301 of the EXIF image shown in fig. 3. In a conventional scheme, when a camera captures an image, only the first photographing parameters are saved in the image, and the second shooting parameters of this embodiment are not recorded, even though the second shooting parameters can be used to perform an image classification operation on the image. As a result, when the image classification operation is performed, the image must be recognized to obtain the second shooting parameters, which makes the amount of computation of the classification operation large. In this embodiment, the first device generates the first image file including both the image data and the second shooting parameters when shooting the image data. In this way, when the image classification operation is performed, the second shooting parameters can be read directly from the first image file and used for classification, which reduces the amount of computation and improves image classification efficiency. Of course, the first image file (e.g., an EXIF image file) in this embodiment may also include the first photographing parameters, which are stored in the EXIF field of the EXIF image file.
Referring to fig. 6, a schematic diagram of generating an image file that includes feature information of the image data at capture time is shown. As shown in fig. 6, the camera engine may call algorithms such as a scene algorithm and a face algorithm when the camera captures an image (i.e., 61), and recognize the user's operations at capture time (i.e., 62) to collect the image generation information of the image (i.e., 63). The collected image generation information and the image captured by the camera are then handed to a JPEG maker 64: a MakerNote maker in the JPEG maker 64 packs the image generation information from 63 into a byte array (a feature byte array for short), and an EXIF maker in the JPEG maker 64 packs the image from 61 into a byte array; the JPEG maker 64 then generates an image file (i.e., the first image file) including the image data and the feature byte array.
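A condensed Python sketch of this flow, in which every helper name is an assumption: the generation information is collected alongside the frame, packed into a feature byte array (the role of the MakerNote maker), and emitted together with the image data as one record standing in for the first image file.

```python
import json

def capture_and_pack(camera_frame: bytes, shooting_mode: str,
                     scene: str, camera_type: str) -> dict:
    # 61-63: image data plus the generation information gathered from
    # the scene/face algorithms and the user's shooting choices.
    generation_info = {"mode": shooting_mode, "scene": scene,
                       "camera": camera_type}
    feature_bytes = json.dumps(generation_info).encode()  # feature byte array
    # 64: the JPEG maker would place feature_bytes into the Maker Note
    # field of the EXIF structure; modeled here as a simple record.
    return {"image_data": camera_frame, "maker_note": feature_bytes}

first_image_file = capture_and_pack(b"<jpeg bytes>", "panorama",
                                    "landscape", "rear")
```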
Alternatively, the first device may periodically perform the following S502 to classify a plurality of image files including the first image file.
As an example, assume that the first device is the mobile phone 100 shown in (a) in fig. 7A. As shown there, the photo album of the mobile phone 100 includes photos 1 to 8; the first image file may be any one of these photos, for example photo 1. The mobile phone 100 can periodically perform image classification on the photos in its album. In this way, the mobile phone 100 can display the album interface shown in (b) in fig. 7A, in which the mobile phone 100 displays the result of classifying the pictures in the album. For example, the mobile phone 100 divides photos 1-8 into a "people" photo album b, an "animals" photo album a, and a "landscape" photo album c: the "people" photo album b includes photo 1, photo 3, photo 5, and photo 8 shown in (a) in fig. 7A; the "animals" photo album a includes photo 2 and photo 7; and the "landscape" photo album c includes photo 4 and photo 6. Taking the "people" album b as an example, when the user clicks the "people" album b shown in (b) in fig. 7A, the mobile phone 100 may display the "people" album interface shown in (c) in fig. 7A, which includes photo 1, photo 3, photo 5, and photo 8.
Alternatively, the first device may perform S502 in response to a user operation to classify the pictures in its album. Based on the example shown in fig. 7A, the mobile phone 100 may also perform image classification on photos 1-8 in response to the user clicking the "album" button shown in (a) of fig. 7A, and then display the album interface shown in (b) of fig. 7A.
S502, the first device performs image classification operation on the first image file by using the image generation information.
In the embodiment of the present application, the image generation information is not obtained by performing an image recognition operation on the image data; that is, the first device does not need to run image recognition to obtain it. The first device can therefore skip analyzing the image data for the image generation information, and directly use the image generation information stored in the first image file to perform the image classification operation on the first image file.
It can be understood that the first device uses different image classification algorithms to analyze image data for different feature information. The image classification algorithm that would be used to analyze the image data for the image generation information is referred to as the first algorithm. In this way, the first device does not need to perform an image recognition operation on the image data using the first algorithm to obtain the image generation information corresponding to the first algorithm.
It should be noted that the first algorithm in the embodiment of the present application may include one or more image classification algorithms. The image generation information in the embodiment of the present application may include feature information of one or more attributes, where the feature information of each attribute corresponds to an image classification algorithm.
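As a sketch of this skip, assuming the toy image-file structure from the previous example and an attribute-to-algorithm table (both illustrative, not prescribed by the embodiment):

```python
import json

def read_generation_info(image_file: dict) -> dict:
    # Read the image generation information back from the MakerNote bytes.
    return json.loads(image_file["maker_note"].decode("utf-8"))

def classify(image_file: dict, algorithms: dict) -> str:
    # algorithms: attribute name -> image classification algorithm for it.
    features = read_generation_info(image_file)
    for attribute, algorithm in algorithms.items():
        if attribute in features:
            continue  # already on file: the corresponding first algorithm is skipped
        features[attribute] = algorithm(image_file["image_data"])  # recognition
    # Toy categorisation rule standing in for the real classifier:
    if features.get("face_count", 0) > 0:
        return "people"
    return features.get("scene", "other")
```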
S503, the device responds to the operation for viewing the gallery, and displays the first image file in the classification directory of the gallery.
The classification directory of the gallery in the embodiment of the present application may display the first image file according to the classification result obtained by performing S502.
Illustratively, based on the example shown in fig. 7A, the mobile phone 100 may also display photos 1-8 by category, that is, display the album interface (i.e., the classification directory of the gallery) shown in (b) of fig. 7A, in response to the user clicking the "album" button shown in (a) of fig. 7A.
Alternatively, the operation for viewing the gallery may be the user entering a keyword in the search box of the gallery. In response, the device may display, in the classification directory of the gallery, a plurality of image files (including the first image file) that match the keyword entered by the user.
For example, taking the mobile phone 100 storing photos 1-8 shown in (a) of fig. 7A as an example, the mobile phone 100 performs the above-mentioned S501-S502 to perform the image classification operation on the image files (e.g., photos 1-8). When the user enters the keyword "person" in the search box of the gallery, the mobile phone 100 may display the person image files, such as photo 1, photo 3, photo 5, and photo 8, in the classification directory of the gallery. When the user enters the keyword "two persons", the mobile phone 100 may display the matching person image files, such as photo 3 and photo 5.
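One possible way to back such keyword queries with the classification results, reusing the toy classify and read_generation_info helpers from the sketch above (the index layout and the "two persons" rule are assumptions of this sketch):

```python
def build_gallery_index(image_files: list, algorithms: dict) -> dict:
    # Maps a keyword to the image files it should return.
    index = {}
    for f in image_files:
        index.setdefault(classify(f, algorithms), []).append(f)
        # The stored face count also answers queries such as "two persons"
        # without re-running any face algorithm.
        if read_generation_info(f).get("face_count", 0) == 2:
            index.setdefault("two persons", []).append(f)
    return index

# A search-box query then reduces to a dictionary lookup:
# build_gallery_index(photos, algorithms).get("people", [])
```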
According to the image classification method provided by the embodiment of the application, the image file obtained by shooting with the first device includes not only image data but also image generation information. In this way, the first device can directly use the image generation information to perform the image classification operation on the first image file; it does not need to perform an image recognition operation on the image data to obtain the image generation information, so the calculation amount in the image classification process can be reduced and the image classification efficiency can be improved.
Further, the feature information required for the first device to perform the image classification operation may include not only the above-mentioned image generation information but also classification feature information, i.e., feature information obtained by performing an image recognition operation on the image data (and therefore different from the image generation information). In that case, before performing the image classification operation on the first image file, the first device may also perform an image recognition operation on the image data to analyze it for the classification feature information, and then perform the image classification operation on the first image file using both the image generation information and the classification feature information. Specifically, as shown in fig. 7B, S502 may include S701-S702:
S701, the first device performs an image recognition operation on the image data to analyze the image data for classification feature information.
The first device may perform an image recognition operation on the image data by using another image classification algorithm (e.g., a second algorithm) different from the first algorithm, so as to analyze the image data to obtain classification feature information corresponding to the second algorithm. The second algorithm is different from the first algorithm, and the classification feature information is different from the image generation information. The method for the first device to perform the image recognition operation on the image data by using the second algorithm to obtain the classification feature information may refer to a method for performing the image recognition operation on the image data to obtain the classification feature information in the conventional technology, which is not described herein again in this embodiment of the present application.
S702, the first device performs image classification operation on the first image file by using the image generation information and the classification characteristic information.
The method for the first device to perform the image classification operation on the first image file by using the image generation information and the classification characteristic information may refer to a method for the device to perform the image classification operation on the image file according to the characteristic information of the image file in the conventional technology, and details are not repeated here in the embodiments of the present application.
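A sketch of S701-S702 in the same toy setting follows; the body of second_algorithm is a placeholder for whatever recognition model the device actually runs, and the category rules are illustrative only:

```python
def second_algorithm(image_data: bytes) -> dict:
    # Placeholder recognition step (S701); a real device might run a neural
    # network here to extract classification feature information.
    return {"object_labels": ["dog"]}

def classify_with_recognition(image_file: dict) -> str:
    generation_info = read_generation_info(image_file)     # stored at capture
    classification_info = second_algorithm(image_file["image_data"])  # S701
    # S702: classify on the union of both kinds of feature information.
    if "dog" in classification_info.get("object_labels", []):
        return "animals"
    return "people" if generation_info.get("face_count", 0) else "other"
```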
Further, as shown in fig. 7B, after S701, the method of the embodiment of the present application further includes S703:
S703, the first device stores the classification feature information in the first image file to obtain an updated first image file.
It is understood that after S703, the first image file includes image generation information and classification characteristic information therein. In the embodiment of the present application, the image generation information and the classification feature information are collectively referred to as feature information of the first image file. The feature information (image generation information and classification feature information) in the embodiment of the present application is stored in a preset field of the first image file. In the embodiment of the present application, taking the example that the preset field is the Maker Note field 302 shown in fig. 3, the format of the preset field and the manner of storing the feature information in the preset field are described as follows:
For example, the feature information in the embodiment of the present application may be stored in the Maker Note field in TIFF format. The data format for storing the feature information in the Maker Note field 302 includes, but is not limited to, TIFF; other data formats are not described in detail in the embodiments of the present application.
As shown in fig. 3, the Maker Note field 302 includes: a header 302a, a check field 302b, a Tagged Image File Format (TIFF) header 302c, and a TIFF field 302 d.
As shown in fig. 8, the header 302a is used to store vendor information. For example, "huawei" may be stored in the header 302 a; the Check field 302b is used to hold Check information that identifies the integrity and accuracy of the information held in the Maker Note field 302, for example, the Check information may be a Cyclic Redundancy Check (CRC). The TIFF header 302c is used to store TIFF instruction information for instructing that the format of the feature information stored in the TIFF field 302d is an Image File Directory (IFD) format. The TIFF field 302d is used to store feature information (e.g., image generation information and classification feature information).
In the embodiment of the application, after the device performs the image classification operation on the image and obtains new feature information (i.e., classification feature information), the feature information stored in the Maker Note field 302 may be updated (i.e., the Maker Note field 302 is modified). In this case, to prevent a legitimate update from making the check information in the check field 302b indicate that the Maker Note field 302 has been tampered with, the device may generate new check information for the check field 302b when updating the feature information stored in the Maker Note field 302.
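The byte layout and the regeneration of the check information can be sketched as follows. The 8-byte vendor header, the little-endian packing, and the choice of CRC32 are assumptions of this sketch; the embodiment only requires some cyclic redundancy check:

```python
import struct
import zlib

def build_maker_note(tiff_payload: bytes) -> bytes:
    header = b"huawei\x00\x00"                           # header 302a (8 bytes)
    check = struct.pack("<I", zlib.crc32(tiff_payload))  # check field 302b
    tiff_header = b"II*\x00"                             # TIFF header 302c
    return header + check + tiff_header + tiff_payload   # TIFF field 302d

def update_maker_note(new_tiff_payload: bytes) -> bytes:
    # After the device modifies the feature information, it rebuilds the
    # field so the check information matches the new payload and a
    # legitimate update is not read as tampering.
    return build_maker_note(new_tiff_payload)

def verify_maker_note(note: bytes) -> bool:
    stored_crc = struct.unpack_from("<I", note, 8)[0]    # just after the header
    return stored_crc == zlib.crc32(note[16:])           # payload starts at 16
```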
Please refer to fig. 8, which illustrates an example diagram of the data structure of the Maker Note field according to an embodiment of the present application. The format in which the feature information is held in the TIFF field 302d in fig. 8 is the IFD format.
Specifically, one or more IFDs may be stored in TIFF field 302 d. For example, as shown in FIG. 8, TIFF field 302d includes IFD0 and IFD 1. Taking IFD0 as an example, the IFD information in TIFF field 302d is described as follows:
the IFD0 includes a directory field and a data field, and the directory field of the IFD0 is used for storing a directory of sub-IFDs (e.g., sub-IFD 1 and sub-IFD 2, etc.) in the IFD0 and an end-of-connection tag of the IFD 0. The data field of the IFD0 is used to hold the child IFDs (e.g., child IFD1 and child IFD2, etc.). The end of attachment tag of the IFD0 is used to indicate where the IFD0 ends. The characteristic information in the embodiment of the present application may be stored in a directory, or may be stored in a data field.
Optionally, each sub-IFD may also include a directory field and a data field. For example, a child IFD1 in an IFD0 includes a directory field and a data field. For the functions of the directory field and the data field of the sub-IFD 1, reference may be made to the description of the directory field and the data field of the IFD in the embodiment of the present application, and details of the embodiment of the present application are not described herein again.
Assume that the TIFF field 302d includes three IFDs (e.g., IFD0-IFD2), and that IFD2 does not include a child IFD. Please refer to fig. 9, which illustrates a structural diagram of the directory of the IFD shown in fig. 8. As shown in fig. 9, the directory of an IFD includes a plurality of tag entries, each of which includes a tag identification (ID), a tag type, and a tag value.
The tag value in the embodiment of the present application may be feature information; alternatively, the characteristic information is stored in the data field of the IFD, and the tag value is the address offset of the characteristic information in the data field. For example, when one feature information is less than or equal to 4 bytes (byte), the tag value is the feature information; when a feature information is greater than 4 bytes, the feature information needs to be stored in the data field, and the tag value is the address offset of the feature information in the data field.
For example, taking 4 tag entries included in IFD0 as an example, the tag IDs of the four tag entries are 0x001, 0x002, 0x003, and 0x004, respectively. The tag type corresponding to tag ID 0x001 is Undefined, and its tag value indicates the shooting mode information of the image (for example, a tag value of 0 indicates that the image was shot in self-timer mode, and a tag value of 1 indicates that the image was shot in panorama mode). The tag type corresponding to tag ID 0x002 is unsigned byte, and its tag value indicates the camera type used for the image (for example, a tag value of 0 indicates the rear camera; a tag value of 1 indicates the front camera). The tag type corresponding to tag ID 0x003 is Undefined, and its tag value indicates face indication information (for example, a tag value of 0 indicates no face in the image; a tag value of 1 indicates a face is present). The tag type corresponding to tag ID 0x004 is unsigned byte, and its tag value is address offset 1, which is the address offset of the face position information in the data field of IFD0.
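The 12-byte tag-entry layout and the 4-byte inline rule can be sketched as follows. The little-endian layout and the numeric type codes are assumptions borrowed from standard TIFF; the embodiment does not fix them:

```python
import struct

TYPE_BYTE, TYPE_LONG, TYPE_UNDEFINED = 1, 4, 7   # standard TIFF type codes

def encode_tag(tag_id: int, tag_type: int, value: bytes,
               data_field: bytearray) -> bytes:
    if len(value) <= 4:
        inline = value.ljust(4, b"\x00")   # <= 4 bytes: the value IS the tag value
    else:
        inline = struct.pack("<I", len(data_field))  # > 4 bytes: address offset
        data_field.extend(value)                     # value goes to the data field
    return struct.pack("<HHI", tag_id, tag_type, len(value)) + inline

data_field = bytearray()
directory = [
    encode_tag(0x001, TYPE_UNDEFINED, b"\x01", data_field),  # panorama mode
    encode_tag(0x002, TYPE_BYTE, b"\x00", data_field),       # rear camera
    encode_tag(0x003, TYPE_UNDEFINED, b"\x01", data_field),  # face present
    encode_tag(0x004, TYPE_BYTE, bytes(8), data_field),      # face positions,
]                                                            # stored via offset
```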
When the first device performs an image classification operation on the first image file, the feature information required to be used may include feature information of a plurality of preset attributes. In general, the first device may employ different algorithms to identify image data in the first image file to derive the characteristic information for the corresponding attribute. The "preset multiple attributes" in the embodiment of the present application is determined by an image classification client (abbreviated as client) in the first device. Specifically, the "preset multiple attributes" are determined by attributes of feature information to be identified when the image classification client in the first device performs an image classification operation on an image.
For example, it is assumed that the attributes of the feature information that needs to be identified when the client of the first device performs an image classification operation on an image file include: face attributes, scene attributes, and mode attributes. The face attribute corresponds to the number of faces, the face indication information and the like, the scene attribute corresponds to the shooting scene information, and the mode attribute corresponds to the shooting mode information. Then, the preset attributes include a face attribute, a scene attribute, and a mode attribute. Wherein, the characteristic information of each attribute corresponds to an image classification algorithm. For example, the feature information of the face attribute corresponds to a face algorithm.
When the feature information of an attribute is complex or plentiful, it may be stored in a sub-IFD of an IFD to facilitate its extraction and storage. For example, the feature information of the face attribute may include the version of the face algorithm, the number of faces, face position information, and the like. Assume that the IFD0 includes three sub-IFDs (e.g., sub-IFD 1 to sub-IFD 3).
Referring to fig. 10, the structure of the directory of the sub-IFDs in the IFD0 shown in fig. 8 is shown. As shown in fig. 10, the directory of a sub-IFD includes a plurality of tag entries, each including a tag ID, a tag type, and a tag value. Taking the sub-IFD 1 of the IFD0 as an example, assume that the sub-IFD 1 includes 3 tag entries with tag IDs 0x001, 0x002, and 0x003, respectively. The tag type corresponding to tag ID 0x001 is unsigned long, and its tag value indicates the version of the face algorithm; the tag type corresponding to tag ID 0x002 is unsigned long, and its tag value indicates the number of faces. The tag type corresponding to tag ID 0x003 is unsigned byte, and its tag value is address offset 2, which is the address offset of the face position information in the data field of the sub-IFD 1 of the IFD0.
It should be noted that, in the embodiment of the present application, the tag IDs in the tag entries of different IFDs may be the same. Specifically, since the identifications (e.g., IDs) of different IFDs are different, the device can distinguish the tag entries of different IFDs even if two IFDs use the same tag ID. Likewise, the tag IDs in the tag entries of different sub-IFDs may be the same: since the identifications of different sub-IFDs are different, the device can distinguish them even when two sub-IFDs use the same tag ID. Also, the tag types in the embodiments of the present application include, but are not limited to, the aforementioned unsigned long, unsigned byte, and Undefined.
It is understood that when the first device adds new feature information in the TIFF field 302d, a tag entry, IFD, or sub-IFD may be added in the TIFF field 302d for saving the feature information of the new attribute. For example, IFD2 is added to TIFF field 302d shown in FIG. 8.
Please refer to fig. 11, which illustrates a schematic diagram of the principle of performing an image classification operation according to an embodiment of the present application. As shown in fig. 11, when the device performs an image classification operation on an image file, the image classification engine first reads the feature information stored in the Maker Note field of the image file, and the read information is parsed by a Maker Note analyzer and an EXIF analyzer (i.e., 1101). For the feature information of the remaining attributes (i.e., feature information not stored in the Maker Note field), an image classification algorithm (i.e., 1102) may be used to identify the image data and obtain that feature information (i.e., new feature information 1104). The engine may then perform the image classification operation (i.e., 1103) on the image using both the feature information read in 1101 and the new feature information 1104. In addition, the image classification engine may update the feature information stored in the Maker Note field with the new feature information 1104 (i.e., 1105) to obtain an updated image file.
It will be appreciated that the first device may acquire the image file before performing the image classification operation on the image file. Specifically, as shown in fig. 12, the method in the embodiment of the present application includes S1200:
S1200, the first device acquires the first image file.
The method for acquiring the first image file includes, but is not limited to, capturing it as shown in S501; the first device may also receive an image file sent by a second device. That is, S1200 may include S1200a: the first device receives the first image file sent by the second device. The first image file that the first device receives from the second device differs across the following implementations a-d:
the implementation mode a: the second device takes the first image file in the manner of taking the image file as shown in S501. The first device receives a first image file including image generation information from the second device.
The implementation mode b: the second device obtains a first image file by shooting in the mode of shooting the image file shown in S501; and the second device executes the image classification method provided by the embodiment of the application to perform the image classification operation on the first image file. After the second device performs the image classification operation on the first image file, the classification characteristic information of the image data in the first image file is saved in the first image file. Namely, the first image file received by the first device from the second device comprises image generation information and classification characteristic information.
The implementation mode c: the second device does not have the function of capturing an image file in the manner shown in S501, and the first device receives the first image file from the second device without including the image generation information and without including the classification feature information.
The implementation mode d: the second device does not have the function of capturing an image file in the manner shown in S501; however, the second device executes the image classification method provided by the embodiment of the application to perform the image classification operation on the first image file. After the second device performs the image classification operation on the first image file, the first image file comprises the classification characteristic information and does not comprise the image generation information.
After S502, when the first device performs the image classification operation on the first image file again, or performs it on a first image file received from the second device, the first device may first read the feature information saved in the first image file. When it determines that first feature information (image generation information and/or classification feature information) is already stored there, the first device does not need to perform an image recognition operation on the image data using a third algorithm to obtain the first feature information corresponding to that algorithm; it directly uses the stored first feature information to perform the image classification operation on the first image file. By skipping the identification of the image data with the third algorithm, the calculation amount of the image classification operation can be reduced. Specifically, after S502 or S1200a, the method of the embodiment of the present application further includes S1201-S1205. For example, as shown in fig. 12, after S1200, the method further includes S1201-S1205:
S1201, the first device acquires the feature information in the first image file.
The feature information (such as image generation information and classification feature information) in the embodiment of the present application is stored in a preset field (such as a Maker Note field) of the first image file. For a specific storage manner of the feature information in the image file, reference may be made to detailed descriptions in fig. 8 to 10 in this embodiment of the application, which is not described herein again.
S1202, the first device judges whether the first image file comprises first characteristic information.
Specifically, when the first device determines that the first image file includes the first feature information, that is, the first feature information is stored in a preset field (for example, a Maker Note field) of the first image file, the first device may not use the third algorithm to identify the image data to obtain the first feature information (i.e., S1203 is skipped), and directly perform the image classification operation using the first feature information (i.e., S1205 is performed).
When the first device determines that the first image file does not include the first feature information, that is, the first feature information is not stored in a preset field (e.g., a Maker Note field) of the first image file, the first device may identify the image data in the first image file by using a third algorithm to obtain the first feature information (i.e., execute S1203), and then perform an image classification operation by using the first feature information obtained in the execution of S1203 (i.e., execute S1205).
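A sketch of the S1201-S1205 decision under the earlier toy structures follows (the helper names and the final category rule are illustrative assumptions):

```python
import json

def read_features(image_file: dict) -> dict:                   # S1201
    return json.loads(image_file["maker_note"].decode("utf-8"))

def save_features(image_file: dict, features: dict) -> None:   # S1204
    image_file["maker_note"] = json.dumps(features).encode("utf-8")

def classify_first_image_file(image_file: dict, attribute: str,
                              third_algorithm) -> str:
    features = read_features(image_file)
    if attribute not in features:                 # S1202: not on file
        features[attribute] = third_algorithm(image_file["image_data"])  # S1203
        save_features(image_file, features)       # S1204: update the file
    # Otherwise S1203 is skipped and the stored value is used directly.
    return "people" if features.get("face_count", 0) else "other"  # S1205
```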
The first feature information may be feature information of a first attribute, and the third algorithm may be an image classification algorithm for identifying the feature information of the first attribute. For example, it is assumed that the feature information of an image can be classified into: feature information of the attribute a and feature information of the attribute b; then the plurality of preset attributes may include: the attribute a and the attribute b. For example, the feature information of one image may be divided into feature information of a shooting attribute and feature information of a face attribute according to the attribute of the feature information.
When the feature information of the attribute a can be divided into the following sub-attributes according to the feature information: when the attribute a-sub attribute 1 and the attribute a-sub attribute 2 are the feature information, the preset attribute may include: attribute a-child attribute 1, attribute a-child attribute 2, and attribute b. For example, the feature information of the above-described shooting attributes may include shooting mode information, shooting scene information, and the like; the feature information of the face attribute may be divided into: face indication information, the version of the face algorithm and the number of faces.
The Maker Note field of the first image file may store feature information of the image data according to different attributes. For example, as shown in fig. 8, the feature information of one attribute is stored in one IFD, and the attribute of the feature information stored in different IFDs is different.
For example, the IFD0 shown in fig. 8 may be used to store feature information of shooting attributes such as the above-described shooting mode information and shooting scene information, the shooting mode information being stored in the sub-IFD 1 in the IFD0, and the shooting scene information being stored in the sub-IFD 2 in the IFD 0; the IFD1 shown in fig. 8 can be used to store feature information of face attributes (version of face algorithm, number of faces, face position information, etc.), the version of face algorithm is stored in the sub-IFD 1 in the IFD1, the number of faces is stored in the sub-IFD 2 in the IFD1, and the face position information is stored in the sub-IFD 3 in the IFD 1.
The attributes of the feature information stored by each IFD may be predetermined. For example, each IFD stores feature information of a tag corresponding to an attribute. Wherein, a plurality of labels can be stored in each IFD, and each label comprises a label ID, a label type and a label value.
When the feature information of the first attribute is not stored in the Maker Note field (more precisely, in the TIFF field of the Maker Note field), the IFD of the TIFF field may simply not include a tag corresponding to the first attribute. Of course, in the embodiment of the present application, the absence of the feature information of the first attribute may also be indicated by setting its tag value to null (e.g., Null).
S1203, the first device identifies image data in the first image file by using a third algorithm to obtain first feature information.
The method for identifying the image data of the first image file by the first device using the image classification algorithm (e.g., the third algorithm) to obtain the first feature information may refer to a method for identifying the feature information of the image data using the image classification algorithm when the device performs the image classification operation on the image in the conventional technology, and is not described herein again in the embodiments of the present application.
S1204, the first device saves the first characteristic information in the first image file to obtain an updated first image file.
Since the attribute of the feature information stored in each IFD may be predetermined (for example, each tag ID corresponds to the feature information of one attribute), the first device may store the feature information of the first attribute as the tag value corresponding to the tag ID of the first attribute in the TIFF field of the Maker Note field.
S1205, the first device performs an image classification operation on the first image file using the first feature information.
The method for performing the image classification operation on the first image file by the first device using the first feature information may refer to a method for performing the image classification operation on the image file by the device according to the feature information of the image file in the conventional technology, and is not described herein again in the embodiments of the present application. For a principle of executing the image classification operation provided in the embodiment of the present application, reference may be made to the principle schematic diagram shown in fig. 11, which is not described herein again in the embodiment of the present application.
In the image classification method provided by the embodiment of the application, when the first device performs the image classification operation on the first image file, the first device may first acquire the feature information in the first image file, and determine whether the first image file includes the first feature information; when the first image file includes the first feature information, then identifying image data of the first image file using the third algorithm may be skipped to obtain the first feature information. Therefore, the calculation amount in the image classification process can be reduced, and the image classification efficiency can be improved.
It will be appreciated that the version of an algorithm used to perform the image classification operation is continually updated over time, and the feature information obtained by identifying the image data of the first image file with different versions of the same algorithm may differ. Based on this, the classification feature information also includes the version of the image classification algorithm that produced it. In this case, even if the first feature information is stored in the preset field, the version of the algorithm that identified it is not necessarily the same as the version of the third algorithm. In view of this, after S1202, when the first feature information is included in the first image file, the method of the embodiment of the present application further includes S1301, and the above-described S1204 may be replaced with S1204a:
S1301, the first device judges whether the version of the algorithm that identified the first feature information is the same as the version of the third algorithm.
Specifically, when the version of the algorithm for identifying the first feature information is the same as the version of the third algorithm, the first device may skip identifying the image data in the first image file by using the third algorithm to obtain the first feature information (i.e., skip S1203), and perform an image classification operation on the first image file directly using the first feature information (i.e., perform S1205).
When the version of the algorithm for identifying the first feature information is different from the version of the third algorithm, the first device may identify the image data by using the third algorithm to obtain the first feature information (i.e., perform S1203), and then perform an image classification operation on the first image file by using the first feature information obtained by performing S1203 (i.e., perform S1205).
S1204a, the first device updates the first feature information stored in the first image file with the newly identified first feature information to obtain an updated first image file.
It can be understood that, when the first image file does not include the first feature information, the method for updating the saved feature information in the first image file by the first device is as follows: the first device adds first feature information in a preset field of the first image file.
When the first image file comprises the first characteristic information but the version of the algorithm for identifying the first characteristic information is different from the version of the third algorithm, the method for updating the characteristic information stored in the first image file by the first device comprises the following steps: and the first device replaces the first characteristic information stored in the preset field of the first image file with the identified first characteristic information.
Further, the version of the algorithm for identifying the first feature information is different from the version of the third algorithm, and can be divided into two cases: (1) identifying that the version of the algorithm of the first feature information is lower than the version of the third algorithm; (2) it is identified that the version of the algorithm of the first characteristic information is higher than the version of the third algorithm. Specifically, after S1301, when it is recognized that the version of the algorithm of the first feature information is different from the version of the third algorithm, the method in the embodiment of the present application further includes S1401:
S1401, the first device determines whether the version of the algorithm that identified the first feature information is lower than the version of the third algorithm.
Specifically, when the version of the algorithm for identifying the first feature information is lower than the version of the third algorithm, the first device may identify the image data in the first image file by using the third algorithm with the higher version to obtain the feature information of the first attribute (i.e., execute S1203), and then perform an image classification operation on the first image file by using the first feature information obtained by executing S1203 (i.e., execute S1205).
When the version of the algorithm for identifying the first feature information is higher than the version of the third algorithm, the first device does not need to identify the image data by using the third algorithm of the lower version to obtain the feature information of the first attribute, and can skip identifying the image data by using the third algorithm to obtain the feature information of the first attribute (i.e., skip S1203), and directly perform the image classification operation on the first image file by using the first feature information (i.e., perform S1205).
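The version check of S1301/S1401 can be sketched as below, reusing the read_features and save_features helpers from the earlier sketch; storing a per-feature version record alongside each value is an assumption of this sketch:

```python
def get_first_feature(image_file: dict, attribute: str,
                      algorithm, algorithm_version: int):
    features = read_features(image_file)
    stored = features.get(attribute)      # e.g. {"version": 2, "value": ...}
    if stored is not None and stored["version"] >= algorithm_version:
        return stored["value"]            # same or newer version: skip S1203
    # Missing, or produced by an older algorithm version: re-identify (S1203)
    value = algorithm(image_file["image_data"])
    features[attribute] = {"version": algorithm_version, "value": value}
    save_features(image_file, features)   # S1204a: replace the stale entry
    return value
```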
According to the image classification method provided by the embodiment of the application, the first device can update the feature information stored in the preset field by using the feature information identified by the algorithm after the version update in time after the version of the algorithm used for executing the image classification operation is updated. Therefore, when the image classification operation is executed on the image again, the updated feature information stored in the preset field can be directly used, the calculated amount in the image classification process can be reduced, and the image classification efficiency can be improved.
It is to be understood that the first device, the second device, and the like, include hardware structures and/or software modules for performing the respective functions in order to realize the functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
In the embodiment of the present application, the first device may be divided into the functional modules according to the method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
In the case of dividing each functional module by corresponding functions, as shown in fig. 13, the present embodiment provides an apparatus 1300, where the apparatus 1300 is a first apparatus in the above method embodiment. The apparatus 1300 comprises: an acquisition unit 1301, a classification unit 1302, and a display unit 1303.
The obtaining unit 1301 is configured to support the device 1300 to perform S501, S1200, and S1201 in the foregoing method embodiment, and/or other processes for the technology described herein. The above classification unit 1302 is configured to enable the apparatus 1300 to perform S502, S701, S702, S1203, S1205 in the above method embodiment, and/or other processes for the techniques described herein. The display unit 1303 described above is used to support the apparatus 1300 to perform S503 in the above-described method embodiments, and/or other processes for the techniques described herein.
Further, the apparatus 1300 further includes an updating unit. The update unit is configured to enable the apparatus 1300 to perform the above method embodiments S703, S1204a, and/or other processes for the techniques described herein.
Further, the apparatus 1300 further includes a determining unit. The determination unit is used to enable the device 1300 to perform S1202, S1301, S1401 of the above-described method embodiments, and/or other processes for the techniques described herein.
Of course, the apparatus 1300 may further include other unit modules. For example, the apparatus 1300 may further include a storage unit for storing the first image file. Alternatively, the first image file may be stored in a cloud server, and the apparatus 1300 may perform the image classification operation on image files in the cloud server. The apparatus 1300 may further include a transceiving unit, through which the apparatus 1300 interacts with other devices, for example, to transmit an image file to another device or receive an image file sent by another device.
In the case of using an integrated unit, the obtaining unit 1301, the classifying unit 1302, and the like may be integrated into a processing module; the transceiving unit may be an RF circuit, a Wi-Fi module, or a Bluetooth module of the device 1300; the storage unit may be a storage module of the device 1300; and the display unit 1303 may be a display module, such as a display (touch screen).
Fig. 14 shows a schematic diagram of a possible structure of the terminal involved in the above embodiment. The apparatus 1400 comprises: a processing module 1401, a storage module 1402 and a display module 1403.
The processing module 1401 is used for controlling and managing the device 1400. The display module 1403 is used for displaying image files and their classification results. The storage module 1402 is used for storing program code and data of the device 1400. The device 1400 may also include a communication module 1404 configured to communicate with other devices, for example, to receive image files from or transmit messages or image files to other devices.
The processing module 1401 may be a Processor or a controller, such as a CPU, a general purpose Processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor may also be a combination of computing functions, e.g., comprising one or more microprocessors, DSPs, and microprocessors, among others. The communication module 1404 may be a transceiver, a transceiver circuit, a communication interface, or the like. The storage module 1402 may be a memory.
When the processing module 1401 is a processor (such as the processor 101 shown in fig. 1), the communication module 1404 is a radio frequency circuit (such as the radio frequency circuit 102 shown in fig. 1), the storage module 1402 is a memory (such as the memory 103 shown in fig. 1), and the display module 1403 is a touch screen (including the touch pad 104-1 and the display panel 104-2 shown in fig. 1), the device provided by the present application can be the mobile phone 100 shown in fig. 1.
An embodiment of the present application further provides a control device, including a processor and a memory, where the memory is used to store computer program codes, and the computer program codes include computer instructions, and when the processor executes the computer instructions, the image classification method according to the embodiment of the method is performed.
The embodiment of the present application further provides a computer storage medium, in which computer program codes are stored, and when the processor executes the computer program codes, the device executes the relevant method steps in fig. 5 or fig. 12 to implement the method in the foregoing embodiment.
The embodiments of the present application also provide a computer program product, which when run on a computer causes the computer to execute the relevant method steps in fig. 5 or fig. 12 to implement the method in the above embodiments.
The device 1300 and the device 1400, the computer storage medium, or the computer program product provided in the present application are all configured to execute the corresponding methods provided above, so that beneficial effects achieved by the device may refer to beneficial effects in the corresponding methods provided above, and are not described herein again.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (30)

  1. An image classification method, comprising:
    the method comprises the steps that the equipment captures image data, acquires image generation information when the image data is captured, and generates a first image file comprising the image data and the image generation information;
    the equipment executes image classification operation on the first image file by utilizing the image generation information;
    the device displays the first image file in a category directory of a gallery in response to an operation for viewing the gallery.
  2. The method according to claim 1, wherein the image generation information includes at least one of information of a photographing parameter, information of a photographing mode, information of a photographing scene, and information of a camera type.
  3. The method of claim 2, wherein the photographing parameters include an exposure value, the photographing modes include a panorama mode and a normal mode, the photographing scenes include a character photographing scene, a building photographing scene, and a nature scene photographing scene, and the camera type indicates that the image data is captured using a front camera or a rear camera;
    when the shooting scene is the character shooting scene, the image generation information further comprises character characteristic information.
  4. The method according to any one of claims 1 to 3, wherein the image generation information is not obtained by performing an image recognition operation.
  5. The method of any of claims 1-4, wherein the device performs an image classification operation on the first image file using the image generation information, comprising:
    the equipment executes image recognition operation on the image data to analyze the image data to obtain classification characteristic information;
    the device performs an image classification operation on the first image file using the image generation information and the classification feature information.
  6. The method of claim 5, wherein after the device performs an image recognition operation on the image data to analyze the image data for classification feature information, the method further comprises:
    and the equipment saves the classification characteristic information in the first image file to obtain an updated first image file.
  7. The method according to any of claims 1-6, wherein the format of the first image file is exchangeable image file format EXIF.
  8. The method of any of claims 1-7, wherein the image generation information is stored in a vendor comment field of the first image file.
  9. The method of claim 8, wherein the image generation information is stored in the vendor comment field in a Tagged Image File Format (TIFF).
  10. An apparatus, comprising:
    an acquisition unit configured to capture image data, acquire image generation information at the time of capturing the image data, and generate a first image file including the image data and the image generation information;
    a classification unit configured to perform an image classification operation on the first image file using the image generation information acquired by the acquisition unit;
    and the display unit is used for responding to the operation for viewing the gallery and displaying the first image file in a classified mode according to the classification result of the classification unit.
  11. The apparatus according to claim 10, wherein the image generation information acquired by the acquisition unit includes at least one of information of a shooting parameter, information of a shooting mode, information of a shooting scene, and information of a camera type.
  12. The apparatus of claim 11, wherein the photographing parameters include an exposure value, the photographing modes include a panorama mode and a normal mode, the photographing scenes include a character photographing scene, a building photographing scene, and a nature scene photographing scene, and the camera type indicates that the image data is captured using a front camera or a rear camera;
    when the shooting scene is the character shooting scene, the image generation information further comprises character characteristic information.
  13. The apparatus according to any of claims 10-12, wherein the classification unit is further configured to perform an image recognition operation on the image data;
    wherein the image generation information is not obtained by the classification unit recognizing the image data.
  14. The apparatus according to any of claims 10 to 13, wherein the classification unit is specifically configured to perform an image recognition operation on the image data to analyze the image data to obtain classification feature information; and executing image classification operation on the first image file by using the image generation information and the classification characteristic information.
  15. The apparatus of claim 14, further comprising:
    and the updating unit is used for storing the classification characteristic information in the first image file to obtain an updated first image file after the classification unit executes image recognition operation on the image data to analyze the image data to obtain the classification characteristic information.
  16. The apparatus according to any of claims 10-15, characterized in that the format of the first image file is the exchangeable image file format EXIF.
  17. The apparatus of any of claims 10-16, wherein the image generation information is stored in a vendor comment field of the first image file.
  18. The apparatus of claim 17, wherein the image generation information is stored in the vendor comment field in a Tagged Image File Format (TIFF).
  19. An apparatus, comprising: a processor, a memory, a camera and a display; the memory and the display are coupled to the processor, the display for displaying image files, the memory including a non-volatile storage medium, the memory for storing computer program code, the computer program code including computer instructions that, when executed by the processor,
    the camera is used for capturing image data;
    the processor is used for acquiring image generation information when the camera captures the image data and generating a first image file comprising the image data and the image generation information; performing an image classification operation on the first image file using the image generation information;
    the display is used for responding to the operation of viewing the gallery and displaying the first image files in a classified mode.
  20. The apparatus of claim 19, wherein the image generation information acquired by the processor comprises at least one of information of a photographing parameter, information of a photographing mode, information of a photographing scene, and information of a camera type.
  21. The apparatus of claim 20, wherein the photographing parameters include an exposure value, the photographing modes include a panorama mode and a normal mode, the photographing scenes include a character photographing scene, a building photographing scene, and a nature scene photographing scene, and the camera type indicates that the image data is captured using a front camera or a rear camera;
    when the shooting scene is the character shooting scene, the image generation information further comprises character characteristic information.
  22. The apparatus of any of claims 19-21, wherein the image generation information is not a result of the processor performing an image recognition operation on the image data.
  23. The device according to any of claims 19-22, wherein the processor is specifically configured to perform an image recognition operation on the image data to analyze the image data for classification feature information; and executing image classification operation on the first image file by using the image generation information and the classification characteristic information.
  24. The apparatus of claim 23, wherein the processor is further configured to save the classification characteristic information in the first image file to obtain an updated first image file after performing an image recognition operation on the image data to analyze the image data to obtain the classification characteristic information.
  25. The apparatus according to any of claims 19-24, wherein the format of the first image file is exchangeable image file format EXIF.
  26. The apparatus of any of claims 19-25, wherein the image generation information is stored in a vendor comment field of the first image file.
  27. The apparatus of claim 26, wherein the image generation information is stored in the vendor comment field in a Tagged Image File Format (TIFF).
  28. A control device, characterized in that the control device comprises a processor and a memory for storing computer program code comprising computer instructions which, when executed by the processor, perform the method according to any of claims 1-9.
  29. A computer storage medium comprising computer instructions that, when executed on a device, cause the device to perform the method of any one of claims 1-9.
  30. A computer program product, characterized in that, when the computer program product is run on a computer, it causes the computer to perform the method according to any of claims 1-9.
CN201880085333.5A 2018-02-09 2018-02-09 Image classification method and device Pending CN111566639A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/076081 WO2019153286A1 (en) 2018-02-09 2018-02-09 Image classification method and device

Publications (1)

Publication Number Publication Date
CN111566639A (en)

Family

ID=67549189

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880085333.5A Pending CN111566639A (en) 2018-02-09 2018-02-09 Image classification method and device

Country Status (2)

Country Link
CN (1) CN111566639A (en)
WO (1) WO2019153286A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113986096A (en) * 2021-12-29 2022-01-28 北京亮亮视野科技有限公司 Interaction method, interaction device, electronic equipment and storage medium
CN115327562A (en) * 2022-10-16 2022-11-11 常州海图信息科技股份有限公司 Handheld visual laser rangefinder

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111191522A (en) * 2019-12-11 2020-05-22 武汉光庭信息技术股份有限公司 Image scene information storage method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103685815A (en) * 2012-09-20 2014-03-26 卡西欧计算机株式会社 Image classifying apparatus, electronic album creating apparatus, image classifying method, and program
CN105138578A (en) * 2015-07-30 2015-12-09 北京奇虎科技有限公司 Sorted storage method for target picture and terminal employing sorted storage method
CN105302872A (en) * 2015-09-30 2016-02-03 努比亚技术有限公司 Image processing device and method
CN105824859A (en) * 2015-01-09 2016-08-03 中兴通讯股份有限公司 Picture classification method and device as well as intelligent terminal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007121548A (en) * 2005-10-26 2007-05-17 Olympus Imaging Corp Device, program, and method for image management, and recording medium
KR101532294B1 (en) * 2008-12-18 2015-07-01 삼성전자주식회사 Apparatus and method for tagging image
EP3706015A4 (en) * 2017-12-05 2020-09-23 Huawei Technologies Co., Ltd. Method and device for displaying story album

Also Published As

Publication number Publication date
WO2019153286A1 (en) 2019-08-15

Similar Documents

Publication Publication Date Title
CN110495819B (en) Robot control method, robot, terminal, server and control system
US20110184980A1 (en) Apparatus and method for providing image
WO2018086262A1 (en) Method for acquiring photographing reference data, mobile terminal and server
WO2018184260A1 (en) Correcting method and device for document image
US20230284000A1 (en) Mobile information terminal, information presentation system and information presentation method
CN111566639A (en) Image classification method and device
US20220215050A1 (en) Picture Search Method and Device
KR101458305B1 (en) Communication system, information terminal, communication method and recording medium
CN115115679A (en) Image registration method and related equipment
WO2022160993A1 (en) Method and device for multimedia data sharing
CN107707816A (en) A kind of image pickup method, device, terminal and storage medium
US20210406524A1 (en) Method and device for identifying face, and computer-readable storage medium
KR20150027934A (en) Apparatas and method for generating a file of receiving a shoot image of multi angle in an electronic device
WO2021088497A1 (en) Virtual object display method, global map update method, and device
CN114842069A (en) Pose determination method and related equipment
KR100798917B1 (en) Digital photo contents system and method adn device for transmitting/receiving digital photo contents in digital photo contents system
CN110532474B (en) Information recommendation method, server, system, and computer-readable storage medium
US8477215B2 (en) Wireless data module for imaging systems
CN113313966A (en) Pose determination method and related equipment
CN115134316B (en) Topic display method, device, terminal and storage medium
CN112560612B (en) System, method, computer device and storage medium for determining business algorithm
CN114338642B (en) File transmission method and electronic equipment
CN112989092A (en) Image processing method and related device
CN114500737B (en) Picture collection method and device and computer storage medium
CN111124539A (en) Initial scene resource file searching method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination