CN111080722A - Color migration method and system based on significance detection - Google Patents

Color migration method and system based on significance detection

Info

Publication number
CN111080722A
Authority
CN
China
Prior art keywords
color
theme
foreground
subject
colors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911281152.2A
Other languages
Chinese (zh)
Other versions
CN111080722B (en)
Inventor
高成英
刘颀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sun Yat Sen University
Original Assignee
Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sun Yat Sen University filed Critical Sun Yat Sen University
Priority to CN201911281152.2A priority Critical patent/CN111080722B/en
Publication of CN111080722A publication Critical patent/CN111080722A/en
Application granted granted Critical
Publication of CN111080722B publication Critical patent/CN111080722B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/46: Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V 10/462: Salient features, e.g. scale invariant feature transforms [SIFT]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a color migration method and system based on saliency detection, comprising the following steps: distinguishing the foreground and the background of an input image; calculating the theme colors of the foreground and the background; merging the foreground theme colors and the background theme colors; assigning each pixel of the input image to a theme color in the final theme color spectrum; and recoloring the input image according to the user's requirements. By distinguishing the foreground and the background of the input image and extracting theme colors for each separately, theme color extraction becomes more accurate and better suited to cloth images. The theme colors can be flexibly adjusted according to user requirements, which reduces the cost of producing cloth and helps the user efficiently find the most satisfactory color matching scheme. In addition, the method applies target colors precisely to specific image regions, largely preserves the texture characteristics of the cloth, is computationally efficient, and is suitable for industrial scenarios.

Description

Color migration method and system based on significance detection
Technical Field
The invention relates to the technical field of image processing, in particular to a color migration method and system based on significance detection.
Background
In the current market, large numbers of cloth samples are produced in order to examine different color matching effects or to let designers and users choose among them; producing samples in this way, however, consumes a great deal of manpower and material resources. A method that uses a computer to simulate the effects of different color matching schemes on cloth therefore has high application value.
However, existing color migration algorithms mainly obtain a target color from a target image and then migrate it to the original image as a whole, so local colors of the image cannot be changed independently and the texture features of the cloth image are not well preserved.
Disclosure of Invention
In view of the above, the present invention provides a color migration method and system based on saliency detection, which can flexibly adjust theme colors according to user requirements and reduce the cost required for producing cloth.
The technical scheme of the invention is realized as follows:
a color migration method based on significance detection specifically comprises the following steps:
step S1, inputting an image, and distinguishing the foreground and the background of the input image through saliency detection;
step S2, calculating the theme colors of the foreground and the background based on the color space;
step S3, combining the foreground subject color and the background subject color to obtain a final subject color spectrum;
step S4, assigning each pixel of the input image to a theme color in the final theme color spectrum according to color difference;
and step S5, modifying the final theme color spectrum according to the user requirement to obtain a modified theme color spectrum, and recoloring the input image according to the modified theme color spectrum.
As a further alternative of the color migration method based on saliency detection, the step S1 includes the steps of:
step S11, obtaining a saliency map of the input image by adopting a cluster-based collaborative saliency detection method;
step S12, determining a pixel threshold for dividing the foreground region and the background region;
step S13, the saliency map of the input image is divided by the determined pixel threshold, and a foreground region and a background region of the input image are obtained.
As a further alternative of the color migration method based on saliency detection, the step S12 includes the steps of:
step S121, acquiring a full image gray scale minimum value and a full image gray scale span of the saliency map;
step S122, obtaining the pixel threshold by the formula: pixel threshold = full-image gray-scale minimum + full-image gray-scale span × 1/5.
As a further alternative of the color migration method based on saliency detection, the step S2 is to calculate the theme colors of the foreground and the background based on the HSV color interval, and specifically includes the following steps:
step S21, dividing the whole HSV space into 42 color intervals;
step S22, calculating values of H, S, and V for each pixel of the input image;
step S23, judging the section where H, S and V of each pixel are located according to the divided 42 color sections;
step S24, counting, for the foreground and the background of the input image respectively, the number of pixels falling into each color interval, and calculating HSV mean values of the covered pixels as the initial foreground and background subject colors.
As a further alternative of the color migration method based on saliency detection, the merging in step S3 includes automatic merging or manual merging.
As a further alternative to the saliency detection based color migration method, the automatic merging comprises the steps of:
calculating color differences between the subject colors;
the color difference threshold is set to 150.0;
and judging whether the color difference between two theme colors is smaller than the threshold; if so, retaining the theme color whose color interval covers more pixels.
As a further alternative to the saliency detection based color migration method, the manual merging comprises the steps of:
manually setting the number of theme colors by a user;
and merging subject colors in order of increasing color difference until the number of subject colors equals the set number.
As a further alternative of the color migration method based on saliency detection, the step S4 includes the steps of:
step S41, calculating the color difference between each pixel of the input image and each subject color of the final subject color spectrum;
step S42, assigning each pixel to the coverage area of the subject color with which it has the smallest color difference.
As a further alternative of the color migration method based on saliency detection, the step S5 includes the steps of:
step S51, the user replaces one or more subject colors in the final subject color spectrum to obtain a modified subject color spectrum;
step S52, calculating an RGB difference value of the target color of the modified theme color spectrum and the original color of the final theme color spectrum;
and step S53, adding the RGB difference to all pixels covered by the original color, repeating for each replaced color until all modified subject colors have been applied, thereby completing the recoloring of the image.
A color migration system based on saliency detection, said system applying any of the above methods.
The invention has the beneficial effects that: with the method, the foreground and the background of the input image are distinguished and their theme colors are extracted separately, so theme color extraction is more accurate and better suited to cloth images; the theme colors can be flexibly adjusted according to user requirements, which reduces the cost of producing cloth and helps the user efficiently find the most satisfactory color matching scheme; in addition, the method applies target colors precisely to specific image regions, largely preserves the cloth texture, is computationally efficient, and is suitable for industrial scenarios.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and other drawings can be derived from them by those skilled in the art without creative effort.
FIG. 1 is a flow chart of a color migration method based on saliency detection according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below. It is obvious that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the given embodiments without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 1, a color migration method based on saliency detection specifically includes the following steps:
step S1, inputting an image, and distinguishing the foreground and the background of the input image through saliency detection;
step S2, calculating the theme colors of the foreground and the background based on the color space;
step S3, combining the foreground subject color and the background subject color to obtain a final subject color spectrum;
step S4, assigning each pixel of the input image to a theme color in the final theme color spectrum according to color difference;
and step S5, modifying the final theme color spectrum according to the user requirement to obtain a modified theme color spectrum, and recoloring the input image according to the modified theme color spectrum.
In this embodiment, the foreground and the background of the input image are distinguished and their theme colors are extracted separately, so theme color extraction is more accurate and better suited to cloth images; the theme colors can be flexibly adjusted according to user requirements, which reduces the cost of producing cloth and helps the user efficiently find the most satisfactory color matching scheme; in addition, the method applies target colors precisely to specific image regions, largely preserves the cloth texture, is computationally efficient, and is suitable for industrial scenarios.
Preferably, the step S1 includes the steps of:
step S11, obtaining a saliency map of the input image by adopting a cluster-based collaborative saliency detection method;
step S12, determining a pixel threshold for dividing the foreground region and the background region;
step S13, the saliency map of the input image is divided by the determined pixel threshold, and a foreground region and a background region of the input image are obtained.
In this embodiment, the saliency map measures how strongly different image regions attract a viewer's attention: the foreground usually attracts more attention and therefore has higher saliency, while the background has lower saliency. Adopting a co-saliency detection algorithm makes the method efficient and general, since it can process multiple images at once and uses the differences and similarities between images to improve detection accuracy, while remaining applicable to a single image. Pixels whose gray value in the saliency map is smaller than the threshold are assigned to the background region, and pixels whose gray value is larger than the threshold are assigned to the foreground region.
Preferably, the step S12 includes the steps of:
step S121, acquiring a full image gray scale minimum value and a full image gray scale span of the saliency map;
step S122, obtaining the pixel threshold by the formula: pixel threshold = full-image gray-scale minimum + full-image gray-scale span × 1/5.
In this embodiment, the most suitable pixel threshold for dividing foreground and background differs from image to image; computing the threshold from the full-image gray-scale minimum and the full-image gray-scale span makes it applicable to most images, giving the method of the present invention wider applicability.
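The thresholding of steps S11 to S13 can be sketched as follows. This is a minimal illustration only, assuming the saliency map is already available as a grayscale array; the function and variable names are illustrative and not taken from the patent.

import numpy as np

def split_foreground_background(saliency_map: np.ndarray):
    """Split an image into foreground/background masks using its saliency map.

    saliency_map: 2-D array of gray values, higher = more salient.
    Returns boolean masks (foreground, background).
    """
    g_min = float(saliency_map.min())            # full-image gray-scale minimum
    g_span = float(saliency_map.max()) - g_min   # full-image gray-scale span
    threshold = g_min + g_span / 5.0             # threshold = minimum + span * 1/5

    foreground = saliency_map > threshold        # salient pixels form the foreground
    background = ~foreground                     # the remaining pixels form the background
    return foreground, background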
Preferably, the step S2 is to calculate the theme colors of the foreground and the background based on the HSV color interval, and specifically includes the following steps:
step S21, dividing the whole HSV space into 42 color intervals;
step S22, calculating values of H, S, and V for each pixel of the input image;
step S23, judging the section where H, S and V of each pixel are located according to the divided 42 color sections;
step S24, counting, for the foreground and the background of the input image respectively, the number of pixels falling into each color interval, and calculating HSV mean values of the covered pixels as the initial foreground and background subject colors.
In this embodiment, since the HSV color space is a vision-oriented color model, calculating the theme colors of the foreground and the background over HSV color intervals better matches human visual perception. The whole HSV space is divided into 42 color intervals by combining the per-channel intervals defined in formulas (1), (2) and (3):
[Formulas (1), (2) and (3), which define the interval partitions of the H, S and V channels, appear only as images in the original publication and are not reproduced here.]
In addition, after counting the number of pixels in each color interval for the foreground and the background separately, the five intervals covering the most foreground pixels and the single interval covering the most background pixels are selected; the HSV mean values of the pixels belonging to these six intervals are computed as the initial theme color spectrum, which therefore contains five foreground colors and one background color. Note that H denotes hue, S denotes saturation, and V denotes value (lightness).
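The per-region theme color extraction of steps S21 to S24 can be sketched as below. The exact 42-interval partition of formulas (1) to (3) is not reproduced in the text, so the binning used here (7 hue × 2 saturation × 3 value intervals, which also yields 42 bins) is only an illustrative assumption, as are the helper names.

import cv2
import numpy as np

def region_theme_colors(image_bgr: np.ndarray, mask: np.ndarray, n_colors: int):
    """Return the mean HSV color of the n_colors most populated HSV bins
    among the pixels selected by `mask` (foreground or background)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]

    # Illustrative 42-bin partition; the patent defines its own in formulas (1)-(3).
    bin_id = (h // 26).astype(np.int32) * 6 + (s // 128) * 3 + (v // 86)

    counts = np.bincount(bin_id[mask], minlength=42)   # pixels per color interval
    top_bins = np.argsort(counts)[::-1][:n_colors]     # most populated intervals first

    colors = []
    for b in top_bins:
        if counts[b] == 0:
            continue
        sel = mask & (bin_id == b)
        colors.append(hsv[sel].mean(axis=0))           # mean H, S, V over the interval
    return colors

Under this sketch the foreground region would be queried with n_colors=5 and the background region with n_colors=1 to obtain the six initial theme colors described above.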
Preferably, the merging in step S3 includes automatic merging or manual merging.
In this embodiment, the final theme color spectrum is obtained either by automatically merging some theme colors under a color difference threshold or by letting the user manually specify the number of theme colors. Because the number of theme colors differs from cloth to cloth (some cloths have a single color, others two or more), and because too many or too few theme colors cannot express the colors of the image accurately and completely, the number of theme colors must be controlled. Users who need to set the number of colors precisely can use the manual mode; users who value efficiency, want to minimize manual operation, and have lower accuracy requirements can let the method automatically determine the most suitable number of theme colors for the cloth image.
Preferably, the automatic merging comprises the following steps:
calculating color differences between the subject colors;
the color difference threshold is set to 150.0;
and judging whether the color difference between two theme colors is smaller than the threshold; if so, retaining the theme color whose color interval covers more pixels.
In this embodiment, measuring the similarity of two colors by the Euclidean distance in the RGB or HSV color space is not appropriate, because a small change in one channel can cause a large change in the resulting color; the similarity between colors, i.e. the color difference, is therefore computed in the LAB color space as the Euclidean distance between the two colors in LAB space.
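A minimal sketch of this LAB color difference and of the automatic merging rule follows, using OpenCV's LAB conversion; the helper names and the loop structure are assumptions, while the 150.0 threshold and the keep-the-larger-coverage rule follow the text.

import cv2
import numpy as np

def lab_color_difference(c1_bgr, c2_bgr) -> float:
    """Color difference = Euclidean distance between two colors in LAB space."""
    def to_lab(c):
        pixel = np.uint8([[c]])                       # 1x1 image holding the color
        return cv2.cvtColor(pixel, cv2.COLOR_BGR2LAB)[0, 0].astype(np.float64)
    return float(np.linalg.norm(to_lab(c1_bgr) - to_lab(c2_bgr)))

def auto_merge(theme_colors_bgr, pixel_counts, threshold=150.0):
    """Drop one of every pair of theme colors closer than `threshold`,
    keeping the color whose interval covers more pixels."""
    colors = list(zip(theme_colors_bgr, pixel_counts))
    merged = True
    while merged:
        merged = False
        for i in range(len(colors)):
            for j in range(i + 1, len(colors)):
                if lab_color_difference(colors[i][0], colors[j][0]) < threshold:
                    colors.pop(i if colors[i][1] < colors[j][1] else j)
                    merged = True
                    break
            if merged:
                break
    return [color for color, _ in colors]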
Preferably, the manual merging comprises the steps of:
manually setting the number of theme colors by a user;
and merging subject colors in order of increasing color difference until the number of subject colors equals the set number.
In this embodiment, the user manually sets the number of theme colors according to subjective preference; theme colors are then merged in order of increasing color difference until the set number is reached, and when two colors are merged, the one whose color interval covers more pixels is retained.
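Manual merging can be sketched in the same style, reusing lab_color_difference from the sketch above; the helper name and loop are illustrative assumptions, not the patent's implementation.

def manual_merge(theme_colors_bgr, pixel_counts, target_count: int):
    """Repeatedly merge the two closest theme colors (smallest LAB difference
    first) until only `target_count` remain; the color covering more pixels
    survives each merge."""
    colors = list(zip(theme_colors_bgr, pixel_counts))
    while len(colors) > target_count:
        i, j = min(
            ((a, b) for a in range(len(colors)) for b in range(a + 1, len(colors))),
            key=lambda p: lab_color_difference(colors[p[0]][0], colors[p[1]][0]),
        )
        colors.pop(i if colors[i][1] < colors[j][1] else j)
    return [color for color, _ in colors]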
Preferably, the step S4 includes the steps of:
step S41, calculating the color difference between each pixel of the input image and each subject color of the final subject color spectrum;
step S42, assigning each pixel to the coverage area of the subject color with which it has the smallest color difference.
In this embodiment, re-assigning each pixel of the input image to a theme color according to color difference approximates human visual perception and helps the user efficiently find the most satisfactory color scheme.
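A sketch of this per-pixel assignment (steps S41 and S42), again measuring color difference in LAB space; the function and variable names are illustrative assumptions.

import cv2
import numpy as np

def assign_pixels_to_themes(image_bgr: np.ndarray, theme_colors_bgr) -> np.ndarray:
    """Label every pixel with the index of the theme color to which it has
    the smallest LAB color difference."""
    lab_img = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    height, width = image_bgr.shape[:2]
    labels = np.zeros((height, width), dtype=np.int32)
    best = np.full((height, width), np.inf, dtype=np.float32)
    for k, color in enumerate(theme_colors_bgr):
        lab_c = cv2.cvtColor(np.uint8([[color]]), cv2.COLOR_BGR2LAB)[0, 0].astype(np.float32)
        diff = np.linalg.norm(lab_img - lab_c, axis=2)   # per-pixel color difference
        closer = diff < best                             # pixels closer to this theme color
        labels[closer] = k
        best[closer] = diff[closer]
    return labels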
Preferably, the step S5 includes the steps of:
step S51, the user replaces one or more subject colors in the final subject color spectrum to obtain a modified subject color spectrum;
step S52, calculating an RGB difference value of the target color of the modified theme color spectrum and the original color of the final theme color spectrum;
and step S53, adding the RGB difference to all pixels covered by the original color, repeating for each replaced color until all modified subject colors have been applied, thereby completing the recoloring of the image.
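The recoloring of steps S51 to S53 can be sketched as follows, using the label map from the previous step; clipping to the valid 0 to 255 range is an implementation detail assumed here rather than stated in the text.

import numpy as np

def recolor(image_bgr, labels, original_colors_bgr, target_colors_bgr):
    """Shift all pixels of each theme-color region by the per-channel
    difference between the user's target color and the original theme color."""
    out = image_bgr.astype(np.int32)
    for k, (orig, target) in enumerate(zip(original_colors_bgr, target_colors_bgr)):
        offset = np.asarray(target, dtype=np.int32) - np.asarray(orig, dtype=np.int32)
        out[labels == k] += offset                # add the RGB difference to all covered pixels
    return np.clip(out, 0, 255).astype(np.uint8)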
A color migration system based on saliency detection, said system applying any of the above methods.
In this embodiment, the foreground and the background of the input image are distinguished and their theme colors are extracted separately, so theme color extraction is more accurate and better suited to cloth images; the theme colors can be flexibly adjusted according to user requirements, which reduces the cost of producing cloth and helps the user efficiently find the most satisfactory color matching scheme; in addition, the system largely preserves the cloth texture while applying target colors precisely to specific image regions, is computationally efficient, and is suitable for industrial scenarios.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A color migration method based on significance detection is characterized by specifically comprising the following steps of:
step S1, inputting an image, and distinguishing the foreground and the background of the input image through saliency detection;
step S2, calculating the theme colors of the foreground and the background based on the color space;
step S3, combining the foreground subject color and the background subject color to obtain a final subject color spectrum;
step S4, assigning each pixel of the input image to a theme color in the final theme color spectrum according to color difference;
and step S5, modifying the final theme color spectrum according to the user requirement to obtain a modified theme color spectrum, and recoloring the input image according to the modified theme color spectrum.
2. The color migration method based on saliency detection as claimed in claim 1, characterized in that said step S1 comprises the steps of:
step S11, obtaining a saliency map of the input image by adopting a cluster-based collaborative saliency detection method;
step S12, determining a pixel threshold for dividing the foreground region and the background region;
step S13, the saliency map of the input image is divided by the determined pixel threshold, and a foreground region and a background region of the input image are obtained.
3. The color migration method based on saliency detection as claimed in claim 2, characterized in that said step S12 comprises the steps of:
step S121, acquiring a full image gray scale minimum value and a full image gray scale span of the saliency map;
step S122, obtaining the pixel threshold by the formula: pixel threshold = full-image gray-scale minimum + full-image gray-scale span × 1/5.
4. The color migration method based on saliency detection as claimed in claim 3, wherein said step S2 calculates subject colors of foreground and background based on HSV color interval, specifically comprising the steps of:
step S21, dividing the whole HSV space into 42 color intervals;
step S22, calculating values of H, S, and V for each pixel of the input image;
step S23, judging the section where H, S and V of each pixel are located according to the divided 42 color sections;
step S24, counting, for the foreground and the background of the input image respectively, the number of pixels falling into each color interval, and calculating HSV mean values of the covered pixels as the initial foreground and background subject colors.
5. The color migration method based on saliency detection as claimed in claim 4, characterized in that said merging in step S3 includes automatic merging or manual merging.
6. The saliency detection based color migration method according to claim 5, characterized in that said automatic merging comprises the following steps:
calculating color differences between the subject colors;
the color difference threshold is set to 150.0;
and judging whether the color difference between two theme colors is smaller than the threshold; if so, retaining the theme color whose color interval covers more pixels.
7. The saliency detection based color migration method according to claim 5, characterized in that said manual merging comprises the following steps:
manually setting the number of theme colors by a user;
and merging subject colors in order of increasing color difference until the number of subject colors equals the set number.
8. The color migration method based on saliency detection as claimed in claim 6 or 7, characterized in that said step S4 comprises the following steps:
step S41, calculating the color difference between each pixel of the input image and each subject color of the final subject color spectrum;
step S42, assigning each pixel to the coverage area of the subject color with which it has the smallest color difference.
9. The color migration method based on saliency detection as claimed in claim 8, characterized in that said step S5 comprises the steps of:
step S51, the user replaces one or more subject colors in the final subject color spectrum to obtain a modified subject color spectrum;
step S52, calculating an RGB difference value of the target color of the modified theme color spectrum and the original color of the final theme color spectrum;
and step S53, adding the RGB difference to all pixels covered by the original color, repeating for each replaced color until all modified subject colors have been applied, thereby completing the recoloring of the image.
10. A color migration system based on saliency detection, characterized in that said system applies any of the methods of claims 1-9.
CN201911281152.2A 2019-12-11 2019-12-11 Color migration method and system based on significance detection Active CN111080722B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911281152.2A CN111080722B (en) 2019-12-11 2019-12-11 Color migration method and system based on significance detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911281152.2A CN111080722B (en) 2019-12-11 2019-12-11 Color migration method and system based on significance detection

Publications (2)

Publication Number Publication Date
CN111080722A true CN111080722A (en) 2020-04-28
CN111080722B CN111080722B (en) 2023-04-21

Family

ID=70314357

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911281152.2A Active CN111080722B (en) 2019-12-11 2019-12-11 Color migration method and system based on significance detection

Country Status (1)

Country Link
CN (1) CN111080722B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107146258A (en) * 2017-04-26 2017-09-08 清华大学深圳研究生院 A kind of detection method for image salient region
CN107730564A (en) * 2017-09-26 2018-02-23 上海大学 A kind of image edit method based on conspicuousness
CN109214420A (en) * 2018-07-27 2019-01-15 北京工商大学 The high texture image classification method and system of view-based access control model conspicuousness detection
CN109300169A (en) * 2018-09-06 2019-02-01 华东师范大学 A kind of translucent image color transfer method based on linear transformation
CN109410171A (en) * 2018-09-14 2019-03-01 安徽三联学院 A kind of target conspicuousness detection method for rainy day image
CN109710791A (en) * 2018-12-14 2019-05-03 中南大学 A kind of multi-source color image color moving method based on significant filter

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112252051A (en) * 2020-09-25 2021-01-22 沈阳美行科技有限公司 Cloth drawing method and device and electronic equipment
CN113284198A (en) * 2021-05-13 2021-08-20 稿定(厦门)科技有限公司 Automatic image color matching method and device
WO2022237038A1 (en) * 2021-05-13 2022-11-17 稿定(厦门)科技有限公司 Automatic color matching method and apparatus for image
CN116597029A (en) * 2023-04-27 2023-08-15 北京隐算科技有限公司 Image re-coloring method for achromatopsia
CN116597029B (en) * 2023-04-27 2024-03-05 北京隐算科技有限公司 Image re-coloring method for achromatopsia

Also Published As

Publication number Publication date
CN111080722B (en) 2023-04-21

Similar Documents

Publication Publication Date Title
JP7413400B2 (en) Skin quality measurement method, skin quality classification method, skin quality measurement device, electronic equipment and storage medium
CN110046673B (en) No-reference tone mapping image quality evaluation method based on multi-feature fusion
CN109978890B (en) Target extraction method and device based on image processing and terminal equipment
Ma et al. Objective quality assessment for color-to-gray image conversion
US8035871B2 (en) Determining target luminance value of an image using predicted noise amount
CN111080722A (en) Color migration method and system based on significance detection
CN105096347B (en) Image processing apparatus and method
Jiang et al. Fog density estimation and image defogging based on surrogate modeling for optical depth
El Khoury et al. Color and sharpness assessment of single image dehazing
CN110717865B (en) Picture detection method and device
CN104504722B (en) Method for correcting image colors through gray points
JP2021531571A (en) Certificate image extraction method and terminal equipment
CN110570435A (en) method and device for carrying out damage segmentation on vehicle damage image
CN110782448A (en) Rendered image evaluation method and device
CN114519698A (en) Equipment oil leakage detection method, device, equipment and storage medium in dark environment
CN112102207A (en) Method and device for determining temperature, electronic equipment and readable storage medium
CN105678301A (en) Method, system and device for automatically identifying and segmenting text image
CN109241970B (en) Urine test method, mobile terminal and computer readable storage medium
CN111489333B (en) No-reference night natural image quality evaluation method
CN116563570B (en) Color recognition method and device, electronic equipment and storage medium
CN111179245B (en) Image quality detection method, device, electronic equipment and storage medium
CN112348809A (en) No-reference screen content image quality evaluation method based on multitask deep learning
Fursov et al. Correction of distortions in color images based on parametric identification
He et al. Effective haze removal under mixed domain and retract neighborhood
CN111754491A (en) Picture definition judging method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant