CN115397073B - A lighting system for a self-propelled amphibious dredging robot - Google Patents


Info

Publication number
CN115397073B
Authority
CN
China
Prior art keywords
central control
control processor
area
illumination
image information
Prior art date
Legal status
Active
Application number
CN202211314813.9A
Other languages
Chinese (zh)
Other versions
CN115397073A (en)
Inventor
王兴瑞
王朋
Current Assignee
Qingzhou Xinjulong Equipment Manufacturing Co ltd
Original Assignee
Qingzhou Xinjulong Equipment Manufacturing Co ltd
Priority date
Filing date
Publication date
Application filed by Qingzhou Xinjulong Equipment Manufacturing Co ltd filed Critical Qingzhou Xinjulong Equipment Manufacturing Co ltd
Priority to CN202211314813.9A priority Critical patent/CN115397073B/en
Publication of CN115397073A publication Critical patent/CN115397073A/en
Application granted granted Critical
Publication of CN115397073B publication Critical patent/CN115397073B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/28Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/05Underwater scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Image Input (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of illumination, and in particular to an illumination system for a self-propelled amphibious dredging robot comprising a camera device, an illumination device, a space detection device and a central control processor. The system detects the size of the space in which the dredging robot is located, and the central control processor determines the illumination parameters of the illumination device from the detection result. It also divides the space around the robot into regions and selects different illumination parameters for each region, so that the combination of light-source brightness and aperture radius achieves a better illumination effect. In regions where the image definition does not meet the standard, the illumination parameters are adjusted according to the measured definition to obtain a better illumination effect; for unclear images whose brightness nevertheless meets the standard, image processing is applied to identify the sludge contour, ensuring that the dredging robot can work smoothly during dredging.

Description

A lighting system for a self-propelled amphibious dredging robot
Technical Field
The invention relates to the technical field of illumination, in particular to an illumination system for a self-propelled amphibious dredging robot.
Background
The detection of underwater environment safety is of great importance to underwater activities. Water strongly absorbs light, and the space dozens of meters underwater is almost pitch black, so underwater safety detection generally requires artificial illumination. The lighting needed differs between environments, and a suitable light source is crucial for underwater detection imaging.
Chinese patent publication No. CN110067964A discloses a lighting device and control method for an amphibious pipeline robot, belonging to the field of robot pipeline lighting. The device comprises a lighting unit and a watertight joint. The lighting unit comprises a waterproof shell whose light-emitting end carries a lens; inside the shell are a sensor module, a communication module, a power panel and an LED chip. The LED chip is connected to the power panel, the power panel is connected to the sensor module through the communication module, and the communication module and the power panel are connected to the watertight joint. The device and control method address defects of prior-art lighting devices such as poor adaptability to working environments, lack of remote control, lack of a self-check function, and excessive watertight joints. The underwater environments of a dredging robot are varied, and obtaining accurate underwater environment information, especially in a confined space, is of great significance for its smooth operation. However, the existing lighting device has a limited capability of acquiring underwater environment information and cannot accurately identify the underwater environment.
Disclosure of Invention
Therefore, the invention provides an illumination system for a self-propelled amphibious dredging robot, to solve the problems that existing lighting devices have a limited capability of acquiring underwater environment information and cannot accurately identify the underwater environment.
In order to achieve the above object, the present invention provides an illumination system for a self-propelled amphibious dredging robot, comprising:
the camera device is used for collecting image information of the environment where the dredging robot is located when the dredging robot carries out dredging;
the illumination device comprises a plurality of light sources which are arranged on the dredging robot in a surrounding way and is used for illuminating the environment where the dredging robot is located;
the space detection device is arranged at the front end of the dredging robot and used for acquiring environmental parameters of the environment where the dredging robot is located, and the environmental parameters comprise the area and the space height of the area of the environment where the dredging robot is located and the average distance from the position of the space detection device to the edge of the area of the environment where the space detection device is located;
and the central control processor, connected to the camera device, the illumination device and the space detection device respectively, for judging whether the illumination parameter of the illumination device is to be adjusted to a corresponding value according to the space adjustment parameter Ki calculated from the environment parameters and the brightness of the environment detected by the camera device. When the brightness of the image information is judged to be lower than a preset value, the central control processor adjusts the illumination parameter to the corresponding value; when the brightness of the image information is greater than or equal to the preset value, the central control processor performs gray processing on the image information and binarization processing on a marked area in the resulting gray image, to judge whether a body contour in the image information is sludge. The illumination parameter comprises the brightness and aperture radius of each light source in the illumination device.
Further, the central control processor calculates a spatial adjustment parameter Ki from the environmental parameters uploaded by the space detection device. It establishes a spatial coordinate system f(x, y, z) with the position of the space detection device as the origin, takes the plane formed by the x-axis and y-axis as a reference plane, and divides the reference plane into a preset number n of areas. For the i-th area, with i ranging from 1 to n, the spatial adjustment parameter Ki of the area is calculated according to the following formula,
[formula image not reproduced in the source: Ki is computed from Si, H, V0, Li and L0]
wherein Si represents the area of the ith area, H represents the space height of the space where the dredging robot is located, V0 represents the preset space volume, li represents the average distance from the coordinate origin to the edge of the ith area, and L0 represents the preset average distance.
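The Ki formula image itself is not reproduced in the text, so the sketch below assumes a multiplicative form combining the region-volume ratio Si·H/V0 with the distance ratio Li/L0; the V0 and L0 values are illustrative:

```python
def spatial_adjustment(S_i, H, L_i, V0=50.0, L0=5.0):
    """Spatial adjustment parameter Ki for the i-th region.

    The patent's formula image is lost from the text; this sketch
    assumes a multiplicative form combining the region-volume ratio
    and the distance ratio, which matches the named variables.
    V0 (preset volume) and L0 (preset distance) are illustrative.
    """
    return (S_i * H / V0) * (L_i / L0)

# Example: a 10 m^2 region under a 2.5 m ceiling, 4 m from the detector.
k = spatial_adjustment(S_i=10.0, H=2.5, L_i=4.0)
```

Larger regions, higher ceilings, or more distant edges all increase Ki under this assumed form, consistent with the later use of Ki to scale up illumination for larger spaces.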
Further, the camera device detects the brightness Dh of the environment in which the dredging robot is located while the robot runs. Space contrast parameters K01 and K02 are preset inside the central control processor, where K01 < K02. When the camera device shoots, the central control processor determines the illumination parameter for the corresponding light source in the illumination device illuminating the i-th area according to that area's spatial adjustment parameter Ki:
when Ki ≥ K02, the central control processor determines the brightness of the corresponding light source illuminating the i-th area as D and its aperture radius as R;
[formula images not reproduced in the source: D and R are set from D0, R0 and Ki]
when K01 ≤ Ki < K02, the central control processor determines the brightness of the corresponding light source illuminating the i-th area as D, setting D = D0, and its aperture radius as R, setting R = R0;
when Ki < K01, the central control processor determines the brightness of the corresponding light source illuminating the i-th area as D and its aperture radius as R;
[formula images not reproduced in the source: D and R are set from D0, R0 and Ki]
Where D0 represents a preset brightness of the illumination device and R0 represents a preset radius of the aperture.
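Only the middle band (D = D0, R = R0) survives in the text; the outer-band formulas were lost with the figure images. A hedged sketch of the three-band selection, with proportional scaling assumed for the missing bands and all constants illustrative:

```python
def illumination_params(Ki, K01=0.5, K02=1.5, D0=100.0, R0=2.0):
    """Select brightness D and aperture radius R for the i-th region.

    Only the middle band (D = D0, R = R0) is stated explicitly in the
    text; the outer-band formulas were lost with the figure images, so
    the proportional scaling by Ki/K02 and Ki/K01 below is an assumption.
    """
    if Ki >= K02:
        # assumed scale-up for large spaces
        return D0 * Ki / K02, R0 * Ki / K02
    if Ki >= K01:
        # stated in the text: preset values for the middle band
        return D0, R0
    # assumed scale-down for tight spaces
    return D0 * Ki / K01, R0 * Ki / K01
```

The middle band returns the presets unchanged, while the assumed outer bands move D and R continuously away from them as Ki leaves the interval [K01, K02).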
Further, the camera device collects image information of the environment in which the dredging robot is located while the robot runs. When analyzing the image information collected from the i-th area, the central control processor compares the definition Qi of the image information with the preset definition Q0:
when Qi is larger than or equal to Q0, the central control processor judges that the definition of the image information of the region acquired by the camera device meets the standard, and extracts the contour characteristics of the sludge region in the region;
and when Qi is less than Q0, the central control processor judges that the definition of the image information of the region acquired by the camera device does not meet the standard, and the central control processor calculates the brightness of the image information.
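The patent does not state how the definition Qi is measured; the variance of a Laplacian filter response is a common stand-in for such a sharpness score, sketched here on hypothetical test images:

```python
import numpy as np

def sharpness(gray):
    """Variance of a 4-neighbour Laplacian as a sharpness score Qi.

    The patent does not specify how Qi is computed; Laplacian variance
    is a common stand-in for focus/sharpness measurement and is used
    here only as an assumed example metric.
    """
    g = gray.astype(float)
    lap = (np.roll(g, 1, 0) + np.roll(g, -1, 0)
           + np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4 * g)
    return lap.var()

# Hypothetical images: a featureless patch and one with a hard edge.
flat = np.full((8, 8), 128, dtype=np.uint8)
edge = np.zeros((8, 8), dtype=np.uint8)
edge[:, 4:] = 255
```

A featureless image scores zero; any edge content raises the score, so comparing the score against a preset Q0 gives the pass/fail split the text describes.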
Further, when judging that the definition of the image information collected from a single area does not meet the standard, the central control processor acquires the RGB values of the image information to calculate its brightness Y and compares Y with the preset brightness Y0:
when Y is less than Y0, the central control processor judges that the illumination parameter of the area does not meet the requirement, and the central control processor adjusts the illumination parameter of the lighting device according to the definition Q of the image information of the area;
and when Y is larger than or equal to Y0, the central control processor judges that the illumination parameters for the area meet the requirements, and the central control processor performs image processing on the image information with the definition not meeting the standard so as to identify the sludge contour in the image information.
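The text computes brightness Y from the image's RGB values without giving the weights; the standard BT.601 luma coefficients are assumed in this sketch:

```python
def luminance(r, g, b):
    """Brightness Y from RGB.

    The patent computes Y from RGB without giving the weights, so the
    standard BT.601 coefficients (0.299, 0.587, 0.114) are assumed.
    """
    return 0.299 * r + 0.587 * g + 0.114 * b

# Averaging per-pixel Y over the region would give the value compared
# against the preset brightness Y0.
```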
Further, when the illumination parameter for a single area is judged not to meet the requirement, the central control processor calculates the difference ΔQ = Q0 − Qi between the preset definition Q0 and the definition Qi of the area's image information, and adjusts the illumination parameter of the corresponding light source in the illumination device according to ΔQ:
when ΔQ ≤ ΔQ1, the central control processor adjusts the brightness of the corresponding light source illuminating the area to D′, setting D′ = D × α1, and its aperture radius to R′, setting R′ = R × α1;
when ΔQ1 < ΔQ ≤ ΔQ2, the central control processor adjusts the brightness of the corresponding light source illuminating the area to D′, setting D′ = D × α2, and its aperture radius to R′, setting R′ = R × α2;
when ΔQ2 < ΔQ, the central control processor adjusts the brightness of the corresponding light source illuminating the area to D′, setting D′ = D × α3, and its aperture radius to R′, setting R′ = R × α3;
where ΔQ1 is a first preset definition difference, ΔQ2 is a second preset definition difference, and α1, α2, α3 are first, second and third adjusting coefficients, with ΔQ1 < ΔQ2 and 1 < α1 < α2 < α3 < 1.5.
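The three-band adjustment above can be sketched as follows; the threshold and coefficient values are illustrative, chosen only to satisfy ΔQ1 < ΔQ2 and 1 < α1 < α2 < α3 < 1.5:

```python
def adjust_params(D, R, Qi, Q0, dQ1=5.0, dQ2=15.0,
                  a1=1.1, a2=1.25, a3=1.4):
    """Scale brightness D and aperture radius R by one of three
    coefficients chosen from the sharpness shortfall dQ = Q0 - Qi,
    as the text describes. All constants here are illustrative.
    """
    dQ = Q0 - Qi
    if dQ <= dQ1:
        a = a1          # small shortfall: mild boost
    elif dQ <= dQ2:
        a = a2          # moderate shortfall
    else:
        a = a3          # large shortfall: strongest boost
    return D * a, R * a
```

Both parameters are scaled by the same coefficient, matching the text's pairing of D′ = D × αk with R′ = R × αk.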
Further, the central control processor is provided with a gray value difference contrast parameter ΔGpd, ΔGpd > 0. When performing image processing on image information whose definition does not meet the standard in order to identify the sludge contour, it performs graying processing on the image information to obtain a gray image, calculates the average gray value Gps of the gray image, and equally divides the gray image into a plurality of regions. It then calculates the average gray value of each region's image in turn and its difference from Gps, to decide whether to mark the corresponding region. For the z-th region of the gray image, z = 1, 2, 3, …, m, where m is the total number of regions into which the gray image is divided, let Gpz be the average gray value of the region's image and ΔGp = Gps − Gpz;
when ΔGp > ΔGpd, the central control processor marks the z-th region and extracts the marked region;
when ΔGp ≤ ΔGpd, the central control processor does not mark the z-th region.
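The region-marking step can be sketched as follows; the grid size and the ΔGpd value are illustrative, and regions darker than the image mean by more than ΔGpd are the sludge candidates:

```python
import numpy as np

def mark_regions(gray, m_side=2, dGpd=10.0):
    """Split a grayscale image into m_side x m_side equal regions and
    mark every region whose mean gray value Gpz falls more than dGpd
    below the whole-image mean Gps (dGp = Gps - Gpz > dGpd), i.e.
    dark patches that may be sludge. Grid size and dGpd are
    illustrative values.
    """
    g = gray.astype(float)
    Gps = g.mean()
    h, w = g.shape
    rh, rw = h // m_side, w // m_side
    marked = []
    for zr in range(m_side):
        for zc in range(m_side):
            block = g[zr * rh:(zr + 1) * rh, zc * rw:(zc + 1) * rw]
            if Gps - block.mean() > dGpd:
                marked.append((zr, zc))
    return marked

# Hypothetical image: bright background with one dark quadrant.
quad_img = np.full((4, 4), 200.0)
quad_img[2:, 2:] = 40.0
```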
Further, the central control processor performs binarization processing on the image information corresponding to each marked region to obtain binarized images, and adjusts the binarization threshold of each binarized image, identifying the body contour in each binarized image during the adjustment,
in the process of adjusting the binarization threshold value, the central control processor sequentially extracts the body contour in each binarization image, and for a single binarization image, if the body contour existing in the binarization image is kept unchanged in a preset binarization threshold value change interval (E1, E2), the central control processor extracts the body contour to serve as an identified body contour;
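A minimal sketch of the threshold-sweep stability test, using equality of the binarized foreground mask across the interval (E1, E2) as a stand-in for the patent's contour comparison:

```python
import numpy as np

def stable_mask(gray, E1, E2, step=5):
    """Binarize at every threshold in (E1, E2); if the resulting
    foreground mask never changes across the interval, the contour is
    taken as stable and the mask is returned, otherwise None.
    Mask equality stands in for the patent's contour comparison.
    """
    g = gray.astype(float)
    masks = [g < t for t in range(E1, E2 + 1, step)]
    first = masks[0]
    if all(np.array_equal(first, m) for m in masks[1:]):
        return first
    return None

# Hypothetical images: a clean dark blob, and one with an intermediate
# pixel whose membership flips as the threshold moves.
stab_img = np.full((5, 5), 220.0)
stab_img[1:4, 1:4] = 30.0
stab_img2 = stab_img.copy()
stab_img2[0, 0] = 120.0
```

A blob whose pixel values sit well outside the swept interval produces the same mask at every threshold, so its contour survives the sweep; intermediate gray values make the mask flicker and the candidate is rejected.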
the central control processor sequentially marks each identified profile in the gray level image and respectively verifies each identified profile, wherein:
the central control processor sequentially reduces each identification body profile by a preset proportion parameter B1 to form a plurality of first identification body profiles so as to respectively determine an annular area formed by each first identification body profile and the corresponding identification body profile and calculate a gray average value G1 of the annular area;
the central control processor sequentially amplifies each identified body profile by a preset proportion parameter B2 to form a plurality of second identified body profiles so as to respectively determine an annular area formed by each second identified body profile and the corresponding identified body profile and calculate a gray average value G2 of the annular area;
the central control processor calculates the gray difference parameter H of the two annular areas from G1 and G2 according to the following formula,
[formula image not reproduced in the source: H is computed from G1 and G2, consistent with the later test H − 1 > Hd]
where 0.8 < B1 < 1 and B2 > 1.2.
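The H formula image is lost from the text; consistent with the later test H − 1 > Hd, this sketch assumes H = G2 / G1 and evaluates it on a hypothetical circular candidate contour:

```python
import numpy as np

def ring_ratio(gray, cx, cy, r, B1=0.9, B2=1.3):
    """Verify a circular candidate contour of radius r centred at
    (cx, cy): G1 is the mean gray of the inner annulus (B1*r .. r),
    G2 of the outer annulus (r .. B2*r). The H formula image is lost
    from the text; H = G2 / G1 is assumed here, consistent with the
    later test H - 1 > Hd. B1 and B2 satisfy the stated bounds.
    """
    yy, xx = np.indices(gray.shape)
    d = np.hypot(xx - cx, yy - cy)
    G1 = gray[(d >= B1 * r) & (d < r)].mean()
    G2 = gray[(d >= r) & (d < B2 * r)].mean()
    return G2 / G1

# Hypothetical image: a dark disk (sludge-like) on a light bed.
yy, xx = np.indices((21, 21))
dist = np.hypot(xx - 10, yy - 10)
ring_img = np.where(dist < 8, 40.0, 200.0)
```

For a genuine dark-region boundary the outer annulus is much brighter than the inner one, so H is well above 1 and the H − 1 > Hd test passes.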
Further, the central control processor judges whether the identified body profile is the profile of the sludge area according to the gray difference parameter H,
when H-1 is larger than Hd, the central control processor judges that the recognized shape profile is a sludge area profile;
when H-1 is not more than Hd, the central control processor judges that the identified profile is not the profile of the sludge area;
wherein Hd is a preset gray difference contrast parameter.
Further, when the central control processor judges that the identified body contour is a sludge area contour, the dredging robot pre-touches the sludge within the identified contour with a preset force before dredging, and the type of the contour is determined from the received feedback force F, where F0 is a preset feedback force threshold:
when F is less than or equal to F0, the central control processor judges that the object in the recognized body outline is sludge;
when F > F0, the central control processor determines that the object in the identified profile is an obstacle.
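A minimal sketch of the pre-touch decision; the F0 value is an illustrative threshold, not taken from the patent:

```python
def classify_touch(F, F0=50.0):
    """Pre-touch check before dredging: a feedback force at or below
    the preset threshold F0 indicates sludge, a larger force indicates
    an obstacle such as rock. F0 = 50.0 is illustrative.
    """
    return "sludge" if F <= F0 else "obstacle"
```

Soft sludge yields little resistance to the preset touch force, while rock pushes back hard, which is the distinction the feedback-force comparison exploits.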
Compared with the prior art, the invention has the advantage that the camera device, the illumination device, the space detection device and the central control processor detect the size of the space in which the dredging robot is located, and the central control processor determines the brightness and aperture radius of the illumination device from the detection result. The space around the robot is divided, and different illumination parameters are applied in different regions, so that the combination of brightness and aperture radius achieves a better illumination effect and the camera device can obtain images of higher definition. Selecting different illumination parameters for different spatial data yields a better illumination effect even in a confined underwater space, and thus higher-definition images and clear image information of the underwater environment.
Further, the invention provides a space detection device that acquires and analyzes coordinate data of the space in which the dredging robot is located. The space is divided into a plurality of areas according to the coordinate data, and corresponding illumination parameters are selected for each area. In practice the dredging robot operates in a confined space, and when the camera device collects images of the surroundings, different areas have different spatial layouts; using the same illumination parameters everywhere could leave images from some areas insufficiently sharp and images from other areas overexposed.
Further, a spatial adjustment parameter Ki is introduced into the central control processor. Ki combines overall spatial volume data with the distance from the space detection device to the space edge, since for light-source illumination both the area and the height of the illuminated region influence the illumination effect; the spatial adjustment parameter is constructed from these factors to determine the illumination parameters for different regions. Because the area a single light source can illuminate is limited, the space in which the dredging robot is located is divided into a plurality of regions, and the light source of each region is adjusted in a targeted way so that brightness and aperture radius are best matched. This yields a better illumination effect, images of higher definition, clear image information of the underwater environment, and savings in the robot's energy consumption.
Further, the central control processor analyzes the images the camera device takes of the surroundings and calculates the image brightness when the definition is judged not to meet the standard. When it judges from the brightness that the illumination parameter of the corresponding area does not meet the requirement, it adjusts the illumination parameter of the illumination device according to the image definition, obtaining a better illumination effect and thus higher-definition images and clear image information of the underwater environment.
Further, when the central control processor judges from the image brightness that the illumination parameters of the corresponding area meet the requirements, it performs image processing on images whose definition does not meet the standard and identifies the sludge contour. In practice the gray values of sludge pixels are low, so the central control processor uses gray value as the criterion for selecting marked regions, i.e. regions that may contain a sludge contour. It then binarizes the marked regions, determines the identified body contour by adjusting the binarization threshold, and verifies the identified contour.
Further, when the central control processor judges that the identified body contour is a sludge area contour, the dredging robot pre-touches the material within the contour with a preset force before dredging to further confirm that it is sludge, ensuring the working safety of the dredging robot and avoiding damage caused by mistakenly digging rock.
Drawings
FIG. 1 is a schematic structural diagram of an illumination system for a self-propelled amphibious dredging robot according to an embodiment of the present invention;
FIG. 2 is a block diagram of a lighting system for an autonomous amphibious dredging robot according to an embodiment of the present invention;
FIG. 3 is a structural block diagram of a further embodiment of the illumination system for the self-propelled amphibious dredging robot according to the embodiment of the invention;
fig. 4 is a schematic diagram of region division according to an embodiment of the present invention.
Detailed Description
In order that the objects and advantages of the invention will be more clearly understood, the invention is further described below with reference to examples; it should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood by those skilled in the art that these embodiments are only for explaining the technical principle of the present invention, and do not limit the scope of the present invention.
It should be noted that in the description of the present invention, the terms of direction or positional relationship indicated by the terms "upper", "lower", "left", "right", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, which are only for convenience of description, and do not indicate or imply that the device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention.
Furthermore, it should be noted that, in the description of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
Referring to fig. 1-3, fig. 1 is a schematic structural diagram of an illumination system for an autonomous amphibious dredging robot according to an embodiment of the present invention, fig. 2 is a block diagram of an illumination system for an autonomous amphibious dredging robot according to an embodiment of the present invention, fig. 3 is a block diagram of a further embodiment of an illumination system for an autonomous amphibious dredging robot according to an embodiment of the present invention, and the illumination system for an autonomous amphibious dredging robot according to an embodiment of the present invention includes:
the camera device is used for acquiring image information of the environment where the dredging robot is located when the dredging robot carries out dredging, and comprises a front-view camera 1 arranged at the front end of the dredging robot, a rear-view camera 2 arranged at the rear end of the dredging robot and a working camera 3 arranged on a working arm of the dredging robot;
the illuminating device comprises a plurality of light sources 5 which are arranged on the dredging robot in a surrounding way and used for illuminating the environment where the dredging robot is located;
the space detection device 4 is arranged at the front end of the dredging robot and used for acquiring environmental parameters of the environment where the dredging robot is located, wherein the environmental parameters comprise the area and the height of the area of the environment where the dredging robot is located and the average distance from the position of the space detection device to the edge of the area of the environment where the space detection device is located;
the preferred space detection device of the embodiment of the invention is a space detector;
and the central control processor, connected to the camera device, the illumination device and the space detection device 4 respectively, for judging whether the illumination parameter of the illumination device is to be adjusted to a corresponding value according to the space adjustment parameter Ki calculated from the environment parameters and the brightness of the environment detected by the camera device. When the brightness of the image information is judged to be lower than a preset value, the central control processor adjusts the illumination parameter to the corresponding value; when the brightness is greater than or equal to the preset value, it performs gray processing on the image information and binarization processing on a marked area in the resulting gray image, to judge whether a body contour in the image information is sludge. The illumination parameter comprises the brightness and aperture radius of each light source 5 in the illumination device.
According to the invention, the camera device, the illumination device, the space detection device 4 and the central control processor detect the size of the space in which the dredging robot is located, and the central control processor determines the brightness and aperture radius of the illumination device from the detection result. The space around the robot is divided, and different illumination parameters are applied in different regions, so that the combination of brightness and aperture radius achieves a better illumination effect and the camera device can obtain images of higher definition. Selecting different illumination parameters for different spatial data yields a better illumination effect even in a confined underwater space, and thus clear image information of the underwater environment.
Please refer to Fig. 4, which is a schematic diagram illustrating the region division according to an embodiment of the present invention;
specifically, the central control processor calculates a space adjustment parameter Ki according to the environment parameters uploaded by the space detection device 4; it establishes a space coordinate system f(x, y, z) with the position of the space detection device as the coordinate origin, takes the plane formed by the x-axis and the y-axis as a reference plane, and divides the reference plane into a preset number n of areas 6. For the i-th area 6, where i takes values from 1 to n, the space adjustment parameter Ki of the area is calculated according to the following formula,
[formula for Ki — rendered as an image in the original publication; it is expressed in terms of Si, H, V0, Li and L0 as defined below]
wherein Si represents the area of the ith area, H represents the space height of the space where the dredging robot is located, V0 represents the preset space volume, li represents the average distance from the coordinate origin to the edge of the ith area, and L0 represents the preset average distance.
The invention provides a space detection device 4 that acquires and analyzes coordinate data of the space where the dredging robot is located. The space is divided into a plurality of areas 6 according to the space coordinate data, and corresponding illumination parameters are selected for each area 6. In practice, the dredging robot operates in a confined space, and when the camera device acquires images of the surrounding environment, different areas have different spatial layouts; if the same illumination parameters were used everywhere, the images acquired from some areas would lack the required definition while those from other areas would be overexposed.
Specifically, the camera device detects the brightness Dh of the environment where the desilting robot is located while the robot runs. Space contrast parameters K01 and K02, with K01 < K02, are preset in the central control processor, and when the camera device shoots, the central control processor determines the illumination parameter of the corresponding light source 5 in the illuminating device for the i-th area according to the area's space adjustment parameter Ki:
when Ki is larger than or equal to K02, the central control processor determines the brightness of the corresponding light source 5 in the lighting device when it illuminates the i-th area as D, and sets
[formula for D — rendered as an image in the original publication];
the central control processor determines the aperture radius of the corresponding light source 5 when it illuminates the i-th area as R, and sets
[formula for R — rendered as an image in the original publication];
when Ki is larger than or equal to K01 and smaller than K02, the central control processor determines the brightness of the corresponding light source 5 in the lighting device when it illuminates the i-th area as D and sets D = D0, and determines the aperture radius of the corresponding light source 5 when it illuminates the i-th area as R and sets R = R0;
when Ki is less than K01, the central control processor determines the brightness of the corresponding light source 5 in the lighting device when it illuminates the i-th area as D, and sets
[formula for D — rendered as an image in the original publication];
the central control processor determines the aperture radius of the corresponding light source 5 when it illuminates the i-th area as R, and sets
[formula for R — rendered as an image in the original publication];
Where D0 represents a preset brightness of the illumination device and R0 represents a preset radius of the aperture.
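The three-branch selection above can be sketched as a small control-flow function. Since the brightness and aperture formulas for the outer branches are rendered only as images in the original publication, they are represented here by caller-supplied placeholder functions — a minimal sketch of the branching logic, not the patented formulas:

```python
def illumination_parameters(ki, k01, k02, d0, r0,
                            boost_brightness, boost_radius,
                            cut_brightness, cut_radius):
    """Select brightness D and aperture radius R for one region.

    k01 < k02 are the preset space contrast parameters; d0 and r0 are
    the preset brightness D0 and aperture radius R0.  The exact D/R
    formulas for the Ki >= K02 and Ki < K01 branches appear only as
    images in the patent, so hypothetical placeholder functions stand
    in for them here.
    """
    if ki >= k02:
        # large adjustment parameter: formula image applies
        return boost_brightness(d0), boost_radius(r0)
    if k01 <= ki < k02:
        # middle band: presets used unchanged
        return d0, r0
    # ki < k01: the other formula image applies
    return cut_brightness(d0), cut_radius(r0)
```

With K01 ≤ Ki < K02 the presets D0 and R0 are returned unchanged, matching the middle branch above.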
According to the invention, a space adjustment parameter Ki is introduced which comprises the overall volume data of the space and the distance data from the space detection device 4 to the space edge. For light source illumination, the area and the height of the region irradiated by a light source both influence the illumination effect; taking these factors into account, the space adjustment parameter is constructed to determine the illumination parameters for different spatial regions. Since the area a single light source can irradiate is limited, the space where the dredging robot is located is divided into a plurality of areas 6, and for each area 6 the corresponding light source is adjusted in a targeted manner so that the brightness and aperture radius of the illuminating device are matched optimally. A better illumination effect is thereby obtained, yielding higher-definition images and clear image information of the underwater environment, while also saving energy for the dredging robot.
Specifically, the camera device collects image information of the environment where the dredging robot is located when the dredging robot runs, the central control processor compares the definition Qi of the image information with a preset definition Q0 when analyzing the image information collected from the ith area,
when Qi is larger than or equal to Q0, the central control processor judges that the definition of the image information of the region acquired by the camera device meets the standard, and extracts the contour characteristics of the sludge region in the region;
and when Qi is less than Q0, the central control processor judges that the definition of the image information of the region acquired by the camera device does not meet the standard, and the central control processor calculates the brightness of the image information.
Specifically, when the central control processor determines that the definition of the image information of a single area collected by the camera device does not meet the standard, it acquires the RGB values of the image information in order to calculate the brightness Y of the image information and compare Y with a preset brightness Y0,
when Y is less than Y0, the central control processor judges that the illumination parameter of the area does not meet the requirement, and the central control processor adjusts the illumination parameter of the lighting device according to the definition Q of the image information of the area;
and when Y is larger than or equal to Y0, the central control processor judges that the illumination parameters for the area meet the requirements, and the central control processor performs image processing on the image information with the definition not meeting the standard so as to identify the sludge contour in the image information.
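The two-stage decision in the preceding paragraphs — definition is checked first, and brightness only when definition fails — can be summarized as a small dispatch function; the return labels are illustrative names, not terms from the patent:

```python
def analyze_region_image(qi, q0, y, y0):
    """Decision flow for one region's captured image.

    qi: measured definition, q0: preset definition threshold,
    y: image brightness, y0: preset brightness threshold.
    """
    if qi >= q0:
        # definition meets the standard: extract sludge contour features
        return "extract_sludge_contour"
    if y < y0:
        # too dark: the illumination parameters need adjusting
        return "adjust_illumination"
    # bright enough but blurry: process the image itself
    return "image_processing"
```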
Specifically, the brightness of the image is calculated as Y = 0.299R + 0.587G + 0.114B, where R, G and B are the mean values of the R, G and B channels respectively.
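This formula can be applied directly to per-channel means; a minimal pure-Python version (the pixel list format is an assumption for illustration):

```python
def image_brightness(pixels):
    """Brightness Y of an image per the formula above:
    Y = 0.299*R + 0.587*G + 0.114*B, where R, G, B are the
    per-channel means over all pixels (each pixel an (r, g, b) tuple)."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return 0.299 * r + 0.587 * g + 0.114 * b
```

The coefficients weight green most heavily, matching the eye's sensitivity; a neutral gray image therefore yields Y equal to its gray level.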
Specifically, when determining that the illumination parameter for a single region 6 does not meet the requirement, the central control processor calculates the difference ΔQ between the definition Qi of the image information of the region and the preset definition Q0, setting ΔQ = Q0 − Qi, and adjusts the illumination parameter of the corresponding light source 5 in the illumination device for the region according to ΔQ:
when ΔQ ≤ ΔQ1, the central control processor adjusts the brightness of the area illuminated by the corresponding light source 5 in the illumination device to D′ using α1, setting D′ = D × α1, and adjusts the aperture radius to R′, setting R′ = R × α1;
when ΔQ1 < ΔQ ≤ ΔQ2, the central control processor adjusts the brightness of the area illuminated by the corresponding light source 5 in the illumination device to D′ using α2, setting D′ = D × α2, and adjusts the aperture radius to R′, setting R′ = R × α2;
when ΔQ > ΔQ2, the central control processor adjusts the brightness of the area illuminated by the corresponding light source 5 in the illumination device to D′ using α3, setting D′ = D × α3, and adjusts the aperture radius to R′, setting R′ = R × α3;
wherein ΔQ1 is a first preset definition difference, ΔQ2 is a second preset definition difference, α1 is a first adjustment coefficient, α2 is a second adjustment coefficient, α3 is a third adjustment coefficient, ΔQ1 < ΔQ2, and 1 < α1 < α2 < α3 < 1.5.
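The graded scaling above can be sketched directly; the specific α values used here are illustrative, since the patent only requires 1 < α1 < α2 < α3 < 1.5:

```python
def adjust_for_sharpness(d, r, delta_q, dq1, dq2,
                         a1=1.1, a2=1.25, a3=1.4):
    """Scale brightness D and aperture radius R by a coefficient chosen
    from the sharpness deficit dQ = Q0 - Qi.

    dq1 < dq2 are the preset definition differences; a1 < a2 < a3 are
    the adjustment coefficients (illustrative defaults, all in (1, 1.5)).
    """
    if delta_q <= dq1:
        alpha = a1          # small deficit: mild boost
    elif delta_q <= dq2:
        alpha = a2          # medium deficit
    else:
        alpha = a3          # large deficit: strongest boost
    return d * alpha, r * alpha
```

The larger the definition deficit, the larger the common factor applied to both brightness and aperture radius.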
The central control processor analyzes the images the camera device takes of the surrounding environment and, when it judges that the definition of an image does not meet the standard, further calculates the brightness of the image. When the central control processor judges from the image brightness that the illumination parameter of the corresponding area does not meet the requirement, it adjusts the illumination parameter of the illuminating device according to the image definition, obtaining a better illumination effect, higher-definition images, and clear image information of the underwater environment.
Specifically, the central control processor is provided with a gray value difference contrast parameter ΔGpd, ΔGpd > 0. When performing image processing on image information whose definition does not meet the standard in order to identify the sludge contour in the image information, it performs graying processing on the image information to obtain a gray image, calculates the average gray value Gps of the gray image, divides the gray image evenly into m regions, calculates in turn the average gray value Gpz of the image in each region of the gray image, and calculates the difference between each Gpz and Gps to judge whether to mark the corresponding region. For the z-th region in the gray image, z = 1, 2, 3, …, m, the difference between Gpz and Gps is set as ΔGp, with ΔGp = Gps − Gpz,
when the delta Gp is larger than the delta Gpd, the central control processor marks the z-th area and extracts the marked area;
when the delta Gp is less than or equal to the delta Gpd, the central control processor does not mark the z-th area.
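The marking rule above — a region is flagged when it is darker than the image average by more than ΔGpd — can be sketched over the per-region averages (since the regions are equal-sized, the whole-image average Gps equals the mean of the region averages):

```python
def mark_dark_regions(region_means, delta_gpd):
    """Return the indices z (1-based, matching the patent's numbering)
    of regions whose average gray value Gpz falls below the whole-image
    average Gps by more than delta_gpd, i.e. Gps - Gpz > dGpd.
    These are the candidate sludge areas."""
    gps = sum(region_means) / len(region_means)  # equal-sized regions
    return [z for z, gpz in enumerate(region_means, start=1)
            if gps - gpz > delta_gpd]
```

Sludge photographs darker than its surroundings, so only regions well below the mean are carried forward to binarization.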
Specifically, the central control processor performs binarization processing on the image information corresponding to each marked region to obtain binarized images, and adjusts the binarization threshold value of each binarized image in order to identify the body contour in each binarized image during the adjustment;
in the process of adjusting the binarization threshold value, the central control processor sequentially extracts the body contour in each binarization image, and for a single binarization image, if the body contour existing in the binarization image is kept unchanged in a preset binarization threshold value change interval (E1, E2), the central control processor extracts the body contour to serve as an identified body contour;
the central control processor sequentially marks each identified profile in the gray level image and respectively verifies each identified profile, wherein:
the central control processor sequentially reduces each identified body profile by a preset proportion parameter B1 to form a plurality of first identified body profiles so as to respectively determine an annular area formed by each first identified body profile and the corresponding identified body profile and calculate a gray average value G1 of the annular area;
the central control processor sequentially amplifies each identification body profile by a preset proportion parameter B2 to form a plurality of second identification body profiles so as to respectively determine an annular area formed by each second identification body profile and the corresponding identification body profile and calculate a gray average value G2 of the annular area;
The central control processor calculates the gray difference parameter H of the two annular areas according to the following formula:
[formula for H in terms of G1 and G2 — rendered as an image in the original publication]
wherein 0.8 < B1 < 1 < B2 < 1.2.
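The threshold-sweep acceptance test described above — a contour is extracted only if it stays unchanged over the preset interval (E1, E2) — can be sketched as follows. This is a simplified stand-in that compares whole binary masks rather than extracted contours, and the dark-pixels-are-foreground convention (appropriate for sludge) is an assumption:

```python
def stable_foreground(gray, e1, e2, step=5):
    """Binarize a gray image (2-D list of 0-255 values) at every
    threshold sampled from the open interval (e1, e2); if the
    foreground mask is identical at every threshold, return it as the
    identified region, otherwise return None (contour not stable)."""
    def binarize(t):
        # dark pixels (candidate sludge) become foreground = 1
        return tuple(tuple(1 if v < t else 0 for v in row) for row in gray)

    masks = [binarize(t) for t in range(e1 + step, e2, step)]
    if masks and all(m == masks[0] for m in masks):
        return masks[0]
    return None
```

A mask that flips as the threshold moves indicates gray values inside the sweep interval, so the candidate contour is rejected rather than passed on to the annular-area verification.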
Specifically, the central control processor judges whether the identified body profile is the profile of a sludge area according to the gray difference parameter H:
when H − 1 > Hd, the central control processor judges that the identified body profile is a sludge area profile;
when H − 1 ≤ Hd, the central control processor judges that the identified body profile is not a sludge area profile;
wherein Hd is a preset gray difference contrast parameter.
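The acceptance rule reduces to a one-line predicate on the already-computed gray difference parameter H (its defining formula is an image in the original, so H is taken here as a given input):

```python
def is_sludge_contour(h, hd):
    """Accept an identified body profile as a sludge area profile when
    the gray difference parameter H of the two annular verification
    areas exceeds 1 by more than the preset contrast parameter Hd."""
    return h - 1 > hd
```

H near 1 means the inner and outer rings have similar gray levels, i.e. no real boundary, so the candidate contour is rejected.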
When the central control processor judges from the image brightness that the illumination parameters of the corresponding area meet the requirements, it performs image processing on images whose definition does not meet the standard and identifies the sludge contour. In practice, the gray values of image regions containing sludge are low; using the average gray value as a reference, the central control processor selects marked regions from the image to flag regions that may contain a sludge contour, performs binarization processing on each marked region, determines the identified body contour by adjusting the binarization threshold value, and then verifies the identified body contour.
Specifically, when the central control processor determines that an identified body profile is a sludge area profile, the dredging robot pre-touches the material inside the identified body profile with a preset force before dredging and determines the type of object inside the profile according to the received feedback force F,
when F is less than or equal to F0, the central control processor judges that the object in the identified body profile is sludge;
when F > F0, the central control processor determines that the object in the identified profile is an obstacle.
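The pre-touch classification above is a simple threshold test on the probe's feedback force; the string labels are illustrative:

```python
def classify_touched_object(feedback_force, f0):
    """Pre-touch check: the dredging manipulator touches the object
    inside an identified sludge contour with a preset force; a feedback
    force F <= F0 indicates soft sludge, F > F0 a hard obstacle
    (e.g. rock) that must not be dug."""
    return "sludge" if feedback_force <= f0 else "obstacle"
```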
Specifically, a touch probe is installed on the dredging manipulator of the dredging robot to detect the feedback force when the manipulator touches an object. When the central control processor judges that an identified body profile is the profile of a sludge area, the dredging robot pre-touches the material inside the profile with a preset force before dredging in order to verify that it is indeed sludge, ensuring the safety of the dredging operation and avoiding the damage that mistakenly digging rocks would cause to the dredging robot.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention; various modifications and alterations to this invention will become apparent to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (7)

1. An illumination system for a self-propelled amphibious dredging robot, comprising:
the camera device is used for collecting image information of the environment where the dredging robot is located when the dredging robot carries out dredging;
the illumination device comprises a plurality of light sources which are arranged on the dredging robot in a surrounding way and is used for illuminating the environment where the dredging robot is located;
the space detection device is arranged at the front end of the dredging robot and used for acquiring environmental parameters of the environment where the dredging robot is located, and the environmental parameters comprise the area and the space height of the area of the environment where the dredging robot is located and the average distance from the position of the space detection device to the edge of the area of the environment where the space detection device is located;
the central control processor is respectively connected with the camera device, the illuminating device and the space detection device, and is used for judging whether the illumination parameter of the illuminating device should be adjusted to a corresponding value according to the space adjustment parameter Ki calculated from the environment parameters and the environment brightness detected by the camera device; when the brightness of the image information is judged to be lower than a preset value, the central control processor adjusts the illumination parameter to the corresponding value, and when the brightness of the image information is greater than or equal to the preset value, it carries out graying processing on the image information and binarization processing on a marked area in the resulting gray image, so as to judge whether a shape contour in the image information is sludge, wherein the illumination parameter comprises the brightness and aperture radius of each light source in the illuminating device;
the central control processor calculates a space adjustment parameter Ki according to the environment parameters uploaded by the space detection device; the central control processor establishes a space coordinate system f(x, y, z) with the position of the space detection device as the coordinate origin, takes the plane formed by the x-axis and the y-axis as a reference plane, and divides the reference plane into a preset number n of areas; for the i-th area, where i takes values from 1 to n, the space adjustment parameter Ki of the area is calculated according to the following formula,
[formula for Ki — rendered as an image in the original publication; it is expressed in terms of Si, H, V0, Li and L0 as defined below]
wherein Si represents the area of the i-th area, H represents the space height of the space where the dredging robot is located, V0 represents the preset space volume, Li represents the average distance from the coordinate origin to the edge of the i-th area, and L0 represents the preset average distance;
the camera device detects the brightness Dh of the environment where the desilting robot is located when the desilting robot runs, space contrast parameters K01 and K02 are preset in the central control processor, wherein K01 is smaller than K02, the central control processor determines the illumination parameter when the corresponding light source in the illuminating device illuminates the area according to the ith area space adjustment parameter Ki when the camera device shoots,
when Ki is larger than or equal to K02, the central control processor determines the brightness of the corresponding light source in the lighting device when it illuminates the i-th area as D, and sets
[formula for D — rendered as an image in the original publication];
the central control processor determines the aperture radius of the corresponding light source when it illuminates the i-th area as R, and sets
[formula for R — rendered as an image in the original publication];
when Ki is larger than or equal to K01 and smaller than K02, the central control processor determines the brightness of the corresponding light source in the lighting device when it illuminates the i-th area as D and sets D = D0, and determines the aperture radius of the corresponding light source when it illuminates the i-th area as R and sets R = R0;
when Ki is less than K01, the central control processor determines the brightness of the corresponding light source in the lighting device when it illuminates the i-th area as D, and sets
[formula for D — rendered as an image in the original publication];
the central control processor determines the aperture radius of the corresponding light source when it illuminates the i-th area as R, and sets
[formula for R — rendered as an image in the original publication];
Wherein D0 represents the preset brightness of the lighting device, and R0 represents the preset radius of the aperture;
the central control processor is provided with a gray value difference contrast parameter ΔGpd, ΔGpd > 0; when performing image processing on image information whose definition does not meet the standard in order to identify the sludge contour in the image information, the central control processor performs graying processing on the image information to obtain a gray image, calculates the average gray value Gps of the gray image, divides the gray image evenly into m regions, calculates in turn the average gray value Gpz of the image in each region of the gray image, and calculates the difference ΔGp = Gps − Gpz between each Gpz and Gps to judge whether to mark the corresponding region, where for the z-th region in the gray image, z = 1, 2, 3, …, m:
When the delta Gp is larger than the delta Gpd, the central control processor marks the z-th area and extracts the marked area;
when the delta Gp is less than or equal to the delta Gpd, the central control processor does not mark the z-th area.
2. The illumination system for the self-propelled amphibious dredging robot according to claim 1, wherein the camera device collects image information of an environment where the dredging robot is located when the dredging robot is running, the central control processor compares definition Qi of the image information with preset definition Q0 when analyzing the image information collected from the ith area,
when Qi is larger than or equal to Q0, the central control processor judges that the definition of the image information of the region acquired by the camera device meets the standard, and extracts the contour characteristics of the sludge region in the region;
and when Qi is less than Q0, the central control processor judges that the definition of the image information of the region acquired by the camera device does not meet the standard, and the central control processor calculates the brightness of the image information.
3. The illumination system for the self-propelled amphibious dredging robot according to claim 2, wherein the central control processor obtains RGB values of the image information when determining that the definition of the image information of a single region collected by the camera does not meet a standard so as to calculate brightness Y of the image information and compare Y with preset brightness Y0,
when Y is less than Y0, the central control processor judges that the illumination parameter of the area does not meet the requirement, and the central control processor adjusts the illumination parameter of the lighting device according to the definition Q of the image information of the area;
and when Y is larger than or equal to Y0, the central control processor judges that the illumination parameters for the area meet the requirements, and the central control processor performs image processing on the image information with the definition not meeting the standard so as to identify the sludge contour in the image information.
4. The illumination system for the self-propelled amphibious dredging robot according to claim 3, wherein the central control processor calculates a difference Δ Q between the definition Qi of the image information of a single region and a preset definition Q0 when the illumination parameter for the region is determined to be not satisfactory, and adjusts the illumination parameter when the region is illuminated by the corresponding light source in the illumination device according to Δ Q, setting Δ Q = Q0-Qi,
when Δ Q is not greater than Δ Q1, the central processor adjusts the brightness of the area illuminated by the corresponding light source in the illumination device to D 'by using α 1, sets D' = D × α 1, and adjusts the aperture radius of the area illuminated by the corresponding light source in the illumination device to R ', sets R' = R × α 1;
when ΔQ1 < ΔQ ≤ ΔQ2, the central control processor adjusts the brightness of the area illuminated by the corresponding light source in the illumination device to D′ using α2, setting D′ = D × α2, and adjusts the aperture radius of the area illuminated by the corresponding light source in the illumination device to R′, setting R′ = R × α2;
when Δ Q2 < Δ Q, the central processor adjusts the brightness of the area illuminated by the corresponding light source in the illumination device to D 'by using α 3, sets D' = D × α 3, adjusts the aperture radius of the area illuminated by the corresponding light source in the illumination device to R ', sets R' = R × α 3;
wherein ΔQ1 is a first preset definition difference, ΔQ2 is a second preset definition difference, α1 is a first adjustment coefficient, α2 is a second adjustment coefficient, α3 is a third adjustment coefficient, ΔQ1 < ΔQ2, and 1 < α1 < α2 < α3 < 1.5.
5. The illumination system for the self-propelled amphibious dredging robot as claimed in claim 4, wherein the central control processor performs binarization processing on image information corresponding to the marking region to obtain binarized images, the central control processor adjusts binarization threshold values of the binarized images to identify body contours in the binarized images during adjustment,
in the process of adjusting the binarization threshold value, the central control processor sequentially extracts the body contour in each binarization image, and for a single binarization image, if the body contour existing in the binarization image is kept unchanged in a preset binarization threshold value change interval (E1, E2), the central control processor extracts the body contour to serve as an identified body contour;
the central control processor sequentially marks each identified profile in the gray level image and respectively verifies each identified profile, wherein:
the central control processor sequentially reduces each identification body profile by a preset proportion parameter B1 to form a plurality of first identification body profiles so as to respectively determine an annular area formed by each first identification body profile and the corresponding identification body profile and calculate a gray average value G1 of the annular area;
the central control processor sequentially amplifies each identification body profile by a preset proportion parameter B2 to form a plurality of second identification body profiles so as to respectively determine an annular area formed by each second identification body profile and the corresponding identification body profile and calculate a gray average value G2 of the annular area;
the central control processor calculates the gray difference parameter H of the two annular areas according to the following formula:
[formula for H in terms of G1 and G2 — rendered as an image in the original publication]
wherein 0.8 < B1 < 1 < B2 < 1.2.
6. The illumination system for the self-propelled amphibious dredging robot of claim 5, wherein the central control processor judges whether the identified body profile is the profile of a sludge area according to the gray difference parameter H:
when H − 1 > Hd, the central control processor judges that the identified body profile is a sludge area profile;
when H − 1 ≤ Hd, the central control processor judges that the identified body profile is not a sludge area profile;
wherein Hd is a preset gray difference contrast parameter.
7. The illumination system for a self-propelled amphibious dredging robot according to claim 6, wherein when the central control processor determines that an identified body profile is a sludge area profile, the dredging robot pre-touches the material inside the identified body profile with a preset force before dredging and determines the type of object inside the profile according to the received feedback force F,
when F is less than or equal to F0, the central control processor judges that the object in the identified body profile is sludge;
when F > F0, the central control processor determines that the object in the identified profile is an obstacle.
CN202211314813.9A 2022-10-26 2022-10-26 A lighting system for amphibious desilting robot of self-propelled Active CN115397073B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211314813.9A CN115397073B (en) 2022-10-26 2022-10-26 A lighting system for amphibious desilting robot of self-propelled


Publications (2)

Publication Number Publication Date
CN115397073A CN115397073A (en) 2022-11-25
CN115397073B true CN115397073B (en) 2023-03-24


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022205525A1 (en) * 2021-04-01 2022-10-06 江苏科技大学 Binocular vision-based autonomous underwater vehicle recycling guidance false light source removal method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2555148A1 (en) * 2006-08-02 2008-02-02 Mcgill University Amphibious robotic device
CN102169580B (en) * 2011-04-08 2012-10-10 中国船舶重工集团公司第七○二研究所 Self-adaptive image processing method utilizing image statistic characteristics
CN107701993A (en) * 2017-10-19 2018-02-16 中信重工开诚智能装备有限公司 A kind of adaptive illuminator for underwater robot
CN107808161B (en) * 2017-10-26 2020-11-24 江苏科技大学 Underwater target identification method based on optical vision
CN108693535B (en) * 2018-04-03 2021-05-18 中信重工开诚智能装备有限公司 Obstacle detection system and method for underwater robot
CN213062166U (en) * 2020-07-31 2021-04-27 南京信息工程大学 Autonomous and remote control type underwater dredging robot
CN112067555B (en) * 2020-11-12 2021-03-30 山东海德智能科技有限公司 Part detection system capable of automatically visually identifying part types
CN114991298B (en) * 2022-06-23 2023-06-06 华中科技大学 Urban drainage pipeline detection and dredging intelligent robot and working method



Similar Documents

Publication Publication Date Title
US11324095B2 (en) Automatic stage lighting tracking system and a control method therefor
KR102065975B1 (en) Safety Management System Using a Lidar for Heavy Machine
CN108693535B (en) Obstacle detection system and method for underwater robot
CN109753081B (en) Roadway inspection unmanned aerial vehicle system based on machine vision and navigation method
CN107797560B (en) Visual recognition system and method for robot tracking
CN110189375B (en) Image target identification method based on monocular vision measurement
KR20070012118A (en) Robot having function of recognizing image and leading system for thereof
CN110120074B (en) Cable positioning method for live working robot in complex environment
CN110827361B (en) Camera group calibration method and device based on global calibration frame
CN112001917A (en) Machine vision-based geometric tolerance detection method for circular perforated part
US20070206182A1 (en) Surface Defect Inspecting Method And Device
CN113624225B (en) Pose resolving method for mounting engine positioning pins
US20230186516A1 (en) Method and flat bed machine tool for detecting a fitting position of a supporting bar
CN103363898B (en) Container is to boxes detecting device
WO2020086698A1 (en) Methods and systems used to measure tire treads
CN104657702B (en) Eyeball arrangement for detecting, pupil method for detecting and iris discrimination method
CN115397073B (en) A lighting system for amphibious desilting robot of self-propelled
CN110320523B (en) Target positioning device and method for following robot
CN104858877B (en) High-tension line drop switch changes the control method of control system automatically
KR101136743B1 (en) Position measuring device having distance and angle measuring function
CN113109762B (en) Optical vision guiding method for AUV (Autonomous Underwater Vehicle) docking recovery
CN117008622A (en) Visual robot underwater target identification tracking method and underwater visual robot thereof
CN111935866B (en) Navigation mark brightness adjusting method, device and system based on light pollution evaluation
US11132579B2 (en) Contour recognition device, contour recognition system and contour recognition method
CN114518079A (en) Hole internal feature detection system and detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant