CN116883491A - Adjustment distance determining method, device, computer equipment and storage medium - Google Patents

Adjustment distance determining method, device, computer equipment and storage medium

Info

Publication number
CN116883491A
CN116883491A
Authority
CN
China
Prior art keywords
sampling
pixel point
image
matching
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310683129.6A
Other languages
Chinese (zh)
Inventor
张雨浓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bank of China Ltd
Original Assignee
Bank of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bank of China Ltd filed Critical Bank of China Ltd
Priority to CN202310683129.6A
Publication of CN116883491A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to an adjustment distance determining method, an adjustment distance determining apparatus, computer equipment, and a storage medium. The method comprises the following steps: acquiring an initial sampling image and, for each sampling pixel point in the initial sampling image, acquiring matching pixel points corresponding to the sampling pixel point from an initial image; determining a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point based on the relative positions between the matching pixel points and the sampling pixel point; determining a sampling pixel value of the sampling pixel point based on the first sampling pixel point and the second sampling pixel point; obtaining a target image corresponding to the initial image based on each sampling pixel point in the initial sampling image and the sampling pixel value corresponding to each sampling pixel point; calculating a separation distance between the target object and the target device based on the target image; and determining an adjustment distance of the target object based on the separation distance, the adjustment distance being used to prompt the target object to adjust its position. The method can improve the efficiency of determining the adjustment distance.

Description

Adjustment distance determining method, device, computer equipment and storage medium
Technical Field
The present application relates to the field of artificial intelligence, and in particular, to a method, an apparatus, a computer device, a storage medium, and a computer program product for determining an adjustment distance.
Background
With the development of artificial intelligence technology, bank business halls have fully entered the intelligent age. Intelligent counters deployed in a bank business hall enable intelligent handling of banking business; when an intelligent counter handles business for a user, a user image is acquired through an installed camera.
In conventional technology, the intelligent counter processes the acquired user image to determine the adjustment distance the user needs to move. The time taken to determine this distance is long, resulting in low efficiency of adjustment distance determination.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an adjustment distance determination method, apparatus, computer device, computer-readable storage medium, and computer program product that can improve the adjustment distance determination efficiency.
In a first aspect, the present application provides an adjustment distance determining method. The method comprises the following steps:
Acquiring an initial sampling image, and acquiring a matching pixel point corresponding to each sampling pixel point in the initial sampling image from the initial image; the initial image is obtained by shooting a target object by target equipment;
determining a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point based on the relative positions between the matching pixel point and the sampling pixel point;
determining a sampling pixel value of the sampling pixel point based on the first sampling pixel point and the second sampling pixel point;
obtaining a target image corresponding to the initial image based on each sampling pixel point in the initial sampling image and the sampling pixel value corresponding to each sampling pixel point;
calculating a separation distance between the target object and the target device based on the target image;
determining an adjustment distance of the target object based on the separation distance; the adjustment distance is used for prompting the target object to adjust the position.
In one embodiment, the obtaining the matching pixel point corresponding to the sampling pixel point from the initial image includes:
acquiring sampling positions of the sampling pixel points;
And acquiring four matched pixel points adjacent to the sampling pixel point from the initial image based on the sampling position.
In one embodiment, the determining the first sampling pixel point and the second sampling pixel point corresponding to the sampling pixel point based on the relative positions between the matching pixel point and the sampling pixel point includes:
dividing the matched pixel points into a first combination and a second combination; two matched pixel points in the first combination are adjacent left and right, and two matched pixel points in the second combination are adjacent left and right;
determining the first sampling pixel point based on the two matching pixel points and the sampling pixel point in the first combination;
the second sampled pixel point is determined based on the two matched pixel points and the sampled pixel point in the second combination.
In one embodiment, the two matching pixels in the first combination are a first matching pixel and a second matching pixel, respectively; the determining the first sampling pixel based on the two matching pixels and the sampling pixel in the first combination includes:
acquiring a first matching coordinate of the first matching pixel point and a second matching coordinate of the second matching pixel point; the first matching coordinates and the second matching coordinates comprise coordinate values of the first coordinates and coordinate values of the second coordinates;
Determining a first sampling coordinate of the first sampling pixel point based on the coordinate value of a second coordinate in the first matching coordinate and the coordinate value of a first coordinate in the sampling coordinates of the sampling pixel point;
determining a first weight of the first matched pixel point and a second weight of the second matched pixel point based on the first matched coordinate, the second matched coordinate and the first sampling coordinate;
and fusing the first pixel value of the first matched pixel point and the second pixel value of the second matched pixel point based on the first weight value and the second weight value to obtain a first sampling pixel value of the first sampling pixel point.
In one embodiment, the determining the first weight of the first matched pixel and the second weight of the second matched pixel based on the first matched coordinate, the second matched coordinate, and the first sampled coordinate includes:
determining a distance between the first matching pixel point and the second matching pixel point based on the first matching coordinate and the second matching coordinate;
determining a first distance between the first matching pixel point and the first sampling pixel point based on the first matching coordinate and the first sampling coordinate, and determining the first weight of the first matching pixel point based on the first distance and the distance;
and determining a second distance between the second matching pixel point and the first sampling pixel point based on the second matching coordinate and the first sampling coordinate, and determining the second weight of the second matching pixel point based on the second distance and the distance.
In one embodiment, the calculating the separation distance between the target object and the target device based on the target image includes:
determining an avatar area in the target image;
and determining the separation distance between the target object and the target equipment by using a positioning model algorithm based on the head portrait area in the target image.
In one embodiment, the calculating a separation distance between the target object and the target device based on the target image includes:
performing uniform brightness treatment on the target image to obtain a brightness image;
denoising the brightness image to obtain a denoised image;
performing edge detection processing on the denoising image to obtain an edge image;
and calculating the separation distance between the target object and the target device based on the edge image.
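A minimal sketch of the four preprocessing stages named in this embodiment, on a grayscale image stored as a list of lists. The concrete operators chosen here (a global mean shift for uniform brightness, a 3x3 mean filter for denoising, and the Sobel operator for edge detection) are illustrative assumptions; the patent names only the stages, not particular filters:

```python
def equalize_brightness(img, target=128.0):
    """Shift the image so its mean intensity equals `target` (uniform brightness)."""
    flat = [v for row in img for v in row]
    shift = target - sum(flat) / len(flat)
    return [[min(255.0, max(0.0, v + shift)) for v in row] for row in img]

def mean_filter(img):
    """3x3 mean filter (denoising); border pixels are left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return out

def sobel_edges(img, threshold=100.0):
    """Binary edge map from the Sobel gradient magnitude."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = 1 if (gx * gx + gy * gy) ** 0.5 > threshold else 0
    return out
```

Chaining `sobel_edges(mean_filter(equalize_brightness(img)))` yields the edge image from which the separation distance would then be computed.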
In a second aspect, the application further provides an adjustment distance determining device. The device comprises:
The acquisition module is used for acquiring an initial sampling image and, for each sampling pixel point in the initial sampling image, acquiring the matching pixel points corresponding to the sampling pixel point from the initial image; the initial image is obtained by shooting a target object by target equipment;
the first determining module is used for determining a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point based on the relative positions between the matching pixel point and the sampling pixel point;
a second determining module, configured to determine a sampling pixel value of the sampling pixel point based on the first sampling pixel point and the second sampling pixel point;
the generation module is used for obtaining a target image corresponding to the initial image based on each sampling pixel point in the initial sampling image and the sampling pixel value corresponding to each sampling pixel point;
a calculation module for calculating a separation distance between the target object and the target device based on the target image;
the adjusting module is used for determining the adjusting distance of the target object based on the interval distance; the adjustment distance is used for prompting the target object to adjust the position.
In a third aspect, the present application also provides a computer device comprising a memory storing a computer program and a processor implementing the steps of the method of any one of the first aspects when the computer program is executed by the processor.
In a fourth aspect, the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any of the first aspects.
In a fifth aspect, the application also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the method of any of the first aspects.
According to the adjustment distance determining method, apparatus, computer device, storage medium, and computer program product described above, an initial sampling image is acquired; for each sampling pixel point in the initial sampling image, the matching pixel points corresponding to the sampling pixel point are acquired from the initial image; a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point are determined based on the relative positions of the matching pixel points and the sampling pixel point; the sampling pixel value of the sampling pixel point is determined based on the first sampling pixel point and the second sampling pixel point; and the target image corresponding to the initial image is obtained based on each sampling pixel point in the initial sampling image and its corresponding sampling pixel value. Through these steps, the initial image is converted into a corresponding target image that retains the content of the initial image but contains fewer pixel points. The separation distance between the target object and the target device is then determined by processing this target image with fewer pixel points, which shortens the processing time and thus improves the efficiency of determining the adjustment distance, and the adjustment distance of the target object is determined based on the separation distance.
Drawings
FIG. 1 is a diagram of an application environment for a method of adjusting distance determination in one embodiment;
FIG. 2 is a flow chart of a method for adjusting distance determination in one embodiment;
FIG. 3 is a flowchart illustrating steps for determining a first sampling pixel and a second sampling pixel in one embodiment;
FIG. 4 is a schematic diagram of matching pixels and sampling pixels in one embodiment;
FIG. 5 is a flowchart illustrating steps for determining the first sampling pixel point in one embodiment;
FIG. 6 is a flowchart illustrating steps for determining the first weight and the second weight in one embodiment;
FIG. 7 is a flow diagram of a step of determining a separation distance in one embodiment;
FIG. 8 is a block diagram of an adjustment distance determination device in one embodiment;
fig. 9 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The adjustment distance determining method provided in the embodiments of the present application can be applied to the application environment shown in fig. 1, in which the terminal 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process; it may be integrated on the server 104 or located on a cloud or other network server. The terminal and the server may each independently execute the adjustment distance determining method provided in the embodiments of the present application, or they may cooperate to execute it. For example, the terminal acquires an initial sampling image; for each sampling pixel point in the initial sampling image, acquires the matching pixel points corresponding to the sampling pixel point from the initial image; determines a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point based on the relative positions between the matching pixel points and the sampling pixel point; determines the sampling pixel value of the sampling pixel point based on the first sampling pixel point and the second sampling pixel point; obtains the target image corresponding to the initial image based on each sampling pixel point in the initial sampling image and its corresponding sampling pixel value; calculates the separation distance between the target object and the target device based on the target image; and determines the adjustment distance of the target object based on the separation distance, the adjustment distance being used to prompt the target object to adjust its position. The terminal 102 may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, Internet-of-Things devices, and portable wearable devices.
The server 104 may be implemented as a stand-alone server or as a server cluster of multiple servers.
In one embodiment, as shown in fig. 2, a method for determining an adjustment distance is provided, and this embodiment is described by taking the application of the method to a computer device as an example, and includes steps 202 to 212.
Step 202, acquiring an initial sampling image, and acquiring matching pixel points corresponding to sampling pixel points from the initial image aiming at each sampling pixel point in the initial sampling image; the initial image is obtained by photographing the target object by the target device.
The initial sampling image is a preset sampling image, the size of the initial sampling image is the same as that of the initial image, but the number of sampling pixels in the initial sampling image is smaller than that of the pixels in the initial image, and the initial sampling image comprises a plurality of sampling pixels. The sampling pixel points refer to pixel points in the initial sampling image, the sampling positions of the sampling pixel points are known data, but the pixel values of the sampling pixel points are unknown data. The initial image refers to an image obtained by shooting a target object by a target device. The matching pixel points refer to pixel points adjacent to sampling positions of sampling pixel points in the initial image. The number of matching pixels is at least two. The target device is a camera mounted on the intelligent counter. The target object refers to a customer transacting business on the intelligent counter.
Illustratively, the computer device acquires an initial image through the camera, then acquires an initial sampled image, and for each sampled pixel in the initial sampled image, acquires a matching pixel corresponding to the sampled pixel from the initial image.
In one embodiment, the computer device obtains an initial image through the camera, determines the initial size of the initial image, then obtains an initial sampling image corresponding to the initial size, and, for each sampling pixel point in the initial sampling image, obtains the matching pixel points corresponding to the sampling pixel point from the initial image.
Step 204, determining a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point based on the relative positions between the matching pixel point and the sampling pixel point.
The first sampling pixel point is the pixel point having the same Y-axis coordinate value as two of the matching pixel points and the same X-axis coordinate value as the sampling pixel point; the first sampling pixel value of the first sampling pixel point is determined by the pixel values of those two matching pixel points and the distances between them and the first sampling pixel point. The second sampling pixel point is the pixel point having the same Y-axis coordinate value as the other two matching pixel points and the same X-axis coordinate value as the sampling pixel point; the second sampling pixel value of the second sampling pixel point is determined by the pixel values of the other two matching pixel points and the distances between them and the second sampling pixel point.
The computer device determines first and second sampling pixels corresponding to the sampling pixels based on the relative positions between the matching pixels and the sampling pixels.
In step 206, a sampled pixel value of the sampled pixel is determined based on the first sampled pixel and the second sampled pixel.
The sampling pixel value refers to the pixel value of the sampling pixel point.
The computer device determines a sampled pixel value of the sampled pixel point based on a first sampled pixel value of the first sampled pixel point and a second sampled pixel value of the second sampled pixel point, and a first separation distance between the first sampled pixel point and the sampled pixel point, and a second separation distance between the second sampled pixel point and the sampled pixel point.
In one embodiment, the computer device calculates the total separation distance between the first sampling pixel point and the second sampling pixel point, then calculates a first separation distance between the first sampling pixel point and the sampling pixel point and a second separation distance between the second sampling pixel point and the sampling pixel point, determines the ratio of the first separation distance to the total separation distance as a first coefficient of the first sampling pixel point, determines the ratio of the second separation distance to the total separation distance as a second coefficient of the second sampling pixel point, multiplies the first sampling pixel value of the first sampling pixel point by the first coefficient, multiplies the second sampling pixel value of the second sampling pixel point by the second coefficient, and sums the two products to obtain the sampling pixel value of the sampling pixel point.
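The fusion of the first and second sampling pixel values described above is a one-dimensional linear interpolation. A sketch under the conventional assignment, in which each value is weighted by the ratio of the opposite distance to the total distance so that the nearer point contributes more:

```python
def interpolate(v1, d1, v2, d2):
    """Fuse two pixel values by distance-based weights.

    v1, v2: pixel values of the first and second sampling pixel points;
    d1, d2: their respective distances to the sampling pixel point.
    Conventional linear interpolation weights each value by the opposite
    distance over the total, so the nearer point contributes more.
    """
    total = d1 + d2
    if total == 0:       # both points coincide with the sampling pixel point
        return v1
    w1 = d2 / total      # weight of v1 grows as its point gets closer (d1 small)
    w2 = d1 / total
    return v1 * w1 + v2 * w2
```

For example, a point three times closer to the first sampling pixel point takes three quarters of its value from it.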
Step 208, obtaining a target image corresponding to the initial image based on each sampling pixel point in the initial sampling image and the sampling pixel value corresponding to each sampling pixel point.
The target image is an image containing the image content in the initial image, but fewer pixels than the pixels in the initial image. It is understood that the target image is an image obtained by downsampling the initial image.
The computer device determines a sampling position of each sampling pixel point in the initial sampling image as a position of each target pixel point in the target image, determines a sampling pixel value of each sampling pixel point as a target pixel value of a corresponding target pixel point in the target image, and obtains a target image corresponding to the initial image.
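Taken together, steps 202 to 208 amount to bilinear downsampling of the initial image. A compact sketch, assuming the sampling positions are laid out on a uniform grid over the initial image:

```python
def bilinear_downsample(img, out_h, out_w):
    """Resample `img` (a list of rows of pixel values) to out_h x out_w
    by bilinear interpolation.

    For each sampling pixel point, the four adjacent matching pixel points
    are fetched, the two horizontal pairs are interpolated first (yielding
    the first and second sampling pixel points), and the two results are
    then fused vertically.
    """
    h, w = len(img), len(img[0])
    out = []
    for j in range(out_h):
        y = j * (h - 1) / max(out_h - 1, 1)      # sampling position (Y)
        y0 = min(int(y), h - 2)
        fy = y - y0
        row = []
        for i in range(out_w):
            x = i * (w - 1) / max(out_w - 1, 1)  # sampling position (X)
            x0 = min(int(x), w - 2)
            fx = x - x0
            # first/second combinations: two horizontally adjacent neighbours each
            r1 = img[y0][x0] * (1 - fx) + img[y0][x0 + 1] * fx
            r2 = img[y0 + 1][x0] * (1 - fx) + img[y0 + 1][x0 + 1] * fx
            row.append(r1 * (1 - fy) + r2 * fy)
        out.append(row)
    return out
```

The output retains the content of the input at every sampling position while containing fewer pixel points, which is the property the method relies on for faster downstream processing.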
Step 210, calculating a separation distance between the target object and the target device based on the target image.
The separation distance refers to the straight-line distance between the target object and the target device.
The computer device processes the target image and determines the separation distance between the target object and the target device.
Step 212, determining an adjustment distance of the target object based on the separation distance; the adjustment distance is used for prompting the target object to adjust the position.
The adjustment distance refers to the distance by which the target object needs to adjust its position. The adjustment distance comprises at least one of a front-back adjustment distance, an up-down adjustment distance, and a left-right adjustment distance.
The computer device determines an adjustment distance of the target object from the separation distance, and prompts the target object to adjust the position based on the adjustment distance.
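The patent does not fix a formula for deriving the adjustment distance from the separation distance; the following is a hypothetical front-back example only, assuming a preset optimal distance and tolerance band (both parameters are invented for illustration):

```python
def adjustment_distance(separation, optimal=0.5, tolerance=0.05):
    """Hypothetical front-back adjustment distance (metres).

    `optimal` and `tolerance` are assumed parameters, not values from the
    patent. Positive result: the target object should move backward
    (away from the device); negative: move forward; zero: no prompt needed.
    """
    delta = optimal - separation
    return 0.0 if abs(delta) <= tolerance else round(delta, 3)
```

A counter would then turn a non-zero result into a prompt such as "please step back 0.2 m".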
In this embodiment, an initial sampling image is acquired; for each sampling pixel point in the initial sampling image, the matching pixel points corresponding to the sampling pixel point are acquired from the initial image; the first sampling pixel point and the second sampling pixel point corresponding to the sampling pixel point are determined based on the relative positions of the matching pixel points and the sampling pixel point; the sampling pixel value of the sampling pixel point is determined based on the first sampling pixel point and the second sampling pixel point; and the target image corresponding to the initial image is obtained based on each sampling pixel point in the initial sampling image and its corresponding sampling pixel value. Through these steps, the initial image is converted into a corresponding target image that retains the content of the initial image but contains fewer pixel points. The separation distance between the target object and the target device is then determined by processing this smaller target image, which improves the efficiency of determining the adjustment distance, and the adjustment distance of the target object is determined based on the separation distance.
In one embodiment, obtaining a matching pixel point corresponding to a sampling pixel point from an initial image includes:
acquiring sampling positions of sampling pixel points; based on the sampling position, four matching pixel points adjacent to the sampling pixel point are acquired from the initial image.
The sampling position refers to the coordinates of the sampling pixel point in the initial sampling image.
The computer device obtains sampling positions of sampling pixels, performs rounding processing on the sampling positions to obtain target sampling positions, determines four original pixels adjacent to the sampling pixels based on the target sampling positions and original positions of all original pixels in the initial image, and determines the four original pixels adjacent to the sampling pixels as four matching pixels corresponding to the sampling pixels.
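The neighbour lookup described above can be sketched as follows; the clamp for samples on the last row or column is an added assumption to keep all four matching pixel points inside the image:

```python
import math

def matching_pixels(sx, sy, w, h):
    """Return the four matching pixel coordinates adjacent to sampling
    position (sx, sy) in a w x h initial image.

    The sampling position is floored (rounded down) to get the top-left
    neighbour; clamping keeps all four neighbours inside the image.
    """
    x0 = min(math.floor(sx), w - 2)
    y0 = min(math.floor(sy), h - 2)
    # first combination: (x0, y0) and (x0+1, y0); second: (x0, y0+1) and (x0+1, y0+1)
    return [(x0, y0), (x0 + 1, y0), (x0, y0 + 1), (x0 + 1, y0 + 1)]
```

The first two returned coordinates form the first combination (left-right adjacent) and the last two form the second combination, matching the grouping used in the next embodiment.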
In this embodiment, four matching pixel points corresponding to the sampling pixel point are obtained from the initial image, which can be understood as concentrating the image features represented by the four matching pixel points onto the corresponding sampling pixel point, thereby reducing the number of pixel points in the target image.
In one embodiment, as shown in fig. 3, determining a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point based on a relative position between the matching pixel point and the sampling pixel point includes:
Step 302, dividing the matched pixel points into a first combination and a second combination; two matched pixels in the first combination are adjacent to each other left and right, and two matched pixels in the second combination are adjacent to each other left and right.
The first combination refers to a set formed by two left and right adjacent matched pixel points. The second combination refers to a set of two other left and right adjacent matched pixel points. For example, as shown in fig. 4, P is a sampling pixel point, Q1, Q2, Q3 and Q4 are four matching pixels corresponding to the sampling pixel point P, R1 is a first sampling pixel point, R2 is a second sampling pixel point, where Q1 and Q2 are adjacent left and right, and Q3 and Q4 are adjacent left and right, then Q1 and Q2 are divided into a first combination, and Q3 and Q4 are divided into a second combination.
Illustratively, the computer device obtains matching locations of matching pixels, divides two matching pixels in the matching locations that have the same Y-axis coordinate values into a first combination, and divides another two matching pixels in the matching locations that have the same Y-axis coordinate values into a second combination.
Step 304, determining the first sampling pixel point based on the two matching pixel points in the first combination and the sampling pixel point.
The computer device determines a first sampling position of the first sampling pixel based on the matching positions of the two matching pixels in the first combination and the sampling position of the sampling pixel, and determines a first sampling pixel value of the first sampling pixel based on the matching pixel values of the two matching pixels in the first combination.
Step 306, determining a second sampling pixel point based on the two matching pixel points and the sampling pixel point in the second combination.
The computer device determines a second sampling position of the second sampling pixel based on the matching positions of the two matching pixels in the second combination and the sampling positions of the sampling pixels, and determines a second sampling pixel value of the second sampling pixel based on the matching pixel values of the two matching pixels in the second combination.
In this embodiment, the first sampling pixel point is determined from the two matching pixel points in the first combination and the sampling pixel point, and the second sampling pixel point is determined from the two matching pixel points in the second combination and the sampling pixel point, providing accurate basic data for the subsequent determination of the sampling pixel value.
In one embodiment, as shown in fig. 5, the two matching pixel points in the first combination are respectively a first matching pixel point and a second matching pixel point; determining the first sampling pixel point based on the two matching pixel points in the first combination and the sampling pixel point includes:
step 502, obtaining a first matching coordinate of a first matching pixel point and a second matching coordinate of a second matching pixel point; the first matching coordinate and the second matching coordinate each include a coordinate value of the first coordinate and a coordinate value of the second coordinate.
The first matching coordinates refer to coordinates of a position of the first matching pixel point in the initial image. The first matching coordinates are composed of coordinate values of two dimensions, namely coordinate values of a first coordinate and coordinate values of a second coordinate, and the first coordinate is an X-axis coordinate and the second coordinate is a Y-axis coordinate. The coordinate value of the first coordinate refers to the value of the X-axis coordinate. The coordinate value of the second coordinate means the value of the Y-axis coordinate.
Illustratively, the computer device obtains a first matching coordinate of the first matching pixel point and a second matching coordinate of the second matching pixel point.
In step 504, the first sampling coordinate of the first sampling pixel point is determined based on the coordinate value of the second coordinate in the first matching coordinate and the coordinate value of the first coordinate in the sampling coordinate of the sampling pixel point.
The computer device determines the coordinate value of the first coordinate in the sampling coordinates of the sampling pixel point as the coordinate value of the first coordinate in the first sampling coordinates of the first sampling pixel point, determines the coordinate value of the second coordinate in the first matching coordinates as the coordinate value of the second coordinate in the first sampling coordinates of the first sampling pixel point, and obtains the first sampling coordinates of the first sampling pixel point.
Step 506, determining a first weight of the first matched pixel point and a second weight of the second matched pixel point based on the first matched coordinate, the second matched coordinate and the first sampling coordinate.
The first weight refers to the weight of the first matched pixel value of the first matched pixel point, and can be understood as the proportion by which the first matched pixel value of the first matched pixel point is multiplied during fusion.
The computer device determines the distance between the first matching sampling point and the second matching sampling point based on the first matching coordinate and the second matching coordinate, determines a first distance between the first matching sampling point and the first sampling pixel point based on the first matching coordinate and the first sampling coordinate, determines a second distance between the second matching sampling point and the first sampling pixel point based on the second matching coordinate and the first sampling coordinate, and determines the first weight of the first matched pixel point and the second weight of the second matched pixel point based on the distance, the first distance and the second distance.
Step 508, based on the first weight and the second weight, fusing the first pixel value of the first matched pixel point and the second pixel value of the second matched pixel point to obtain a first sampling pixel value of the first sampling pixel point.
The computer device multiplies the first pixel value of the first matching pixel point by the first weight value to obtain a first weight pixel value, multiplies the second pixel value of the second matching pixel point by the second weight value to obtain a second weight pixel value, and adds the second weight pixel value to the first weight pixel value to obtain a first sampling pixel value of the first sampling pixel point.
In one embodiment, the sampling pixel value of the sampling pixel point is calculated using the following formulas:

f(R1) = ((x2 − x_R1)/(x2 − x1)) · f(Q1) + ((x_R1 − x1)/(x2 − x1)) · f(Q2)

f(R2) = ((x4 − x_R2)/(x4 − x3)) · f(Q3) + ((x_R2 − x3)/(x4 − x3)) · f(Q4)

f(P) = ((y_R2 − y_P)/(y_R2 − y_R1)) · f(R1) + ((y_P − y_R1)/(y_R2 − y_R1)) · f(R2)

wherein f(R1) is the first sampling pixel value of the first sampling pixel point, f(R2) is the second sampling pixel value of the second sampling pixel point, f(P) is the sampling pixel value of the sampling pixel point, (x_R1, y_R1) is the first sampling coordinate of the first sampling pixel point, (x_R2, y_R2) is the second sampling coordinate of the second sampling pixel point, y_P is the Y-axis coordinate value of the sampling pixel point, x1 is the X-axis coordinate value of the first matching pixel point, x2 is the X-axis coordinate value of the second matching pixel point, x3 is the X-axis coordinate value of the third matching pixel point, x4 is the X-axis coordinate value of the fourth matching pixel point, f(Q1) is the first pixel value of the first matching pixel point, f(Q2) is the second pixel value of the second matching pixel point, f(Q3) is the third pixel value of the third matching pixel point, and f(Q4) is the fourth pixel value of the fourth matching pixel point.
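Assuming the formulas above follow standard bilinear interpolation, the three interpolation steps can be sketched numerically; all coordinates and pixel values below are illustrative, not taken from the patent:

```python
def lerp(t0, t1, v0, v1, t):
    """Linearly interpolate between value v0 at position t0 and v1 at t1."""
    return (t1 - t) / (t1 - t0) * v0 + (t - t0) / (t1 - t0) * v1

# Four matching pixel points on two rows (illustrative values).
x1, x2, y_top = 0.0, 1.0, 0.0        # first combination: Q1, Q2
x3, x4, y_bot = 0.0, 1.0, 1.0        # second combination: Q3, Q4
fQ1, fQ2, fQ3, fQ4 = 0.0, 10.0, 20.0, 30.0
xP, yP = 0.25, 0.5                   # sampling pixel point P

fR1 = lerp(x1, x2, fQ1, fQ2, xP)     # first sampling pixel value f(R1)
fR2 = lerp(x3, x4, fQ3, fQ4, xP)     # second sampling pixel value f(R2)
fP = lerp(y_top, y_bot, fR1, fR2, yP)  # sampling pixel value f(P)
```

Here `fR1` and `fR2` are the horizontal fusions of each combination, and `fP` fuses them vertically, mirroring steps 306 through 508.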
In this embodiment, the first sampling pixel point is determined based on the two matching pixel points and the sampling pixel point in the first combination, which can be understood as the first sampling pixel point representing the image features of the two matching pixel points; the first sampling pixel point thus provides accurate basic data for determining the sampling pixel value of the sampling pixel point.
In one embodiment, as shown in fig. 6, determining the first weight of the first matched pixel point and the second weight of the second matched pixel point based on the first matched coordinate, the second matched coordinate, and the first sampling coordinate includes:
step 602, determining a distance between the first matching sampling point and the second matching sampling point based on the first matching coordinate and the second matching coordinate.
Illustratively, the computer device subtracts the coordinate value of the X-axis coordinate of the first matching coordinate from the coordinate value of the X-axis coordinate of the second matching coordinate to obtain the distance between the first matching sampling point and the second matching sampling point.
Step 604, determining a first distance between the first matching sampling point and the first sampling pixel point based on the first matching coordinate and the first sampling coordinate, and determining a first weight of the first matching pixel point based on the first distance and the distance.
The computer device subtracts the coordinate value of the X-axis coordinate of the first sampling coordinate from the coordinate value of the X-axis coordinate of the first matching coordinate to obtain a first distance between the first matching sampling point and the first sampling pixel point, and divides the first distance by the distance to obtain a first weight of the first matching pixel point.
Step 606, determining a second distance between the second matching sampling point and the first sampling pixel point based on the second matching coordinate and the first sampling coordinate, and determining a second weight of the second matching pixel point based on the second distance and the distance.
The computer device subtracts the coordinate value of the X-axis coordinate of the first sampling coordinate from the coordinate value of the X-axis coordinate of the second matching coordinate to obtain a second distance between the second matching sampling point and the first sampling pixel point, and divides the second distance by the distance to obtain a second weight of the second matching pixel point.
In this embodiment, the first weight of the first matching pixel point is determined by the first distance between the first matching sampling point and the first sampling pixel point, and the second weight of the second matching pixel point is determined by the second distance between the second matching sampling point and the first sampling pixel point; that is, the closer a matching pixel point is to the first sampling pixel point, the greater the similarity between that matching pixel point and the first sampling pixel point, and the higher the weight corresponding to that matching pixel point.
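The weight computation of steps 602 to 606 can be sketched as follows, under the assumption that each weight is the distance to the opposite matching sampling point divided by the total distance — the convention consistent with the remark that a closer matching pixel point receives a higher weight (the function name is illustrative):

```python
def row_weights(x1, x2, x_s):
    """Weights for two horizontally adjacent matching pixels at x1 < x2,
    given a sampling x-coordinate x_s between them. Each weight is a
    distance divided by the total distance; using the distance to the
    *other* point makes the nearer matching pixel weigh more."""
    total = x2 - x1          # distance between the two matching sampling points
    d1 = x_s - x1            # first distance (to the first matching point)
    d2 = x2 - x_s            # second distance (to the second matching point)
    w1 = d2 / total          # first weight: large when x_s is near x1
    w2 = d1 / total          # second weight: large when x_s is near x2
    return w1, w2
```

The two weights always sum to 1, so fusing the two matched pixel values with them is a convex combination.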
In one embodiment, calculating the separation distance between the target object and the target device based on the target image comprises:
determining an avatar area in the target image; based on the head portrait region in the target image, a positioning model algorithm is used to determine the separation distance between the target object and the target device.
The head portrait region refers to the region of the target image that contains the head image of the target object. The positioning model algorithm is an algorithm for estimating the position and posture of a target in three-dimensional space; it generally performs target positioning and posture estimation through a series of mathematical and computational methods based on input sensor data (e.g., camera images, laser scan data, etc.) and a priori knowledge (e.g., a model of the target, an environment map, etc.).
Illustratively, the computer device performs image recognition on the target image, determines a head portrait region in the target image, and then determines a separation distance between the target object and the target device using a positioning model algorithm based on the head portrait region in the target image.
In this embodiment, the target image includes the content in the initial image, but the pixels in the target image are fewer than the pixels in the initial image, and the distance between the target object and the target device is determined by processing the target image with fewer pixels, so that the efficiency of determining the distance is improved.
In one embodiment, as shown in fig. 7, calculating the separation distance between the target object and the target device based on the target image includes:
step 702, performing uniform brightness processing on the target image to obtain a brightness image.
Uniform brightness processing refers to adjusting the brightness values of an image so that the brightness distribution of the whole image becomes more uniform, achieving a balanced visual effect; methods used in uniform brightness processing include, but are not limited to, histogram equalization, adaptive equalization and the like. The brightness image is the image obtained by performing uniform brightness processing on the target image.
Illustratively, the computer device performs uniform brightness processing on the target image to obtain a brightness image.
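As one concrete realization of step 702, histogram equalization (one of the methods named above) can be sketched in pure Python for a small grayscale image; the function name and list-of-rows representation are illustrative:

```python
def equalize_histogram(img, levels=256):
    """Histogram equalization for a grayscale image given as a list of rows."""
    flat = [p for row in img for p in row]
    n = len(flat)
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # Cumulative distribution of intensities.
    cdf, running = [], 0
    for count in hist:
        running += count
        cdf.append(running)
    cdf_min = next(c for c in cdf if c > 0)

    def remap(p):
        if n == cdf_min:
            return p  # constant image: nothing to equalize
        # Spread the used intensity range across [0, levels - 1].
        return round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))

    return [[remap(p) for p in row] for row in img]
```

A narrow cluster of gray values is stretched across the full range, which evens out the brightness distribution before the later denoising and edge-detection stages.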
Step 704, denoising the brightness image to obtain a denoised image.
The denoising process refers to a process of reducing or eliminating noise in an image, and methods used in the denoising process include, but are not limited to, gaussian filtering, mean filtering, bilateral filtering and the like. The denoising image is an image obtained by denoising a luminance image.
Illustratively, the computer device performs denoising processing on the luminance image to obtain a denoised image.
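As one concrete realization of step 704, Gaussian filtering (one of the methods named above) can be sketched with a fixed 3x3 kernel; the border handling by replication is an assumption, and real implementations typically offer several border modes:

```python
def gaussian_blur3(img):
    """Denoise with the common 3x3 Gaussian kernel (1/16 of [[1,2,1],[2,4,2],[1,2,1]]),
    replicating edge pixels so the output has the same size as the input."""
    k = [[1, 2, 1], [2, 4, 2], [1, 2, 1]]
    h, w = len(img), len(img[0])

    def px(y, x):
        # Clamp coordinates to replicate the border.
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    out = []
    for y in range(h):
        row = []
        for x in range(w):
            acc = sum(k[j][i] * px(y + j - 1, x + i - 1)
                      for j in range(3) for i in range(3))
            row.append(acc / 16)
        out.append(row)
    return out
```

An isolated bright pixel (noise) is spread and attenuated, while flat regions pass through unchanged.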
In step 706, edge detection processing is performed on the denoised image to obtain an edge image.
Edge detection processing refers to detecting the edges or contours of objects or scenes in an image; methods used in edge detection processing include, but are not limited to, Canny edge detection, the Sobel operator and the Roberts operator. Canny edge detection is a widely used edge detection algorithm that detects edges in an image through multiple steps; the Sobel operator convolves the image with 3x3 convolution kernels, calculates the gradients in the horizontal and vertical directions respectively, and then combines the gradients in the two directions to detect edges; the Roberts operator convolves the image with 2x2 convolution kernels and detects edges by calculating differences between adjacent pixels. The edge image is the image obtained by performing edge detection on the denoised image.
Illustratively, the computer device performs edge detection processing on the denoised image to obtain an edge image.
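As one concrete realization of step 706, the Sobel operator mentioned above can be sketched as follows, computing the gradient magnitude from the two 3x3 kernels; thresholding the magnitudes into a binary edge image is omitted, and the function name is illustrative:

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude with the Sobel operator.
    Border pixels are left at 0 since the 3x3 kernels need full neighborhoods."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical-gradient kernel
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

A sharp vertical step in intensity yields a strong response along the step and zero response in flat regions.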
Step 708, calculating a separation distance between the target object and the target device based on the edge image.
Illustratively, the computer device performs image recognition on the edge image, determines a head portrait region in the edge image, and then determines a separation distance between the target object and the target device using a positioning model algorithm based on the head portrait region in the edge image.
In this embodiment, the target image is subjected to uniform brightness processing, denoising processing and edge detection processing, where the target image includes content in the initial image, but pixels in the target image are fewer than those in the initial image, so that the duration of the uniform brightness processing, denoising processing and edge detection processing is shortened, and then the separation distance between the target object and the target device is determined based on the target image, so that the determination efficiency of the separation distance is improved.
In one exemplary embodiment, the adjustment distance determination method includes the steps of:
the intelligent counter acquires an initial image through the camera, acquires an initial sampling image, acquires sampling positions of sampling pixel points aiming at each sampling pixel point in the initial sampling image, performs rounding processing on the sampling positions to obtain a target sampling position, determines four original pixel points adjacent to the sampling pixel points based on the target sampling position and original positions of all original pixel points in the initial image, determines the four original pixel points adjacent to the sampling pixel points as four matching pixel points corresponding to the sampling pixel points, acquires matching positions of the matching pixel points, divides two matching pixel points with the same Y-axis coordinate value in the matching positions into a first combination, and divides the other two matching pixel points with the same Y-axis coordinate value in the matching positions into a second combination.
The two matched pixel points in the first combination are respectively taken as a first matched pixel point and a second matched pixel point. The intelligent counter acquires a first matched coordinate of the first matched pixel point and a second matched coordinate of the second matched pixel point, determines the coordinate value of the X-axis coordinate in the sampling coordinates of the sampling pixel point as the coordinate value of the X-axis coordinate in the first sampling coordinates of the first sampling pixel point, and determines the coordinate value of the Y-axis coordinate in the first matched coordinates as the coordinate value of the Y-axis coordinate in the first sampling coordinates of the first sampling pixel point, so as to obtain the first sampling coordinates of the first sampling pixel point. The intelligent counter subtracts the coordinate value of the X-axis coordinate of the first matching coordinate from the coordinate value of the X-axis coordinate of the second matching coordinate to obtain the distance between the first matching sampling point and the second matching sampling point, subtracts the coordinate value of the X-axis coordinate of the first sampling coordinate from the coordinate value of the X-axis coordinate of the first matching coordinate to obtain a first distance between the first matching sampling point and the first sampling pixel point, divides the first distance by the distance to obtain a first weight of the first matching pixel point, subtracts the coordinate value of the X-axis coordinate of the first sampling coordinate from the coordinate value of the X-axis coordinate of the second matching coordinate to obtain a second distance between the second matching sampling point and the first sampling pixel point, divides the second distance by the distance to obtain a second weight of the second matching pixel point, multiplies the first pixel value of the first matching pixel point by the first weight to obtain a first weight pixel value,
multiplies the second pixel value of the second matching pixel point by the second weight to obtain a second weight pixel value, and adds the second weight pixel value to the first weight pixel value to obtain the first sampling pixel value of the first sampling pixel point. With the above method, the second sampling pixel point is determined based on the two matching pixel points and the sampling pixel point in the second combination.
The intelligent counter calculates the total interval distance between the first sampling pixel point and the second sampling pixel point, then calculates a first interval distance between the first sampling pixel point and the sampling pixel point and a second interval distance between the second sampling pixel point and the sampling pixel point, determines the ratio of the first interval distance to the total interval distance as a first coefficient of the first sampling pixel point, determines the ratio of the second interval distance to the total interval distance as a second coefficient of the second sampling pixel point, multiplies the first sampling pixel value of the first sampling pixel point by the first coefficient, multiplies the second sampling pixel value of the second sampling pixel point by the second coefficient, and adds the two products to obtain the sampling pixel value of the sampling pixel point.
And the intelligent counter determines the sampling position of each sampling pixel point in the initial sampling image as the position of each target pixel point in the target image, and determines the sampling pixel value of each sampling pixel point as the target pixel value of the corresponding target pixel point in the target image, so as to obtain the target image corresponding to the initial image.
The intelligent counter performs uniform brightness processing on the target image by using histogram equalization to obtain a brightness image, performs denoising processing on the brightness image by using Gaussian filtering to obtain a denoising image, performs edge detection processing on the denoising image by using Canny edge detection to obtain an edge image, performs image recognition on the edge image, determines a head portrait area in the edge image, determines a spacing distance between the target object and target equipment by using a positioning model algorithm based on the head portrait area in the edge image, determines an adjustment distance of the target object according to the spacing distance, and prompts the target object to adjust the position based on the adjustment distance.
In this embodiment, an initial sampling image is acquired; for each sampling pixel point in the initial sampling image, a matching pixel point corresponding to the sampling pixel point is acquired from the initial image; a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point are determined based on the relative positions of the matching pixel point and the sampling pixel point; the sampling pixel value of the sampling pixel point is determined based on the first sampling pixel point and the second sampling pixel point; and the target image corresponding to the initial image is obtained based on each sampling pixel point in the initial sampling image and the sampling pixel value corresponding to each sampling pixel point. Through the above steps, the initial image is converted into the corresponding target image: the content of the initial image is contained in the target image, but the target image has fewer pixel points than the initial image. The separation distance between the target object and the target device is determined by processing the target image with fewer pixel points, which improves the determination efficiency of the separation distance, and the adjustment distance of the target object is then determined based on the separation distance.
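Taken together, the sampling steps of this embodiment amount to bilinear downsampling of the initial image; a compact pure-Python sketch, assuming a uniform sampling grid (the function name and grid mapping are illustrative, not the patent's exact implementation):

```python
def downsample(image, out_w, out_h):
    """Bilinear downsampling: for each sampling pixel point of the target
    grid, find four adjacent matching pixel points in the initial image and
    fuse them row-wise, then vertically. `image` is a list of rows of grays."""
    in_h, in_w = len(image), len(image[0])
    out = []
    for j in range(out_h):
        # Map the target row back into the initial image (sampling position).
        y = j * (in_h - 1) / max(out_h - 1, 1)
        y0 = min(int(y), in_h - 2)
        row = []
        for i in range(out_w):
            x = i * (in_w - 1) / max(out_w - 1, 1)
            x0 = min(int(x), in_w - 2)
            # Four matching pixel points, split into two row-wise combinations.
            q1, q2 = image[y0][x0], image[y0][x0 + 1]
            q3, q4 = image[y0 + 1][x0], image[y0 + 1][x0 + 1]
            fx, fy = x - x0, y - y0
            f_r1 = (1 - fx) * q1 + fx * q2          # first sampling pixel value
            f_r2 = (1 - fx) * q3 + fx * q4          # second sampling pixel value
            row.append((1 - fy) * f_r1 + fy * f_r2)  # sampling pixel value
        out.append(row)
    return out
```

Processing the smaller output grid instead of the full initial image is what shortens the subsequent brightness, denoising and edge-detection stages.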
It should be understood that, although the steps in the flowcharts of the embodiments described above are shown sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated. Unless explicitly stated herein, the execution of these steps is not strictly limited to this order, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts of the above embodiments may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments; their execution order is not necessarily sequential, and they may be performed in turn or alternately with at least some of the other steps, sub-steps or stages.
Based on the same inventive concept, the embodiment of the application also provides an adjustment distance determining device for realizing the adjustment distance determining method. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitation in the embodiments of the adjustment distance determining device or devices provided below may refer to the limitation of the adjustment distance determining method hereinabove, and will not be described herein.
In one embodiment, as shown in fig. 8, there is provided an adjustment distance determining apparatus including: an acquisition module 802, a first determination module 804, a second determination module 806, a generation module 808, a calculation module 810, and an adjustment module 812, wherein:
an obtaining module 802, configured to obtain an initial sampled image, and obtain, for each sampling pixel point in the initial sampled image, a matching pixel point corresponding to the sampling pixel point from the initial image; the initial image is obtained by photographing the target object by the target device.
The first determining module 804 is configured to determine a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point based on a relative position between the matching pixel point and the sampling pixel point.
The second determining module 806 is configured to determine a sampling pixel value of the sampling pixel point based on the first sampling pixel point and the second sampling pixel point.
The generating module 808 is configured to obtain a target image corresponding to the initial image based on each sampling pixel point in the initial sampling image and the sampling pixel values corresponding to each sampling pixel point.
A calculating module 810 for calculating a separation distance between the target object and the target device based on the target image;
an adjustment module 812, configured to determine an adjustment distance of the target object based on the separation distance; the adjustment distance is used for prompting the target object to adjust the position.
In one embodiment, the acquisition module 802 is further configured to: acquiring sampling positions of sampling pixel points; based on the sampling position, four matching pixel points adjacent to the sampling pixel point are acquired from the initial image.
In one embodiment, the first determining module 804 is further configured to: dividing the matched pixel points into a first combination and a second combination; two matched pixel points in the first combination are adjacent left and right, and two matched pixel points in the second combination are adjacent left and right; determining a first sampling pixel point based on the two matching pixel points and the sampling pixel point in the first combination; a second sampled pixel point is determined based on the two matched pixel points and the sampled pixel point in the second combination.
In one embodiment, the first determining module 804 is further configured to: acquiring a first matching coordinate of a first matching pixel point and a second matching coordinate of a second matching pixel point; the first matching coordinate and the second matching coordinate comprise coordinate values of the first coordinate and coordinate values of the second coordinate; determining a first sampling coordinate of the first sampling pixel point based on the coordinate value of the second coordinate in the first matching coordinate and the coordinate value of the first coordinate in the sampling coordinate of the sampling pixel point; determining a first weight of the first matched pixel point and a second weight of the second matched pixel point based on the first matched coordinate, the second matched coordinate and the first sampling coordinate; and fusing the first pixel value of the first matched pixel point and the second pixel value of the second matched pixel point based on the first weight value and the second weight value to obtain a first sampling pixel value of the first sampling pixel point.
In one embodiment, the first determining module 804 is further configured to: determining a distance between the first matching sampling point and the second matching sampling point based on the first matching coordinate and the second matching coordinate; determining a first distance between the first matching sampling point and the first sampling pixel point based on the first matching coordinate and the first sampling coordinate, and determining a first weight of the first matching pixel point based on the first distance and the distance; a second distance between the second matched sampling point and the first sampled pixel point is determined based on the second matched coordinates and the first sampled coordinates, and a second weight of the second matched pixel point is determined based on the second distance and the distance.
In one embodiment, the computing module 810 is further to: determining an avatar area in the target image; based on the head portrait region in the target image, a positioning model algorithm is used to determine the separation distance between the target object and the target device.
In one embodiment, the computing module 810 is further to: performing uniform brightness treatment on the target image to obtain a brightness image; denoising the brightness image to obtain a denoised image; performing edge detection processing on the denoising image to obtain an edge image; based on the edge image, a separation distance between the target object and the target device is calculated.
The respective modules in the adjustment distance determination device described above may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a terminal, and the internal structure thereof may be as shown in fig. 9. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input device. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The input/output interface of the computer device is used to exchange information between the processor and an external device. The communication interface of the computer device is used for wired or wireless communication with an external terminal, and the wireless mode can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program, when executed by the processor, implements an adjustment distance determining method. The display unit of the computer device is used for forming a visual picture, and can be a display screen, a projection device or a virtual reality imaging device. The display screen can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer device can be a touch layer covering the display screen, can also be a key, a track ball or a touch pad arranged on the housing of the computer device, and can also be an external keyboard, touch pad or mouse and the like.
It will be appreciated by persons skilled in the art that the architecture shown in fig. 9 is merely a block diagram of some of the architecture relevant to the present inventive arrangements and is not limiting as to the computer device to which the present inventive arrangements are applicable, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
The user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or sufficiently authorized by each party.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, high density embedded nonvolatile Memory, resistive random access Memory (ReRAM), magnetic random access Memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric Memory (Ferroelectric Random Access Memory, FRAM), phase change Memory (Phase Change Memory, PCM), graphene Memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can be in the form of a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), and the like. The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processor referred to in the embodiments provided in the present application may be a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, a data processing logic unit based on quantum computing, or the like, but is not limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing embodiments illustrate only a few implementations of the application, and their description is relatively detailed, but this should not be construed as limiting the scope of the application. It should be noted that several variations and modifications may be made by those skilled in the art without departing from the concept of the application, and these all fall within the protection scope of the application. Accordingly, the protection scope of the application should be determined by the appended claims.

Claims (10)

1. An adjustment distance determining method, the method comprising:
acquiring an initial sampling image, and, for each sampling pixel point in the initial sampling image, acquiring a matching pixel point corresponding to the sampling pixel point from an initial image; the initial image is obtained by photographing a target object with a target device;
determining a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point based on relative positions between the matching pixel point and the sampling pixel point;
determining a sampling pixel value of the sampling pixel point based on the first sampling pixel point and the second sampling pixel point;
obtaining a target image corresponding to the initial image based on each sampling pixel point in the initial sampling image and the sampling pixel value corresponding to each sampling pixel point;
calculating a separation distance between the target object and the target device based on the target image;
and determining an adjustment distance of the target object based on the separation distance; the adjustment distance is used for prompting the target object to adjust its position.
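The final step of claim 1 — turning the measured separation into a positioning prompt — can be sketched as follows. The target distance and the sign convention are illustrative assumptions, not values taken from the patent:

```python
# Minimal sketch of the adjustment-distance step of claim 1.
# TARGET_CM is a hypothetical desired subject-to-camera separation.

TARGET_CM = 50.0  # assumed desired separation, in centimeters

def adjustment_distance(separation_cm, target_cm=TARGET_CM):
    """Signed distance used to prompt the subject: a positive value means
    the subject should move away from the device, a negative value means
    the subject should move closer."""
    return target_cm - separation_cm
```

For example, a subject measured at 35 cm would be prompted to move 15 cm further from the device.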
2. The method of claim 1, wherein the acquiring the matching pixel point corresponding to the sampling pixel point from the initial image comprises:
acquiring a sampling position of the sampling pixel point;
and acquiring four matching pixel points adjacent to the sampling pixel point from the initial image based on the sampling position.
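One plausible reading of claim 2 in code: map a fractional sampling position onto the four integer pixel coordinates that surround it. The floor-based coordinate convention (x rightward, y downward) is an assumption, not something the claim specifies:

```python
import math

def four_matching_pixels(x, y):
    """Return the four pixel coordinates adjacent to a fractional sampling
    position (x, y), in the order top-left, top-right, bottom-left,
    bottom-right. The floor-based convention is assumed."""
    x0, y0 = math.floor(x), math.floor(y)
    return [(x0, y0), (x0 + 1, y0), (x0, y0 + 1), (x0 + 1, y0 + 1)]
```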
3. The method of claim 1, wherein the determining the first sampling pixel point and the second sampling pixel point corresponding to the sampling pixel point based on the relative positions between the matching pixel points and the sampling pixel point comprises:
dividing the matching pixel points into a first combination and a second combination, wherein the two matching pixel points in the first combination are horizontally adjacent, and the two matching pixel points in the second combination are horizontally adjacent;
determining the first sampling pixel point based on the two matching pixel points in the first combination and the sampling pixel point;
and determining the second sampling pixel point based on the two matching pixel points in the second combination and the sampling pixel point.
4. The method of claim 3, wherein the two matching pixel points in the first combination are a first matching pixel point and a second matching pixel point, respectively, and the determining the first sampling pixel point based on the two matching pixel points in the first combination and the sampling pixel point comprises:
acquiring a first matching coordinate of the first matching pixel point and a second matching coordinate of the second matching pixel point, wherein the first matching coordinate and the second matching coordinate each comprise a coordinate value of a first coordinate and a coordinate value of a second coordinate;
determining a first sampling coordinate of the first sampling pixel point based on the coordinate value of the second coordinate in the first matching coordinate and the coordinate value of the first coordinate in the sampling coordinate of the sampling pixel point;
determining a first weight of the first matching pixel point and a second weight of the second matching pixel point based on the first matching coordinate, the second matching coordinate and the first sampling coordinate;
and fusing a first pixel value of the first matching pixel point and a second pixel value of the second matching pixel point based on the first weight and the second weight to obtain a first sampling pixel value of the first sampling pixel point.
5. The method of claim 4, wherein the determining the first weight of the first matching pixel point and the second weight of the second matching pixel point based on the first matching coordinate, the second matching coordinate and the first sampling coordinate comprises:
determining a distance between the first matching pixel point and the second matching pixel point based on the first matching coordinate and the second matching coordinate;
determining a first distance between the first matching pixel point and the first sampling pixel point based on the first matching coordinate and the first sampling coordinate, and determining the first weight of the first matching pixel point based on the first distance and the distance;
and determining a second distance between the second matching pixel point and the first sampling pixel point based on the second matching coordinate and the first sampling coordinate, and determining the second weight of the second matching pixel point based on the second distance and the distance.
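Claims 3–5 taken together describe what is, in effect, bilinear interpolation: interpolate within each horizontally adjacent pair of matching pixels to obtain two intermediate sampling points, weighting each matching pixel by the normalized distance from the sampling position to the other pixel of the pair, then fuse the two intermediate values vertically. A sketch under those assumptions; row-major `img[y][x]` indexing is assumed, not specified by the claims:

```python
import math

def lerp_pair(x_left, x_right, v_left, v_right, x):
    """Claims 4-5: each matching pixel's weight is the normalized distance
    from the sampling position to the *other* matching pixel of the pair."""
    span = x_right - x_left
    w_left = (x_right - x) / span    # first weight
    w_right = (x - x_left) / span    # second weight
    return w_left * v_left + w_right * v_right

def bilinear_sample(img, x, y):
    """Claim 3's structure: interpolate within the two horizontal pairs,
    then fuse the two intermediate sampling values along the vertical axis."""
    x0, y0 = math.floor(x), math.floor(y)
    top = lerp_pair(x0, x0 + 1, img[y0][x0], img[y0][x0 + 1], x)
    bottom = lerp_pair(x0, x0 + 1, img[y0 + 1][x0], img[y0 + 1][x0 + 1], x)
    return lerp_pair(y0, y0 + 1, top, bottom, y)  # vertical fusion
```

Sampling midway between rows of value 0 and 10 yields 5.0, as expected of a linear blend.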
6. The method of claim 1, wherein the calculating a separation distance between the target object and the target device based on the target image comprises:
determining a head portrait area in the target image;
and determining the separation distance between the target object and the target device by using a positioning model algorithm based on the head portrait area in the target image.
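Claim 6 does not specify the "positioning model algorithm". One common choice for mapping a detected head-region size to a separation distance is the pinhole-camera model; the focal length and average head width below are illustrative constants, not values from the patent:

```python
def separation_from_head_region(head_width_px, focal_px=600.0, head_width_cm=16.0):
    """Pinhole-camera sketch: distance = focal_length * real_width / pixel_width.
    focal_px (camera focal length in pixels) and head_width_cm (assumed real
    head width) are hypothetical calibration constants."""
    return focal_px * head_width_cm / head_width_px
```

With these constants, a head region 160 pixels wide corresponds to a separation of 60 cm; a smaller region in the image implies a larger separation.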
7. The method of claim 1, wherein the calculating a separation distance between the target object and the target device based on the target image comprises:
performing brightness equalization processing on the target image to obtain a brightness image;
denoising the brightness image to obtain a denoised image;
performing edge detection processing on the denoised image to obtain an edge image;
and calculating the separation distance between the target object and the target device based on the edge image.
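Claim 7 fixes only the order of operations, not the concrete operators. A composition sketch; the operators are passed in as parameters precisely because the claim does not name them (with OpenCV, plausible instantiations would be `cv2.equalizeHist`, `cv2.GaussianBlur`, and `cv2.Canny`):

```python
def preprocess_for_distance(target_image, equalize, denoise, detect_edges):
    """Claim 7's pipeline as a composition: brightness equalization, then
    denoising, then edge detection. The resulting edge image is what the
    downstream separation-distance calculation consumes."""
    brightness_image = equalize(target_image)
    denoised_image = denoise(brightness_image)
    edge_image = detect_edges(denoised_image)
    return edge_image
```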
8. An adjustment distance determination device, the device comprising:
the acquisition module is used for acquiring an initial sampling image and, for each sampling pixel point in the initial sampling image, acquiring a matching pixel point corresponding to the sampling pixel point from an initial image; the initial image is obtained by photographing a target object with a target device;
the first determining module is used for determining a first sampling pixel point and a second sampling pixel point corresponding to the sampling pixel point based on the relative positions between the matching pixel point and the sampling pixel point;
a second determining module, configured to determine a sampling pixel value of the sampling pixel point based on the first sampling pixel point and the second sampling pixel point;
the generation module is used for obtaining a target image corresponding to the initial image based on each sampling pixel point in the initial sampling image and the sampling pixel value corresponding to each sampling pixel point;
a calculation module for calculating a separation distance between the target object and the target device based on the target image;
the adjustment module is used for determining an adjustment distance of the target object based on the separation distance; the adjustment distance is used for prompting the target object to adjust its position.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 7.
10. A computer-readable storage medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
CN202310683129.6A 2023-06-09 2023-06-09 Adjustment distance determining method, device, computer equipment and storage medium Pending CN116883491A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310683129.6A CN116883491A (en) 2023-06-09 2023-06-09 Adjustment distance determining method, device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116883491A true CN116883491A (en) 2023-10-13

Family

ID=88259401

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310683129.6A Pending CN116883491A (en) 2023-06-09 2023-06-09 Adjustment distance determining method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116883491A (en)

Similar Documents

Publication Publication Date Title
CN110574025B (en) Convolution engine for merging interleaved channel data
CN107330439B (en) Method for determining posture of object in image, client and server
CN110574026B (en) Convolution engine, electronic device and method for processing data
CN110637297B (en) Convolution engine, data processing method and electronic equipment
WO2016054779A1 (en) Spatial pyramid pooling networks for image processing
US20160267349A1 (en) Methods and systems for generating enhanced images using multi-frame processing
US20200111234A1 (en) Dual-view angle image calibration method and apparatus, storage medium and electronic device
JP2016508652A (en) Determining object occlusion in image sequences
CN116580028B (en) Object surface defect detection method, device, equipment and storage medium
Yung et al. Efficient feature-based image registration by mapping sparsified surfaces
WO2023169281A1 (en) Image registration method and apparatus, storage medium, and electronic device
CN111325828B (en) Three-dimensional face acquisition method and device based on three-dimensional camera
CN110163095B (en) Loop detection method, loop detection device and terminal equipment
CN113963072A (en) Binocular camera calibration method and device, computer equipment and storage medium
CN117058022A (en) Depth image denoising method and device, computer equipment and storage medium
CN115564639A (en) Background blurring method and device, computer equipment and storage medium
CN116091998A (en) Image processing method, device, computer equipment and storage medium
CN110956131A (en) Single-target tracking method, device and system
CN115514887A (en) Control method and device for video acquisition, computer equipment and storage medium
Chen et al. Depth-guided deep filtering network for efficient single image bokeh rendering
CN116883491A (en) Adjustment distance determining method, device, computer equipment and storage medium
CN115063473A (en) Object height detection method and device, computer equipment and storage medium
CN108062741B (en) Binocular image processing method, imaging device and electronic equipment
KR20180012638A (en) Method and apparatus for detecting object in vision recognition with aggregate channel features
Bhatia et al. Accurate corner detection methods using two step approach

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination