CN113254910B - User convenient authentication method and device for unmanned vehicle authentication system - Google Patents


Info

Publication number
CN113254910B
CN113254910B (application CN202110765446.3A)
Authority
CN
China
Prior art keywords
user
image
authentication
unmanned vehicle
certificate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110765446.3A
Other languages
Chinese (zh)
Other versions
CN113254910A (en)
Inventor
赵增侠 (Zhao Zengxia)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neolithic Zhiye Anyang Intelligent Technology Co ltd
Original Assignee
Neolix Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neolix Technologies Co Ltd filed Critical Neolix Technologies Co Ltd
Priority to CN202110765446.3A
Publication of CN113254910A
Application granted
Publication of CN113254910B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The disclosure provides a user-convenient authentication method and device for an unmanned vehicle authentication system. The method is applied to an unmanned device, an autonomous driving device, or an unmanned vehicle and comprises the following steps: detecting users around the unmanned vehicle, and acquiring a first image of a user when at least one user is detected entering a preset range boundary around the unmanned vehicle; in response to an operation instruction issued by the user to the authentication system, acquiring a second image of the user, taking the second image as the user's authentication certificate, and sending the certificate to the authentication system for authentication, where authentication passes when the matching degree between the second image and a pre-stored authentication image exceeds a first threshold; otherwise authentication fails, the first image is retrieved, the matching degree between the first image and the second image is calculated, and if that matching degree exceeds a second threshold, the first image is used as the user's new authentication certificate and sent to the authentication system for user authentication. The method and device authenticate the user in place, without requiring the user to adjust their position, improve the efficiency of user authentication, and improve the user experience.

Description

User convenient authentication method and device for unmanned vehicle authentication system
Technical Field
The disclosure relates to the technical field of unmanned driving, in particular to a user convenient authentication method and device for an unmanned vehicle authentication system.
Background
Unmanned vehicles are also known as autonomous vehicles or driverless vehicles. With the development of unmanned-driving technology, the range of applications for unmanned vehicles is gradually expanding; distinguished by usage scenario, they include unmanned delivery vehicles, unmanned retail vehicles, unmanned cleaning vehicles, and so on.
At present, unmanned vehicles still fall short in terms of user experience. For example, when an unmanned retail vehicle verifies a user's identity after the user places an order, the low height of the vehicle means that a taller user paying by face scan after shopping often has to half-squat to bring the face fully into the camera's field of view before the payment system can complete the face-scan payment, which makes the payment experience unfriendly. Some current unmanned retail vehicles track the user's face with the help of a complicated structure, mounting the camera on a moving frame that follows and shoots the face; however, this approach requires modifying the external structure of the unmanned vehicle, increases its load, and raises its production cost.
Disclosure of Invention
In view of this, the embodiments of the present disclosure provide a user-convenient authentication method and device for an unmanned vehicle authentication system, so as to solve the problems in the prior art that users cannot be authenticated conveniently and the user experience is unfriendly.
In a first aspect of the embodiments of the present disclosure, a user-convenient authentication method for an unmanned vehicle authentication system is provided, including: detecting users around the unmanned vehicle, and acquiring a first image of a user when at least one user is detected entering a preset range boundary around the unmanned vehicle; in response to an operation instruction issued by the user to the authentication system, acquiring a second image of the user, taking the second image as the user's authentication certificate, and sending the authentication certificate to the authentication system for authentication, wherein an authentication image corresponding to the operating user is pre-stored in the authentication system, and authentication passes when the matching degree between the second image and the authentication image exceeds a first threshold; when the matching degree between the second image and the authentication image does not exceed the first threshold, authentication fails; and if authentication fails, retrieving the first image, calculating the matching degree between the first image and the second image, and if that matching degree exceeds a second threshold, taking the first image as the user's new authentication certificate and sending the new authentication certificate to the authentication system for user authentication, wherein the second threshold is less than or equal to the first threshold.
In a second aspect of the embodiments of the present disclosure, a user-convenient authentication apparatus for an unmanned vehicle authentication system is provided, including: a detection module configured to detect users around the unmanned vehicle and, when at least one user is detected entering a preset range boundary around the unmanned vehicle, acquire a first image of the user; an acquisition module configured to, in response to an operation instruction issued by the user to the authentication system, acquire a second image of the user, take the second image as the user's authentication certificate, and send the authentication certificate to the authentication system for authentication, wherein an authentication image corresponding to the operating user is pre-stored in the authentication system, authentication passes when the matching degree between the second image and the authentication image exceeds a first threshold, and authentication fails when the matching degree does not exceed the first threshold; and an authentication module configured to, if authentication fails, retrieve the first image and calculate the matching degree between the first image and the second image, and, if that matching degree exceeds a second threshold, take the first image as the user's new authentication certificate and send it to the authentication system for user authentication, wherein the second threshold is less than or equal to the first threshold.
In a third aspect of the embodiments of the present disclosure, an electronic device is provided, which includes a memory, a processor and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the method when executing the program.
In a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, which stores a computer program, which when executed by a processor, implements the steps of the above-mentioned method.
The embodiment of the present disclosure adopts at least one technical scheme that can achieve the following beneficial effects:
detecting users around the unmanned vehicle, and acquiring a first image of a user when at least one user is detected entering a preset range boundary around the unmanned vehicle; in response to an operation instruction issued by the user to the authentication system, acquiring a second image of the user, taking the second image as the user's authentication certificate, and sending the authentication certificate to the authentication system for authentication, wherein an authentication image corresponding to the operating user is pre-stored in the authentication system, and authentication passes when the matching degree between the second image and the authentication image exceeds a first threshold; when the matching degree does not exceed the first threshold, authentication fails; if authentication fails, the first image is retrieved and the matching degree between the first image and the second image is calculated; if that matching degree exceeds a second threshold, the first image is used as the user's new authentication certificate and sent to the authentication system for user authentication, the second threshold being less than or equal to the first threshold. In this way, the user can be authenticated conveniently, the user experience is improved, and the production cost of the unmanned vehicle is reduced.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is an overall architecture diagram related to an embodiment of the present disclosure in a practical application scenario;
fig. 2 is a schematic flow chart of a user convenience authentication method for an unmanned vehicle authentication system according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a user convenience authentication device for an unmanned vehicle authentication system according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to one skilled in the art that the present disclosure may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present disclosure with unnecessary detail.
As described above, although existing unmanned vehicles offer increasingly rich functions and a growing degree of intelligence, their body shape and structure still prevent them from fully meeting user needs in some application scenarios, and the user experience still has shortcomings. Taking an unmanned retail vehicle (also called an unmanned vending vehicle) as an example, it carries functions such as new-energy power, unmanned driving, 5G communication, user identification for door opening, unmanned retail, and face-scan payment, supports real-time positioning and remote scheduling, and can operate and sell goods unattended in complex outdoor environments.
In the related art, because the vehicle height of an unmanned retail vehicle is often relatively low, a taller user paying by face scan after shopping has to half-squat to bring the face fully into the camera's field of view before the payment system can complete the face-scan payment. Clearly, authenticating the user in this way requires the user to accommodate the position of the retail vehicle's camera, rather than the vehicle serving the user's needs. Although in the prior art the user's face can be captured by moving the camera, for example by mounting the camera on a moving frame through a complex structure so that it follows the face when the vehicle height cannot meet the needs of a user paying in a normal standing posture, this way of modifying the external structure of the unmanned retail vehicle greatly increases its load and raises its production cost.
The following describes a system architecture of the embodiments of the present disclosure in a practical application scenario with reference to the drawings. Fig. 1 is an overall architecture diagram related to an embodiment of the present disclosure in a practical application scenario. As shown in fig. 1, the system architecture under this scenario mainly includes the following:
the automatic vending machine comprises at least one unmanned retail vehicle 101, wherein a goods shelf is arranged in a carriage of the unmanned retail vehicle 101, goods are placed in the goods shelf, and a display screen is arranged right in front of one side of the carriage; the body of the unmanned retail vehicle 101 is provided with a plurality of cameras in different directions and angles. There are a plurality of users 102 (users who potentially use the unmanned retail vehicle) around the unmanned retail vehicle 101, each user 102 has a certain distance from the unmanned retail vehicle 101, and when one or more users 102 among the users 102 get closer to the unmanned retail vehicle 101 and an operation instruction is given to the unmanned retail vehicle through a touch screen on the unmanned retail vehicle 101, the user 102 who is operating the unmanned retail vehicle can be considered as an operating user (a user who is using the unmanned retail vehicle).
In the embodiment of the disclosure, users within a certain range of the unmanned retail vehicle are detected; when a sensor senses that a user has approached the vehicle to within a preset distance, a plurality of user images are captured continuously by a plurality of cameras with different orientations and angles, where the user images contain at least the user's face image. After an operating user (i.e., a user currently operating the vehicle) is determined, an operating-user image is acquired, which contains at least some human-body features of the operating user. Corresponding human-body features are extracted from the user image and the operating-user image, the features of the two subjects (the user and the operating user) are compared to determine their similarity, and when the similarity between the operating user's features and a certain user's features reaches a preset threshold, that user's face image is used as the certificate for the operating user's operation. The embodiment requires no complicated modification of the existing unmanned retail vehicle structure, authenticates the user's operation conveniently and quickly, improves the user experience, and reduces the production cost of the unmanned retail vehicle.
It should be noted that the above practical application scenario is only one example. The embodiments of the present disclosure are not limited to user authentication in a face-scan payment scenario of an unmanned retail vehicle; other scenarios are also applicable. For example, when an unmanned delivery vehicle delivers an order, the user's authority and identity may be authenticated by face scan, and after the user passes authentication, the shelf holding the package is automatically opened.
The embodiments of the present disclosure are explained in detail below.
Fig. 2 is a schematic flow chart of a user convenient authentication method for an unmanned vehicle authentication system according to an embodiment of the present disclosure. The user convenience authentication method for the unmanned vehicle authentication system of fig. 2 may be performed by an electronic device in an autonomous driving system. As shown in fig. 2, the user convenient authentication method may specifically include:
s201, detecting users around the unmanned vehicle, and acquiring a first image of the user when detecting that at least one user enters a preset range boundary around the unmanned vehicle;
s202, responding to an operation instruction of a user to the authentication system, acquiring a second image of the user, taking the second image as an authentication certificate of the user, sending the authentication certificate to the authentication system for authentication, wherein the authentication certificate is prestored in the authentication system and corresponds to the operation user, and when the matching degree of the second image and the authentication image exceeds a first threshold value, the authentication is passed; when the matching degree of the second image and the authentication image does not exceed the first threshold value, the authentication is not passed;
s203, if the authentication is not passed, calling the first image, calculating the matching degree of the first image and the second image, if the matching degree exceeds a second threshold value, taking the first image as a new authentication certificate of the user, and sending the new authentication certificate to an authentication system for user authentication, wherein the second threshold value is smaller than or equal to the first threshold value.
Specifically, the unmanned vehicles include unmanned retail vehicles, unmanned delivery vehicles, and the like; the following embodiments of the present disclosure take the face-scan payment scenario of a user of an unmanned retail vehicle as an example. The embodiment of the present disclosure does not specifically limit the structure of the unmanned retail vehicle, and existing unmanned retail vehicles can be applied to the solution. In the embodiment of the present disclosure, the number of users and operating users is not limited to one, and the embodiment is also applicable in scenarios with multiple operating users.
Further, when detecting users near the unmanned vehicle, the distance between a user and the unmanned vehicle may be detected by a sensing device (e.g., a sensor) mounted on the unmanned vehicle, and when the sensed distance to the user falls within a certain range, a user image (i.e., a first image) is acquired by the camera. Since the embodiment of the present disclosure is directed at the face-scan payment scenario of an unmanned retail vehicle, an operation instruction directed at the unmanned retail vehicle can be regarded as a face-payment operation instruction. The human-body features extracted from the user image and the operating-user image include, but are not limited to: the neck, chin, nose, eyes, forehead, hair, ears, shoulders, and so on.
It should be noted that the authentication system in the embodiment of the present disclosure may be an independent third-party system, may be a subsystem embedded in the unmanned vehicle shopping system, or may be a system virtualized by a part of functional units in the unmanned vehicle system. The authentication system is connected with the shopping system of the unmanned vehicle through a port, so that interaction between the authentication system and the internal system of the unmanned vehicle is realized.
According to the technical solution provided by the embodiment of the present disclosure, users within a preset range around the unmanned vehicle are detected, and when at least one user is detected, a first image of the user is acquired; in response to an operation instruction directed at the unmanned vehicle, the currently operating user is determined and a second image of the operating user is acquired, the second image containing at least some human-body features of the operating user; feature extraction is performed on the first and second images, the human-body features corresponding to the second image are compared with those corresponding to the first image, the first image whose comparison result meets a preset requirement is used as the operating user's certificate, and the operating user's operation is authenticated based on that certificate. The user can thus be authenticated conveniently and quickly, the user experience is improved, and the production cost of the unmanned vehicle is reduced.
In some embodiments, detecting users around the unmanned vehicle and, when at least one user is detected entering a preset range boundary around the unmanned vehicle, acquiring a first image of the user includes: during the process in which a user approaches the unmanned vehicle from far to near, detecting users around the unmanned vehicle using a sensing device mounted on the unmanned vehicle; when the distance between the user and the unmanned vehicle is detected to have reached a preset range, acquiring a plurality of image frames within a period of time using a first camera and taking the plurality of image frames as the first image of the user; wherein at least one image frame of the first image contains a face image of the user.
In particular, sensors may be utilized to sense users within the vicinity of the unmanned retail vehicle, such as radar sensors, sonic sensors, and the like. In practical application, a distance range (for example, a range of 2-3 meters from the unmanned retail vehicle) can be set, when the distance between the user and the unmanned retail vehicle is detected to be within the distance range, the first camera mounted on the unmanned retail vehicle is used for continuously shooting the user, image frames generated by continuously shooting the user for a period of time are acquired, and the continuous image frames are used as the first image of the user. It is to be understood that continuous shooting is not to be narrowly understood as shooting without a time interval, and in practical applications, a user may be shot with a certain time interval set according to the configuration of the camera and the light conditions of the external environment.
Here, the first camera may include a plurality of cameras, for example, the plurality of first cameras may be mounted on the unmanned retail vehicle at different orientations and angles to ensure that the full-face image of the user can be contained in at least one image frame of the continuously captured first images.
Further, in addition to acquiring a plurality of image frames within a period of time as the first image of the user, the number of the captured image frames may be set, and a preset number of image frames are acquired by using the first camera, and the plurality of image frames are used as the first image of the user, for example: when a user enters a preset distance range of the unmanned retail vehicle, continuously shooting 20 image frames as a first image; the first image may be an image set including a plurality of image frames, or may be a single image, and specifically, one image frame (for example, a full-face image) including the most facial features of the user may be selected as the first image of the user from among a plurality of image frames continuously captured.
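As an illustration of the proximity-triggered burst and the selection of a single best frame described above, the following sketch assumes a distance reading, a frame grabber, and a count_face_landmarks helper are available; all three are placeholders rather than parts of the disclosed system.

```python
from typing import Callable, List, Optional

import numpy as np


def capture_first_image(read_distance_m: Callable[[], float],
                        grab_frame: Callable[[], np.ndarray],
                        count_face_landmarks: Callable[[np.ndarray], int],
                        near_m: float = 2.0,
                        far_m: float = 3.0,
                        burst_size: int = 20) -> Optional[np.ndarray]:
    """Return the frame with the most detected facial landmarks, or None if no face is seen."""
    distance = read_distance_m()
    if not (near_m <= distance <= far_m):
        return None  # the user has not entered the preset range yet

    frames: List[np.ndarray] = [grab_frame() for _ in range(burst_size)]
    best = max(frames, key=count_face_landmarks)
    return best if count_face_landmarks(best) > 0 else None
```

Selecting the frame richest in facial landmarks is one way to realize "choosing the image frame containing the most facial features"; the 2-3 meter range and the burst size of 20 mirror the example values given in the text.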
After the first image of the user is obtained, the first image can be stored in a memory corresponding to an automatic driving system of the unmanned retail vehicle, and the first image can also be sent to a background server and stored in a database of the background server, so that the user can be authenticated based on the first image in the following process.
In some embodiments, since the first image is captured while the user is still at a certain distance from the unmanned vehicle, the user's full-face features can be collected in the first image, so the amount of human-body feature information corresponding to the first image exceeds that corresponding to the second image.
According to the technical solution provided by the embodiment of the present disclosure, while the user approaches the unmanned vehicle from far to near, the first camera starts shooting once the user enters the preset range, and the second image is acquired when the user operates the unmanned vehicle for face-scan payment. Because the first image contains the user's full-face image, the system can obtain an image that can pass authentication without any extra action by the user. In addition, a plurality of user images are captured continuously by multiple cameras with different orientations and angles on the unmanned retail vehicle, the user images containing at least the user's full-face image, so that the face images of potential operating users are stored in advance for subsequent authentication. Since the user's face image can be collected while the user is still some distance from the unmanned retail vehicle, the efficiency of subsequent user authentication is improved and the user's waiting time is reduced.
In some embodiments, acquiring a second image of the user in response to the user's operation instruction to the authentication system includes: in response to the user's operation instruction to the authentication system, photographing the user with a second camera mounted on the unmanned vehicle; the operation instruction includes a face authentication operation instruction.
Specifically, since the embodiment of the present disclosure collects the user image while the user is still within a certain range of the unmanned retail vehicle, the user who actually performs the face-scan payment is not necessarily the same person whose image was collected earlier; therefore, the currently operating user, that is, the user performing the face-scan payment, needs to be determined.
Further, before the operating user performs the face-scan payment, other operations are required, such as selecting goods, clicking to place an order, and choosing face-scan payment. Because a user may ultimately decide not to purchase, to avoid generating unnecessary data, the user who clicks to place an order and selects face-scan payment is taken as the operating user; in practical application, the operating user is determined by the second camera or the sensing device in response to the user's order-placing and face-scan-payment instruction.
In another embodiment of the present disclosure, there may be more than one person in front of the second camera of the unmanned retail vehicle when a user places an order and pays by face scan, for example two people when a friend accompanies the purchase. To determine which person is the actual operating user (the one actually performing the face-scan payment), the second camera or the sensing device may be used. Two ways of determining the operating user are described below with reference to specific embodiments:
when the second camera is used for determining the operation user, the front image can be collected through the second camera, the collected image is analyzed, the position and the occupied area of each user in the image are determined, and the operation user is determined according to the position and the occupied area of the user in the image.
When the operating user is determined by the sensing device, the sensing device can determine the user's position within the payment area of the unmanned retail vehicle (the area where a paying user stands) and the distance and angle between each user and the vehicle, so as to determine which user is most likely the operating user.
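A minimal sketch of the first approach (choosing the operating user from the camera image) is given below. It assumes face boxes have already been produced by some detector; preferring the largest, most centered box is one plausible heuristic and not a requirement of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class FaceBox:
    x: float      # left edge, normalized to [0, 1]
    y: float      # top edge, normalized to [0, 1]
    w: float      # width, normalized
    h: float      # height, normalized
    user_id: int


def pick_operating_user(boxes: List[FaceBox]) -> Optional[int]:
    """Prefer the user whose face occupies the largest area near the image center."""
    if not boxes:
        return None

    def score(b: FaceBox) -> float:
        area = b.w * b.h
        cx = b.x + b.w / 2.0
        center_penalty = abs(cx - 0.5)          # 0 when centered, 0.5 at the edge
        return area - 0.2 * center_penalty      # the weight is an arbitrary illustration value

    return max(boxes, key=score).user_id
```

The sensing-device approach would replace the area and centering score with distance and angle readings, but the selection logic (score each candidate, keep the best) stays the same.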
In some embodiments, acquiring the second image of the operating user includes: taking the image frame captured by the second camera at the moment the operation instruction to the authentication system is responded to as the second image of the user; or taking the continuous image frames captured by the second camera after the operation instruction is responded to as the second image of the operating user; the second camera includes a camera mounted above the user operation interface.
Specifically, when the second image of the operating user is acquired after the operating user is determined, at least one of the following two ways may be adopted, specifically:
in a first mode
After the operating user issues a face-payment operation instruction to the unmanned retail vehicle, the autonomous driving system responds to the instruction, starts the second camera to capture an image frame, and takes that image frame as the second image of the operating user. Because the embodiment of the present disclosure does not require the user to accommodate the camera's position, even when the operating user is tall and the face sits slightly above the second camera, the captured second image may still contain some of the operating user's human-body features, such as the neck, chin, mouth, or nose tip. Therefore, to improve the efficiency of feature extraction and shorten the subsequent authentication time, the first image frame captured by the second camera may be used as the second image of the operating user for subsequent processing.
Mode two
The autonomous driving system responds to the payment operation instruction and starts the second camera to capture continuous image frames. In actual operation, the user's face may lie entirely outside the second camera's field of view, and the user may slightly adjust their height in order to complete the payment; since the face position changes dynamically during this adjustment, images containing partial human-body features can only be obtained by collecting continuous image frames.
It should be noted that, in mode two, acquiring continuous image frames with the second camera may be performed synchronously with the subsequent human-body feature comparison: each time an image frame is acquired it is compared with the stored first image, and acquisition stops once the matching degree between the human-body features in an acquired frame and those in the first image reaches the threshold.
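The synchronous acquire-and-compare loop of mode two could look like the following sketch; grab_frame, extract_features, and feature_match stand in for whatever camera driver and feature extractor the system actually uses and are assumptions of this illustration.

```python
from typing import Callable, Dict, Optional, Tuple

import numpy as np

Features = Dict[str, np.ndarray]  # e.g. {"neck": vec, "chin": vec, ...}


def acquire_until_match(grab_frame: Callable[[], np.ndarray],
                        extract_features: Callable[[np.ndarray], Features],
                        feature_match: Callable[[Features, Features], float],
                        first_image_features: Features,
                        second_threshold: float = 0.5,
                        max_frames: int = 50) -> Optional[Tuple[np.ndarray, float]]:
    """Grab frames one by one, compare each against the stored first image, stop on a match."""
    for _ in range(max_frames):
        frame = grab_frame()
        degree = feature_match(extract_features(frame), first_image_features)
        if degree >= second_threshold:
            return frame, degree  # acquisition stops as soon as the threshold is reached
    return None
```

Stopping acquisition at the first matching frame keeps the comparison cost bounded even while the user is still adjusting their posture.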
Further, since the operating user is determined from among the detected users (the operating users are a subset of the users), the image set B corresponding to the second images will be smaller than or equal to the image set A corresponding to the first images. In addition, the second image is not necessarily a face image; that is, its image frames do not necessarily contain facial features and may contain only body features such as the shoulders and neck.
In some embodiments, calculating the matching degree between the first image and the second image and, if it exceeds the second threshold, taking the first image as the user's new authentication certificate includes: sequentially comparing the human-body features corresponding to each image frame of the second image with the human-body features corresponding to each image frame of the first image, and determining the feature matching degree between them; and when the feature matching degree exceeds the second threshold, taking the first image that exceeds the second threshold as the user's new authentication certificate.
Specifically, human-body features are extracted in turn from each image frame (of both the first and second images), and the features corresponding to each frame of the second image are compared in turn with the features corresponding to each frame of the first image. For example: features of the neck and chin are extracted from image frame B1 of the second image, features of the hair, forehead, eyes, ears, nose, mouth, chin, neck, and so on are extracted from image frame A1 of the first image, and the neck and chin features of frame B1 are compared in turn with the corresponding features of frame A1.
Further, the similarity of human-body features between two image frames (i.e., the feature matching degree) is determined, and when this similarity reaches a fixed threshold, the image frame of the first image whose similarity with the image frame of the second image reaches the second threshold is used as the operating user's certificate. Continuing the above example, if the fixed threshold is set to 50%, then when the similarity between the neck and chin features of frame B1 and those of frame A1 reaches 50%, image frame A1 of the first image is taken as the operating user's face-payment certificate.
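As one possible reading of the per-part comparison above, the sketch below averages cosine similarities over the body parts that the two frames have in common (neck and chin in the B1/A1 example); the part names and the averaging rule are illustrative assumptions, not the patented matching algorithm.

```python
from typing import Dict

import numpy as np


def part_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity of one body-part feature vector, mapped to [0, 1]."""
    cos = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
    return (cos + 1.0) / 2.0


def feature_match(second_frame: Dict[str, np.ndarray],
                  first_frame: Dict[str, np.ndarray]) -> float:
    """Average similarity over the parts present in both frames (0.0 if nothing overlaps)."""
    shared = set(second_frame) & set(first_frame)   # e.g. {"neck", "chin"}
    if not shared:
        return 0.0
    return sum(part_similarity(second_frame[p], first_frame[p]) for p in shared) / len(shared)
```

With a fixed threshold of 0.5, frame A1 would be accepted as the face-payment certificate as soon as feature_match(B1, A1) >= 0.5, mirroring the 50% example above.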
In some embodiments, before sending the new authentication certificate to the authentication system for user authentication, the method further includes: sending the new authentication certificate to the display screen of the user operation interface for display, so that the user can confirm it. The authentication certificate includes a face-payment certificate, and sending the new authentication certificate to the authentication system for user authentication includes: verifying the user's face-payment certificate with an authorization license certificate pre-stored in the authentication system, and, after verification passes, completing the consumption payment based on the user's face-payment certificate.
Specifically, after the image used for the operating user's face payment (the image frame of the first image) is determined, that image frame is retrieved from the database and sent to the display screen of the user operation interface (which may be a touch screen integrated with the user operation) for display. In practical application, when the comparison result is sent to the display screen for the user to confirm, the value of the similarity between the first image and the second image can also be sent to the display screen for the user to confirm.
According to the technical solution provided by the embodiment of the present disclosure, the user's face image is acquired in advance, before the user pays by face scan at the unmanned retail vehicle; the user's image is collected again at the time of payment, and human-body feature similarity is compared between the two collections to determine the correspondence between the currently operating user and the previously stored user image, that is, to determine which stored user image belongs to the same person as the operating user. The comparison result is sent to the operating user for confirmation, which avoids recognition errors and reduces the risk of mistakes.
In some embodiments, the certificate includes a face-payment certificate, and authenticating the operating user's operation based on the operating user's certificate includes: verifying the operating user's face-payment certificate with a pre-stored authorization license certificate, and completing the consumption payment based on the operating user's face-payment certificate after verification passes.
Specifically, after the first image whose human-body feature similarity reaches the threshold is used as the face-payment certificate, the operating user's authorization license certificate is obtained. The authorization license certificate is used to verify the operating user's face-payment certificate, and it may be the user's human-body feature data collected in advance (such as facial feature data), or feature data collected when the user grants authorization during the actual business process.
Further, in the embodiment of the present disclosure, after the operating user's face-scan payment succeeds, the unmanned retail vehicle automatically conveys the goods from the shelf to the pickup port to complete dispensing; after the operating user takes the goods, the previously stored first image corresponding to the operating user is deleted, or the first images of all users may be deleted.
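The end of the flow (verifying the face-payment certificate against the pre-stored authorization license, dispensing, and then deleting the cached first images) might be orchestrated as in the sketch below; verify_against_license, dispense_goods, and the cache object are hypothetical names introduced for illustration only.

```python
from typing import Callable, Dict, List

import numpy as np


def settle_payment(user_id: int,
                   face_payment_credential: np.ndarray,
                   license_features: np.ndarray,
                   verify_against_license: Callable[[np.ndarray, np.ndarray], bool],
                   dispense_goods: Callable[[int], None],
                   first_image_cache: Dict[int, List[np.ndarray]]) -> bool:
    """Verify, pay, dispense, then drop the cached first images for this user."""
    if not verify_against_license(face_payment_credential, license_features):
        return False

    dispense_goods(user_id)                 # shelf-to-pickup-port action after a successful payment
    first_image_cache.pop(user_id, None)    # or first_image_cache.clear() to drop all users
    return True
```

Deleting the cached first images after pickup keeps the temporarily collected biometric data from accumulating beyond the single transaction it was captured for.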
According to the technical solution provided by the embodiment of the present disclosure, when a pedestrian is within a certain range of the unmanned retail vehicle, images are collected with several cameras, and these images can capture the user's full face; later, when the user pays by face scan, the user's image is collected again by a camera, and that image may contain only partial facial features. Human-body feature similarity between the earlier and later images is compared to identify the face image corresponding to the current operating user, and that face image is used as the face-scan payment certificate. Thus, even when the operating user is tall, their face image can be identified from the collected partial features; the user does not need to actively cooperate with the camera to have a face image collected, the user's operation can be authenticated quickly, and the user experience is friendlier.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 3 is a schematic structural diagram of a user convenience authentication device for an unmanned vehicle authentication system according to an embodiment of the present disclosure. As shown in fig. 3, the user convenience authentication apparatus for the unmanned vehicle authentication system includes:
the detection module 301 is configured to detect users around the unmanned vehicle, and when it is detected that at least one user enters a preset range boundary around the unmanned vehicle, obtain a first image of the user;
the obtaining module 302 is configured to obtain a second image of the user in response to an operation instruction of the user on the authentication system, use the second image as an authentication credential of the user, send the authentication credential to the authentication system for authentication, where the authentication credential is pre-stored with an authentication image corresponding to the operation user, and when a matching degree of the second image and the authentication image exceeds a first threshold, the authentication is passed; when the matching degree of the second image and the authentication image does not exceed the first threshold value, the authentication is not passed;
and the authentication module 303 is configured to, if authentication fails, retrieve the first image and calculate the matching degree between the first image and the second image, and, if the matching degree exceeds a second threshold, take the first image as the user's new authentication certificate and send the new authentication certificate to the authentication system for user authentication, where the second threshold is less than or equal to the first threshold.
In some embodiments, the detection module 301 of fig. 3 detects users around the unmanned vehicle by using a sensing device mounted on the unmanned vehicle while a user approaches the unmanned vehicle from far to near, and when the distance between the user and the unmanned vehicle is detected to have reached a preset range, acquires a plurality of image frames within a period of time using a first camera and takes the plurality of image frames as the first image of the user; wherein at least one image frame of the first image contains a face image of the user.
In some embodiments, the obtaining module 302 of fig. 3 photographs the user with a second camera mounted on the unmanned vehicle in response to the user's operation instruction to the authentication system; the operation instruction includes a face authentication operation instruction.
In some embodiments, the obtaining module 302 of fig. 3 will utilize the image frame captured by the second camera as the second image of the user when responding to the operation instruction for the authentication system; or, after responding to an operation instruction for the authentication system, using the continuous image frames acquired by the second camera as a second image of the user; the second camera comprises a camera arranged above the user operation interface.
In some embodiments, the authentication module 303 in fig. 3 sequentially compares the human body features corresponding to each image frame of the second image with the human body features corresponding to each image frame of the first image, and determines a feature matching degree between the human body features of the second image and the human body features of the first image; and when the feature matching degree exceeds a second threshold value, taking the first image exceeding the threshold value as a new authentication certificate of the user.
In some embodiments, the validation module 304 of fig. 3 is further configured to, before the new authentication certificate is sent to the authentication system for user authentication, send the new authentication certificate to the display screen of the user operation interface for display, so that the user can confirm it.
In some embodiments, the authentication credentials include face payment credentials, and the authentication module 303 in fig. 3 verifies the face payment credentials of the user by using authorization permission credentials pre-stored in the authentication system, and when the verification passes, performs consumption payment based on the face payment credentials of the user.
Fig. 4 is a schematic structural diagram of the electronic device 4 provided in the embodiment of the present disclosure. As shown in fig. 4, the electronic apparatus 4 of this embodiment includes: a processor 401, a memory 402 and a computer program 403 stored in the memory 402 and executable on the processor 401. The steps in the various method embodiments described above are implemented when the processor 401 executes the computer program 403. Alternatively, the processor 401 implements the functions of the respective modules/units in the above-described respective apparatus embodiments when executing the computer program 403.
Illustratively, the computer program 403 may be partitioned into one or more modules/units, which are stored in the memory 402 and executed by the processor 401 to accomplish the present disclosure. One or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 403 in the electronic device 4.
The electronic device 4 may be a desktop computer, a notebook, a palm computer, a cloud server, or other electronic devices. The electronic device 4 may include, but is not limited to, a processor 401 and a memory 402. Those skilled in the art will appreciate that fig. 4 is merely an example of the electronic device 4, and does not constitute a limitation of the electronic device 4, and may include more or less components than those shown, or combine certain components, or different components, e.g., the electronic device may also include input-output devices, network access devices, buses, etc.
The Processor 401 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 402 may be an internal storage unit of the electronic device 4, for example, a hard disk or a memory of the electronic device 4. The memory 402 may also be an external storage device of the electronic device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the electronic device 4. Further, the memory 402 may also include both internal storage units of the electronic device 4 and external storage devices. The memory 402 is used for storing computer programs and other programs and data required by the electronic device. The memory 402 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus/computer device and method may be implemented in other ways. For example, the above-described apparatus/computer device embodiments are merely illustrative, and for example, a division of modules or units, a division of logical functions only, an additional division may be made in actual implementation, multiple units or components may be combined or integrated with another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, the present disclosure may implement all or part of the flow of the method in the above embodiments, and may also be implemented by a computer program instructing related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of the above methods and embodiments may be implemented. The computer program may comprise computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer readable medium may be appropriately increased or decreased as required by legislation and patent practice in the jurisdiction; for example, in some jurisdictions, computer readable media may not include electrical carrier signals or telecommunications signals in accordance with legislation and patent practice.
The above examples are only intended to illustrate the technical solutions of the present disclosure, not to limit them; although the present disclosure has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present disclosure, and are intended to be included within the scope of the present disclosure.

Claims (10)

1. A user convenient authentication method for an unmanned vehicle authentication system is characterized by comprising the following steps:
detecting users around the unmanned vehicle, and acquiring a first image of a user when at least one user is detected to enter a preset range boundary around the unmanned vehicle;
responding to an operation instruction of a user to an authentication system, acquiring a second image of the user, taking the second image as an authentication certificate of the user, sending the authentication certificate to the authentication system for authentication, wherein the authentication system stores an authentication image corresponding to the operation user in advance, and when the matching degree of the second image and the authentication image exceeds a first threshold value, the authentication is passed; when the matching degree of the second image and the authentication image does not exceed a first threshold value, the authentication is not passed;
if the authentication is not passed, the first image is called, the matching degree of the first image and the second image is calculated, if the matching degree exceeds a second threshold value, the first image is used as a new authentication certificate of the user, and the new authentication certificate is sent to an authentication system for user authentication, wherein the second threshold value is smaller than or equal to the first threshold value.
2. The method according to claim 1, wherein the amount of human characteristic information corresponding to the first image exceeds the amount of human characteristic information corresponding to the second image.
3. The method of claim 1, wherein the detecting users around the unmanned vehicle, and when it is detected that at least one user enters a preset range boundary around the unmanned vehicle, acquiring a first image of the user comprises:
in the process that a user approaches the unmanned vehicle from far to near, detecting users around the unmanned vehicle by using a sensing device arranged on the unmanned vehicle, and when it is detected that the distance between the user and the unmanned vehicle reaches a preset range, acquiring a plurality of image frames within a period of time by using a first camera and taking the plurality of image frames as first images of the user;
wherein at least one image frame of the first image comprises a face image of the user.
4. The method of claim 2, wherein the obtaining a second image of the user in response to an operation instruction of the authentication system by the user comprises:
responding to an operation instruction of a user to an authentication system, and shooting the user by using a second camera installed on the unmanned vehicle;
the operation instruction comprises a face authentication operation instruction.
5. The method of claim 4, wherein said obtaining a second image of the user comprises:
taking the image frame acquired by the second camera at the moment of responding to the operation instruction for the authentication system as the second image of the user; or,
using the continuous image frames acquired by the second camera after responding to the operation instruction of the authentication system as a second image of the user;
wherein the second camera comprises a camera mounted above the user interface.
6. The method of claim 1, wherein the calculating the matching degree between the first image and the second image, and if the matching degree exceeds a second threshold, taking the first image as a new authentication credential of the user comprises:
sequentially comparing the human body features corresponding to each image frame of the second image with the human body features corresponding to each image frame of the first image, and determining the feature matching degree between the human body features of the second image and the human body features of the first image;
and when the feature matching degree exceeds the second threshold, taking the first image whose feature matching degree exceeds the second threshold as the new authentication certificate of the user.
7. The method of claim 1, before the sending the new authentication certificate to an authentication system for user authentication, further comprising:
sending the new authentication certificate to a display screen of a user operation interface for display, so that the user can confirm the new authentication certificate;
wherein the authentication certificate comprises a face payment certificate, and the sending the new authentication certificate to an authentication system for user authentication comprises:
verifying the face payment certificate of the user by using an authorization license certificate pre-stored in the authentication system, and performing consumption payment based on the face payment certificate of the user after the verification is passed.
8. A user convenience authentication device for an unmanned vehicle authentication system, comprising:
the detection module is configured to detect users around the unmanned vehicle and, when it is detected that at least one user enters a preset range boundary around the unmanned vehicle, acquire a first image of the user;
the acquisition module is configured to, in response to an operation instruction of a user to an authentication system, acquire a second image of the user, take the second image as an authentication certificate of the user, and send the authentication certificate to the authentication system for authentication, wherein the authentication system stores in advance an authentication image corresponding to the operating user, and when the matching degree of the second image and the authentication image exceeds a first threshold, the authentication is passed; when the matching degree of the second image and the authentication image does not exceed the first threshold, the authentication is not passed;
and the authentication module is configured to, if the authentication is not passed, retrieve the first image and calculate the matching degree of the first image and the second image, and if the matching degree exceeds a second threshold, take the first image as a new authentication certificate of the user and send the new authentication certificate to the authentication system for user authentication, wherein the second threshold is less than or equal to the first threshold.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method of any one of claims 1 to 7 when executing the program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
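
The capture step recited in claim 3 amounts to a trigger-and-record loop: wait until the sensing device reports a user inside the preset range boundary, then record several frames from the first camera over a short window. The following Python sketch is illustrative only; ProximitySensor, Camera, and the numeric constants are hypothetical stand-ins and are not defined by the patent.

import time
from typing import List, Protocol

PRESET_RANGE_M = 3.0      # assumed boundary of the preset range, in metres
CAPTURE_WINDOW_S = 2.0    # assumed length of the capture window
FRAME_INTERVAL_S = 0.2    # assumed interval between captured frames


class ProximitySensor(Protocol):
    def distance_to_nearest_user(self) -> float: ...


class Camera(Protocol):
    def capture_frame(self) -> bytes: ...


def capture_first_image(sensor: ProximitySensor, first_camera: Camera) -> List[bytes]:
    # Wait until a user comes within the preset range boundary, then record
    # a plurality of image frames over a period of time as the first image.
    while sensor.distance_to_nearest_user() > PRESET_RANGE_M:
        time.sleep(0.05)
    frames: List[bytes] = []
    deadline = time.monotonic() + CAPTURE_WINDOW_S
    while time.monotonic() < deadline:
        frames.append(first_camera.capture_frame())
        time.sleep(FRAME_INTERVAL_S)
    return frames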
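
Claims 1 and 6 describe a two-stage matching flow: the second image is first checked against the pre-stored authentication image using the stricter first threshold, and only on failure is the first image retrieved, compared frame by frame with the second image against the looser second threshold, and resubmitted as the new authentication certificate. The Python sketch below models each frame as a pre-extracted feature vector and uses cosine similarity as the matching degree; the metric, the threshold values, and the final re-check of the new certificate against the enrolled image are assumptions made for illustration, not details fixed by the claims.

from typing import List, Optional, Tuple
import numpy as np

FIRST_THRESHOLD = 0.80   # assumed value of the first (stricter) threshold
SECOND_THRESHOLD = 0.70  # assumed value; must be <= FIRST_THRESHOLD


def match_degree(a: np.ndarray, b: np.ndarray) -> float:
    # Matching degree between two face-feature vectors, modelled here as
    # cosine similarity (an assumption; the claims do not fix the metric).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def best_first_image_frame(first_frames: List[np.ndarray],
                           second_frames: List[np.ndarray]
                           ) -> Tuple[Optional[np.ndarray], float]:
    # Claim 6: compare the features of each frame of the second image with
    # the features of each frame of the first image and return the
    # best-matching first-image frame together with its matching degree.
    best_frame, best_score = None, -1.0
    for f2 in second_frames:
        for f1 in first_frames:
            score = match_degree(f1, f2)
            if score > best_score:
                best_frame, best_score = f1, score
    return best_frame, best_score


def authenticate(second_frames: List[np.ndarray],
                 first_frames: List[np.ndarray],
                 enrolled: np.ndarray) -> bool:
    # Claim 1, stage 1: match the second image (captured when the user
    # operates the authentication system) against the pre-stored
    # authentication image.
    if any(match_degree(f2, enrolled) > FIRST_THRESHOLD for f2 in second_frames):
        return True
    # Claim 1, stage 2: authentication failed, so retrieve the first image
    # (captured when the user entered the preset range boundary), compare it
    # with the second image, and if the matching degree exceeds the second
    # threshold, resubmit the first image as the new authentication
    # certificate. Re-checking it against the enrolled image is one possible
    # interpretation of "sending it to the authentication system".
    candidate, score = best_first_image_frame(first_frames, second_frames)
    if candidate is not None and score > SECOND_THRESHOLD:
        return match_degree(candidate, enrolled) > FIRST_THRESHOLD
    return False

One plausible reason the second threshold may be set no greater than the first, consistent with claim 2, is that the first image captured at a distance carries more human body feature information, so the fallback cross-image comparison can be more tolerant without weakening the final check against the enrolled authentication image.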
CN202110765446.3A 2021-07-07 2021-07-07 User convenient authentication method and device for unmanned vehicle authentication system Active CN113254910B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110765446.3A CN113254910B (en) 2021-07-07 2021-07-07 User convenient authentication method and device for unmanned vehicle authentication system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110765446.3A CN113254910B (en) 2021-07-07 2021-07-07 User convenient authentication method and device for unmanned vehicle authentication system

Publications (2)

Publication Number Publication Date
CN113254910A CN113254910A (en) 2021-08-13
CN113254910B true CN113254910B (en) 2021-10-26

Family

ID=77190898

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110765446.3A Active CN113254910B (en) 2021-07-07 2021-07-07 User convenient authentication method and device for unmanned vehicle authentication system

Country Status (1)

Country Link
CN (1) CN113254910B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113781058A (en) * 2021-11-11 2021-12-10 新石器慧通(北京)科技有限公司 Payment method and device for unmanned vehicle, electronic equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10956544B1 (en) * 2016-04-01 2021-03-23 Massachusetts Mutual Life Insurance Company Access control through head imaging and biometric authentication
CN110245894A (en) * 2019-06-03 2019-09-17 杭州小伊智能科技有限公司 A kind of self-service machine based on recognition of face is got in stocks the method and device of picking
CN111339929B (en) * 2020-02-25 2023-05-16 浙江大华技术股份有限公司 Retail system of unmanned supermarket
CN112365255B (en) * 2020-10-28 2021-08-31 中标慧安信息技术股份有限公司 Non-inductive payment method and system for supermarket

Also Published As

Publication number Publication date
CN113254910A (en) 2021-08-13

Similar Documents

Publication Publication Date Title
US10339402B2 (en) Method and apparatus for liveness detection
CN109326058B (en) Identity verification method and device based on intelligent teller machine, terminal and readable medium
WO2020135096A1 (en) Method and device for determining operation based on facial expression groups, and electronic device
US9985963B2 (en) Method and system for authenticating liveness face, and computer program product thereof
US10127439B2 (en) Object recognition method and apparatus
WO2019091012A1 (en) Security check method based on facial recognition, application server, and computer readable storage medium
AU2017201463B2 (en) Methods and systems for authenticating users
CN104574599A (en) Authentication method and device, and intelligent door lock
CN104933344A (en) Mobile terminal user identity authentication device and method based on multiple biological feature modals
KR20200006987A (en) Access control method, access control device, system and storage medium
CN105243740A (en) Card safety identity authentication system and implementation method based on biometric feature identification technology
TW201629851A (en) A system and a method for image recognition
CN111402480A (en) Visitor information management method, device, system, equipment and storage medium
CN204791017U (en) Mobile terminal users authentication device based on many biological characteristics mode
CN107622246B (en) Face recognition method and related product
CN108108711B (en) Face control method, electronic device and storage medium
CN108335099A (en) Method, apparatus, mobile terminal and the storage medium of mobile payment
CN113657903A (en) Face-brushing payment method and device, electronic equipment and storage medium
CN110765851A (en) Registration method, device and equipment
CN113254910B (en) User convenient authentication method and device for unmanned vehicle authentication system
WO2019218905A1 (en) Object verification method, device and system
CN113409056A (en) Payment method and device, local identification equipment, face payment system and equipment
CN113033243A (en) Face recognition method, device and equipment
WO2020152917A1 (en) Face authentication device, face authentication method, program, and recording medium
US11036971B2 (en) Face recognition method and electronic device employing the method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221109

Address after: 455,000 No. 2 plant at the northeast corner of the intersection of Xinwa Road and Gong'an Road, Gaozhuang Town, Anyang City, Henan Province, urban-rural integration demonstration zone

Patentee after: Neolithic Zhiye (Anyang) Intelligent Technology Co.,Ltd.

Address before: 100176 room 613, 6 / F, area 2, building a, 12 Hongda North Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Patentee before: NEOLIX TECHNOLOGIES Co.,Ltd.
