CN117765656A - Control method and control system for gate of each ward of inpatient department - Google Patents


Info

Publication number
CN117765656A
CN117765656A (application CN202410191485.0A)
Authority
CN
China
Prior art keywords
gate
information
application
accompanying
processing module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410191485.0A
Other languages
Chinese (zh)
Inventor
舒启航
查文中
罗东东
Current Assignee
Sichuan Cancer Hospital
Original Assignee
Sichuan Cancer Hospital
Priority date
Filing date
Publication date
Application filed by Sichuan Cancer Hospital
Priority to CN202410191485.0A
Publication of CN117765656A
Pending legal status

Landscapes

  • Collating Specific Patterns (AREA)

Abstract

The application discloses a control method and a control system for the gates of the ward areas of an inpatient department. The control method comprises the following steps. Step 1: an information acquisition module collects an accompanying-care application for each patient, the application comprising a facial image of the attendant and the identity of the patient to be accompanied. Step 2: an information monitoring module receives the application collected by the information acquisition module, audits it, assigns it the corresponding permissions, and then sends the application and its permission information to a data processing module. Step 3: the data processing module stores the application and the corresponding permission information. The technical scheme provided by the application lets attendants pass through the gates quickly, reduces congestion near the gates, and keeps the movement of hospital personnel flowing normally.

Description

Control method and control system for the gates of each ward area of an inpatient department
Technical Field
The application relates to the technical field of electronic information, and in particular to a control method and a control system for the gates of the ward areas of an inpatient department.
Background
An inpatient department needs access management. Medical and other hospital staff have low turnover and carry uniform identity credentials used for access control of the inpatient department. Patients and their attendants, however, change frequently; to manage their access, hospitals issue accompanying-care passes, and attendants enter and leave the department with these passes.
At present, most hospitals manage attendants with paper passes, which nurses register, issue, and withdraw every day. Paper passes are easily lost, become contaminated through repeated handling, are easy to forge, are difficult to manage, and leave attendants untraceable.
Some hospitals have adopted electronic passes, making attendant management paperless and digital. In practice, however, an attendant must call up the electronic pass on a device such as a mobile phone. Since attendants are predominantly elderly, many struggle to retrieve the electronic pass proficiently from the device; they linger near the gate, causing serious congestion that disrupts the normal movement of hospital personnel.
Disclosure of Invention
The summary of the application is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. The summary of the application is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
As a first aspect of the present application, to solve the technical problem that elderly attendants need a long time to call up an electronic accompanying-care pass on a mobile phone, seriously congesting the gates at the entrances and exits of the inpatient department and disrupting the normal movement of hospital personnel, the application provides a control method for the gates of each ward area of an inpatient department, comprising the following steps:
Step 1: an information acquisition module collects an accompanying-care application for each patient, the application comprising a facial image of the attendant and the identity of the patient to be accompanied;
Step 2: an information monitoring module receives the application collected by the information acquisition module, audits it, assigns it the corresponding permissions, and then sends the application and its permission information to a data processing module;
Step 3: the data processing module stores the application and the corresponding permission information;
Step 4: an information processing module matches the facial image of each person entering or leaving, collected by a gate management module, against the facial images stored in the data processing module. If the captured facial image matches a stored facial image whose application has gate-access permission, a pass instruction is sent to the gate management module; otherwise a no-pass instruction is sent. The gate management module opens the gate on receiving a pass instruction and keeps it closed on receiving a no-pass instruction.
With this technical scheme, the information acquisition module collects each patient's application and thereby the facial image of each attendant; the application is granted the corresponding permissions and stored in the data processing module. When an attendant reaches the gate, the gate management module captures the attendant's facial data, and after comparison the information processing module decides whether the gate opens. The attendant therefore needs to carry and present nothing at all, which lowers the difficulty of operation, lets attendants pass the gate very quickly, reduces congestion near the gate, and keeps hospital traffic moving normally.
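The pass/no-pass decision of step 4 can be sketched as follows. This is a minimal illustration, not the patented implementation: the feature vectors, the cosine-similarity measure, and the threshold of 0.9 are all assumptions made for the example.

```python
import numpy as np

def cosine_similarity(a, b):
    # Similarity between two feature vectors, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def gate_decision(captured_feature, stored_features, threshold=0.9):
    # Compare the captured feature against every stored (approved) feature;
    # return "pass" if any match clears the threshold, else "no-pass".
    for stored in stored_features:
        if cosine_similarity(captured_feature, stored) >= threshold:
            return "pass"
    return "no-pass"
```

The gate management module would open the gate only on a `"pass"` result, mirroring the pass / no-pass instructions of step 4.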
An inpatient department is generally divided into several ward areas, and close contact between people from different ward areas can cause serious hospital-acquired-infection accidents. Attendants are generally not restricted within the inpatient department and may move between ward areas, making infection control in the department harder. For this problem, the application provides the following technical scheme:
Further, step 1 comprises the following steps:
Step 11: the acquisition-end display collects the identity information of the patient's attendant, including the attendant's name, identity card number, mobile phone number, and facial image;
Step 12: the acquisition-end display packages the attendant's identity information, facial image, and permission requirements as the patient's accompanying-care application and sends it to the information monitoring module, the permission requirements comprising the ward areas that may be entered and the times during which each ward area may be entered.
In this scheme, every application states an explicit range of movement when it is filed, so permission management confines each attendant to the designated ward area, avoiding the increased infection risk caused by attendants moving through multiple ward areas.
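The per-ward, per-time permission check described above can be sketched as follows; the dictionary layout, the `has_access` helper, and the sample ward name and time window are illustrative assumptions, not details taken from the application.

```python
from datetime import time

def has_access(permission, ward_id, now):
    # permission: {"wards": {ward_id: (start_time, end_time)}}.
    # Grant access only if the attendant's permit covers this ward area
    # and the current time falls inside the permitted window.
    window = permission["wards"].get(ward_id)
    if window is None:
        return False
    start, end = window
    return start <= now <= end

# Hypothetical permit: ward_3 only, 08:00-20:00.
perm = {"wards": {"ward_3": (time(8, 0), time(20, 0))}}
```

A request outside the permitted ward area or time window is refused, which is how the scheme keeps attendants from wandering between ward areas.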
Further, step 2 comprises the following steps:
Step 21: the auditing-end communication unit obtains the application and sends it to the auditing-end display;
Step 22: the application is audited and its permissions are adjusted;
Step 23: the approved application is sent to the data processing module.
Further, each application carries one of four tags: unchecked, passed, failed, or expired.
Further, step 4 comprises the following steps:
Step 41: the gate management module captures the facial image of a person entering or leaving and sends it to the information processing module;
Step 42: the information processing module matches the image against the facial images of the applications stored in the data processing module;
Step 43: if a passed application matches, gate-opening information is sent to the gate management module; otherwise insufficient-permission information is sent.
In this scheme, the facial image of a person at the gate is extracted and matched against the facial images stored in the data processing module, so the system can determine directly whether the person is an approved attendant and hence whether the gate should release.
This scheme relies heavily on face recognition to match persons at the gate against the information in the applications, so the recognition must be highly accurate. In hospitals, however, attendants are mainly elderly, and elderly faces have indistinct features, deep eye sockets, and wrinkled skin, so recognition accuracy suffers. To solve this problem, the application provides the following scheme:
Further, in step 3, when the data processing module stores the facial image of an application, it extracts features from the image to obtain a first matching feature and stores both the first matching feature and the original facial image;
in step 4, after obtaining the facial image of a person at the gate, the information processing module first extracts a second matching feature from that image and matches the second matching feature against the first.
In this scheme the collected facial images are not matched directly: a first matching feature is extracted when the application's facial image is stored, a second feature is extracted when a person's image is captured at the gate, and the two features are matched, which improves matching accuracy over comparing raw images directly.
Further, in step 3 the first matching feature is extracted as follows:
Step 31: apply, in order, grayscale conversion, grayscale equalization, image smoothing and sharpening, edge detection, and feature extraction to the facial image, finally obtaining the first matching feature.
In this scheme, grayscale conversion turns the color image into a grayscale image that carries only brightness information, expressing the image through brightness values and suppressing unwanted color features.
Grayscale equalization widens the dynamic range of the gray values, homogenizing the brightness of the facial image and improving its overall contrast. Image smoothing and sharpening make the face and its surroundings clearer and easier to recognize. Edge detection recovers the true edges of the image as far as possible, improving facial-image recognition. Feature extraction derives facial feature data from the shapes of the facial features, and the data and the image are stored in a face database.
Further, the first matching feature is extracted by the following method:
S1: grayscale-process each pixel cell of the facial image; the image has m pixel cells, pixel cell j is the j-th cell, m and j are positive integers, and j ≤ m:
Gray_j = 0.299·R_j + 0.587·G_j + 0.114·B_j
where Gray_j is the gray value of pixel cell j and R_j, G_j, B_j are its values on the R, G, B channels;
S2, grayscale equalization: stretch the facial image nonlinearly through a transformation function and reassign its gray values;
S3, image smoothing and sharpening: smooth and filter the output image with a two-dimensional Gaussian filter;
S4: extract the feature information of the edge region of the facial image and the local texture feature information of the image, and fuse the two to obtain the first matching feature.
In this scheme, after the image has been gray-processed, smoothed, and sharpened, the corresponding image features can be extracted more accurately.
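Step S1's weighted grayscale conversion can be sketched directly, since the formula Gray_j = 0.299·R_j + 0.587·G_j + 0.114·B_j is the standard luma weighting; the function name and the use of NumPy are incidental choices for the example.

```python
import numpy as np

def to_grayscale(rgb):
    # rgb: H x W x 3 array of channel values. The weights are the
    # coefficients from S1: Gray_j = 0.299 R_j + 0.587 G_j + 0.114 B_j.
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114
```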
Further, the grayscale equalization in S2 proceeds as follows:
S21: count the number of pixels of each gray value, denoted n_i, where i = 0, 1, 2, …, L−1, L is the total number of gray levels, and n_i is the pixel count of the i-th gray value;
S22: compute the probability density of each gray value of the input facial image as p_i = n_i / n, where n is the total number of pixels of the image and p_i is the probability density of the i-th gray value;
S23: compute the equalized cumulative distribution function of the facial image to obtain the output gray value:
S(x, y) = (L − 1) · Σ_{i=0}^{k} p_i
where S(x, y) is the equalized gray value at facial-image coordinates (x, y) and k = f(x, y) is the original gray value at those coordinates, with 0 ≤ k ≤ L − 1.
S24: rounding the calculated output gray value;
S25: the gray value of the face image is subjected to equalization conversion by the mapping relation of the gray value function of the face image, and an equalized output image F (x, y) is obtained.
Further, in S3, Gaussian filtering is applied to the gray-equalized facial image. The expression of the two-dimensional Gaussian filter function G(x, y) is:
G(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))
where G(x, y) is the two-dimensional Gaussian filter function, σ is its standard deviation, and (x, y) are coordinates within the convolution kernel.
The facial image F(x, y) is convolved with G(x, y) to suppress noise interference, giving the smoothed image function I(x, y): I(x, y) = G(x, y) * F(x, y), where * is the convolution operator.
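The Gaussian smoothing of S3 can be sketched as below; the kernel size of 5, σ = 1.0, and edge-replication padding are illustrative assumptions, since the application does not fix these values.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    # Sample G(x, y) proportional to exp(-(x^2 + y^2) / (2 sigma^2)) on a
    # size x size grid centered at the origin, then normalize to sum 1
    # (the 1/(2 pi sigma^2) factor cancels in the normalization).
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return g / g.sum()

def smooth(image, size=5, sigma=1.0):
    # I(x, y) = G(x, y) * F(x, y): direct 2-D convolution with
    # edge-replication padding at the borders.
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image, dtype=float)
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y + size, x:x + size] * k)
    return out
```

A real deployment would use a library convolution rather than the explicit double loop, but the loop makes the definition of I(x, y) visible.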
Further, S4 comprises the following steps:
S41: extract the feature information of the edge region of the Gaussian-filtered facial image as follows.
Edge detection and feature extraction: take the second-order directional derivative of the smoothed image I(x, y) with the Laplacian operator to obtain the image edge feature information M(x, y):
M(x, y) = ∇²I(x, y)
where ∇²I(x, y) is the second-order derivative, by the Laplacian operator, of the Gaussian-smoothed image function I(x, y), and ∇² denotes second-order differentiation of the function. This is the Laplacian-of-Gaussian (LoG) operation: the image is first smoothed by Gaussian filtering and then differentiated to second order with the Laplacian.
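The second stage of the LoG operation, the Laplacian of the already-smoothed image, can be sketched with the standard 4-neighbor stencil; the stencil choice and edge handling are assumptions made for the example.

```python
import numpy as np

def laplacian(image):
    # Discrete nabla^2 I via the 4-neighbor stencil:
    # I(x+1,y) + I(x-1,y) + I(x,y+1) + I(x,y-1) - 4 I(x,y).
    # Applied to a Gaussian-smoothed image this yields the LoG edge map M(x, y).
    p = np.pad(image, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1]
            + p[1:-1, :-2] + p[1:-1, 2:]
            - 4 * p[1:-1, 1:-1])
```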
S42: the local texture characteristic information of the facial image is extracted in the following manner:
s421: constructing a Gabor filter bank, setting the kernel size and direction of the filter, constructing the Gabor filter bank, setting the kernel size of the filter to 3, 5, 7, 9, 11, setting the direction to 0 °, 45 °, 90 °, 135 °, generating one Gabor filter bank, and obtaining 4×5=20 different Gabor filters;
s422: extracting Gabor features, and extracting image Gabor features by using the Gabor filter bank to obtain 20 Gabor feature graphs, wherein the feature graphs are primary features;
s423: and extracting LBP characteristics, dividing each Gabor characteristic map into 5 multiplied by 5 sub-blocks, calculating the histogram of each sub-block by using an LBP characteristic extraction algorithm, and connecting the histograms of each sub-block in series to obtain the human face LBP characteristics of one Gabor characteristic map. Respectively extracting LBP characteristics of 20 Gabor characteristic diagrams to obtain 20 characteristic diagrams, wherein the characteristic diagrams at the moment are secondary characteristics;
s44: and compressing and splicing the features, compressing the two-level feature map into one-dimensional feature vectors, and then connecting the feature vectors in series to form texture features.
In this technical scheme, aiming at the characteristics of elderly skin — pallor, indistinct facial features, extensive wrinkles, and deep eye sockets — image preprocessing removes noise, improves image clarity, and increases image detectability.
Because elderly skin is pale and differs little from its surroundings, and because the LoG operator is insensitive to illumination, the LoG algorithm distinguishes facial edge information well, ensuring complete extraction of the facial edges.
To capture texture such as the facial features and wrinkles of the elderly, a multi-scale, multi-orientation Gabor filter bank first extracts texture features of the image at different levels, filtering out irrelevant interference and adding redundancy across scales; the LBP feature-extraction algorithm then extracts local statistical features from each texture map, capturing the local feature information of the face.
The method combines global texture features with edge features, fully preserving the integrity of the facial image, reducing the interference of blurred regions with recognition, completely extracting the local texture features of the face, and improving the recognition rate of facial features.
Further, the second matching feature is extracted in the same way as the first matching feature.
As a second aspect of the present application, some embodiments provide a control system for the gates of each ward area of an inpatient department, comprising an information acquisition module, an information monitoring module, a data processing module, a gate management module, and an information processing module, the information acquisition module being in signal connection with the information monitoring module. The control system controls each ward-area gate according to the control method for the gates of each ward area of the inpatient department described above.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application, are incorporated in and constitute a part of this specification. The drawings and their description are illustrative of the application and are not to be construed as unduly limiting the application.
In addition, the same or similar reference numerals denote the same or similar elements throughout the drawings. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
In the drawings:
FIG. 1 is a flow chart of a method for controlling the gates of each ward in a hospital department.
FIG. 2 is a schematic diagram of a control system for each ward gate of the hospitalization department.
Detailed Description
Embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the application have been illustrated in the accompanying drawings, it is to be understood that the application may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the application are for illustration purposes only and are not intended to limit the scope of the present application.
It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings. Embodiments of the application and features of the embodiments may be combined with each other when there is no conflict.
The application will be described in detail below with reference to the drawings in connection with embodiments.
In an inpatient department, the space is divided into individual ward areas; a gate is installed at every entrance and exit of each ward area, and the gates control the passage of personnel.
Embodiment 1: referring to fig. 1, the control method for the gates of each ward area of the inpatient department comprises the following steps:
Step 1: the information acquisition module collects an accompanying-care application for each patient, the application comprising a facial image of the attendant and the identity of the patient to be accompanied.
Step 11: the acquisition-end display collects the identity information of the patient's attendant, including the attendant's name, identity card number, mobile phone number, and facial image;
Step 12: the acquisition-end display packages the attendant's identity information, facial image, and permission requirements as the patient's accompanying-care application and sends it to the information monitoring module, the permission requirements comprising the ward areas that may be entered and the times during which each ward area may be entered.
It will be appreciated that step 1 is in effect the registration phase for attendants. The information acquisition module collects the attendant's information and comprises an acquisition-end communication unit, an acquisition-end display, and a camera. The communication unit obtains the hospital's patient list; the display must be a touch screen so that it can both display and accept input; and the camera captures the attendant's facial image. The information acquisition module can therefore be a mobile terminal device such as a tablet computer or a mobile phone.
In practice, when a patient needs to add an attendant, the information acquisition unit executes steps 11-12 to generate the corresponding application and sends it to the background information monitoring module for auditing.
In practice, the acquisition-end display shows the patient list of the ward area; when an attendant is to be added for a patient, that patient is selected from the list and the application is filed for the patient.
Step 2: the information monitoring module receives the application collected by the information acquisition module, audits it, assigns it the corresponding permissions, and then sends the application and its permission information to the data processing module.
Step 2 comprises the following steps:
Step 21: the auditing-end communication unit obtains the application and sends it to the auditing-end display;
Step 22: the application is audited and its permissions are adjusted;
Step 23: the approved application is sent to the data processing module.
Specifically, the information monitoring module comprises an auditing-end communication unit and an auditing-end display. The communication unit handles communication, and the auditing-end display, like the acquisition-end display, serves both display and input. Because every application is sent to the information monitoring module, monitoring staff can call up any application, audit it, or change its permissions.
Further, each application carries one of four tags: unchecked, passed, failed, or expired.
Giving each application one of the four tags eases the auditing work of background staff, who can work by tag type and thereby increase efficiency.
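The four audit tags might be modeled as a simple enumeration; the class and helper below are an illustrative sketch, not part of the application.

```python
from enum import Enum

class Tag(Enum):
    # The four audit tags attached to each accompanying-care application.
    UNCHECKED = "unchecked"
    PASSED = "passed"
    FAILED = "failed"
    EXPIRED = "expired"

def needs_review(application):
    # Background staff only need to act on applications still unchecked.
    return application["tag"] is Tag.UNCHECKED
```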
Step 3: the data processing module stores the application and the corresponding permission information.
Because the data processing module is in signal connection with the background monitoring module, in practice the background monitoring module retrieves the required information from the data processing module so that background staff can modify the permissions of each application. The background monitoring module essentially provides display, communication, and modification, while storage is handled by the data processing module.
Step 4: the information processing module matches the facial image of each person entering or leaving, collected by the gate management module, against the facial images stored in the data processing module. If the captured facial image matches a stored facial image whose application has gate-access permission, a pass instruction is sent to the gate management module; otherwise a no-pass instruction is sent. The gate management module opens the gate on receiving a pass instruction and keeps it closed on receiving a no-pass instruction.
Step 4 comprises the following steps:
Step 41: the gate management module captures the facial image of a person entering or leaving and sends it to the information processing module.
Step 42: the information processing module matches the image against the facial images of the applications stored in the data processing module.
Step 43: if a passed application matches, gate-opening information is sent to the gate management module; otherwise insufficient-permission information is sent.
Specifically, the gate management module needs at least a camera and a communication unit: the camera captures the facial images of persons at the gate, and the communication unit sends the collected images to the information processing module and communicates with the gate.
In practice, to save costs and to keep traffic on the hospital intranet, the gate management module and the information acquisition module may use the same type of device rather than the patient's mobile phone. Attendant information then travels only over the hospital's local area network, avoiding any reliance on the Internet.
In step 3, when the data processing module stores the facial image of an application, it extracts features from the image to obtain a first matching feature and stores both the first matching feature and the original facial image.
In step 4, after obtaining the facial image of a person at the gate, the information processing module first extracts a second matching feature from that image and matches it against the first. The second matching feature is extracted in the same way as the first matching feature.
The first matching feature is extracted as follows:
S1: grayscale-process each pixel cell of the facial image; the image has m pixel cells, pixel cell j is the j-th cell, m and j are positive integers, and j ≤ m:
Gray_j = 0.299·R_j + 0.587·G_j + 0.114·B_j
where Gray_j is the gray value of pixel cell j and R_j, G_j, B_j are its values on the R, G, B channels.
S2, grayscale equalization: stretch the facial image nonlinearly through a transformation function and reassign its gray values.
The grayscale equalization in S2 proceeds as follows:
S21: count the number of pixels of each gray value, denoted n_i, where i = 0, 1, 2, …, L−1, L is the total number of gray levels, and n_i is the pixel count of the i-th gray value;
S22: compute the probability density of each gray value of the input facial image as p_i = n_i / n, where n is the total number of pixels of the image;
S23: compute the equalized cumulative distribution function of the facial image to obtain the output gray value:
S(x, y) = (L − 1) · Σ_{i=0}^{k} p_i
where S(x, y) is the equalized gray value at facial-image coordinates (x, y) and k = f(x, y) is the original gray value at those coordinates, with 0 ≤ k ≤ L − 1.
S24: rounding the calculated output gray value;
S25: the gray value of the face image is subjected to equalization conversion by the mapping relation of the gray value function of the face image, and an equalized output image F (x, y) is obtained.
S3: image smoothing and sharpening: and smoothing and filtering the output image by adopting a two-dimensional Gaussian filter.
Expression of a two-dimensional gaussian filter function G (x, y):
;
Wherein, choose to useG (x, y) represents a two-dimensional gaussian filter function, σ represents a function standard deviation, and (x, y) represents the coordinates of the convolution kernel.
Convolving the facial image F (x, y) with G (x, y) to minimize noise interference, resulting in a smoothed image function I (x, y): i (x, y) =g (x, y) ×f (x, y); * Representing the convolution operator.
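A minimal Python/NumPy sketch of S3 under the formula above. The kernel size and σ are assumptions (the patent does not fix them), and a real implementation would typically use scipy.ndimage or OpenCV rather than an explicit loop:

```python
import numpy as np

def gaussian_kernel(size: int = 5, sigma: float = 1.0) -> np.ndarray:
    """Sample G(x, y) = exp(-(x^2+y^2)/(2*sigma^2)) / (2*pi*sigma^2)
    on a size x size grid and normalize so the weights sum to 1."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    return k / k.sum()

def smooth(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """I(x, y) = G(x, y) * F(x, y): direct 2-D convolution with zero
    padding (the kernel is flipped, as convolution requires)."""
    kh, kw = kernel.shape
    pad = np.pad(image.astype(np.float64), ((kh // 2,), (kw // 2,)))
    out = np.zeros(image.shape, dtype=np.float64)
    for dy in range(kh):
        for dx in range(kw):
            out += kernel[kh - 1 - dy, kw - 1 - dx] * \
                   pad[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out
```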
S4: extract the feature information of the edge region of the facial image and the local texture feature information of the facial image, and fuse the two to obtain the first matching feature.
S4 comprises the following steps:
s41: feature information of an edge region of the face image subjected to gaussian filtering is extracted as follows:
Edge detection and feature extraction: obtaining second-order directional derivatives of the filtered image M (x, y) by using a Laplacian algorithm to obtain image edge characteristic information M (x, y):
Wherein,Representing a second order derivative of the laplace algorithm performed on the gaussian smoothed filtered image function I (x, y). /(I)The image is subjected to Gaussian-Laplace algorithm (LOG), namely, the image is subjected to Gaussian smoothing filtering and then subjected to second-order directional derivative by using the Laplace algorithm.
;
Wherein,Representing a second order differentiation of the function.
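A minimal Python/NumPy sketch of the Laplacian step in S41, using the standard 4-neighbour discrete stencil as an assumed discretization (the patent does not fix one):

```python
import numpy as np

# Discrete Laplacian d2/dx2 + d2/dy2 via the common 4-neighbour stencil;
# applied to the Gaussian-smoothed image I(x, y) this realizes the LoG step.
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=np.float64)

def laplacian_edges(smoothed: np.ndarray) -> np.ndarray:
    """Return M(x, y), the second-order derivative of the smoothed image,
    computed by correlating with the 3 x 3 Laplacian stencil (edge-replicated
    borders)."""
    h, w = smoothed.shape
    pad = np.pad(smoothed.astype(np.float64), 1, mode="edge")
    out = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            out += LAPLACIAN[dy, dx] * pad[dy:dy + h, dx:dx + w]
    return out
```

On a flat region the stencil weights cancel, so M(x, y) is zero everywhere except near intensity changes, which is what makes it an edge detector.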
S42: the local texture characteristic information of the facial image is extracted in the following manner:
s421: constructing a Gabor filter bank, setting the kernel size and direction of the filter, constructing the Gabor filter bank, setting the kernel size of the filter to 3, 5, 7, 9, 11, setting the direction to 0 °, 45 °, 90 °, 135 °, generating one Gabor filter bank, and obtaining 4×5=20 different Gabor filters;
s422: extracting Gabor features, and extracting image Gabor features by using the Gabor filter bank to obtain 20 Gabor feature graphs, wherein the feature graphs are primary features;
s423: and extracting LBP characteristics, dividing each Gabor characteristic map into 5 multiplied by 5 sub-blocks, calculating the histogram of each sub-block by using an LBP characteristic extraction algorithm, and connecting the histograms of each sub-block in series to obtain the human face LBP characteristics of one Gabor characteristic map. Respectively extracting LBP characteristics of 20 Gabor characteristic diagrams to obtain 20 characteristic diagrams, wherein the characteristic diagrams at the moment are secondary characteristics;
s44: and compressing and splicing the features, compressing the two-level feature map into one-dimensional feature vectors, and then connecting the feature vectors in series to form texture features.
Referring to fig. 2, Embodiment 2: a control system for the gates of each ward area of an inpatient department, comprising: an information acquisition module, an information monitoring module, a data processing module, a gate management module and an information processing module, the information acquisition module being in signal connection with the information monitoring module; the control system controls each ward-area gate according to the control method for the gates of each ward area of an inpatient department described above.
It should be noted that the gates of each area in the hospital, and how they are opened and closed, are not described here; gate opening-and-closing control is prior art in the field. In this scheme, the signal for opening a gate is obtained mainly by collecting facial data of persons entering and exiting. The design of the gate and the manner in which it opens are not limited here.
The above description covers only a few preferred embodiments of the present application and the principles of the technology employed. Those skilled in the art will appreciate that the scope of the application is not limited to the specific combination of the above technical features, but also encompasses other technical solutions formed by any combination of the above technical features or their equivalents without departing from the spirit of the application, for example replacing the above features with technical features of similar function disclosed in (but not limited to) the embodiments of the present application.

Claims (10)

1. A control method for the gates of each ward area of an inpatient department, characterized in that the method comprises the following steps:
Step 1: the information acquisition module collects an accompanying application for each patient, wherein the accompanying application comprises a facial image of the accompanying person and an identity mark of the accompanied patient;
Step 2: the information monitoring module receives the accompanying application collected by the information acquisition module, audits the accompanying application, grants it corresponding authority, and then sends the accompanying application and the corresponding authority information to the data processing module;
Step 3: the data processing module stores the accompanying application and the corresponding authority information;
Step 4: the information processing module matches the facial image of the person entering or exiting, collected by the gate management module, against the facial images stored in the data processing module; if the facial image of the person is successfully matched with a stored facial image and the corresponding accompanying application has the authority to pass through the gate, a pass instruction is sent to the gate management module, otherwise a no-pass instruction is sent; the gate management module opens the gate upon receiving the pass instruction and does not open the gate upon receiving the no-pass instruction.
2. The method for controlling the gates of each ward area of an inpatient department according to claim 1, characterized in that step 1 comprises the following steps:
Step 11: the acquisition-end display collects the identity information of the person accompanying the patient, wherein the identity information comprises the name, identity-card number, mobile-phone number and facial image of the accompanying person;
Step 12: the acquisition-end display takes the identity information, the facial image and the authority requirements of the accompanying person as the patient's accompanying application, then sends the accompanying application to the information monitoring module, wherein the authority requirements comprise the ward areas that may be entered and exited and the times at which the corresponding ward areas may be entered and exited.
3. The method for controlling the gates of each ward area of an inpatient department according to claim 2, characterized in that step 2 comprises the following steps:
Step 21: the audit-end communication unit acquires the accompanying application and sends it to the audit-end display;
Step 22: the accompanying application is checked and its authority is adjusted;
Step 23: the approved accompanying application is sent to the data processing module.
4. The method for controlling the gates of each ward area of an inpatient department according to claim 3, characterized in that the accompanying application carries one of four labels: unchecked, passed, failed, or expired.
5. The method for controlling the gates of each ward area of an inpatient department according to claim 3, characterized in that step 4 comprises the following steps:
Step 41: the gate management module collects the facial image of a person entering or exiting and sends it to the information processing module;
Step 42: the information processing module matches it against the facial images of the accompanying applications stored in the data processing module;
Step 43: if a passed accompanying application is matched, information to open the gate is sent to the gate management module; otherwise, information that the person's authority is insufficient is sent.
6. The method for controlling the gates of each ward area of an inpatient department according to claim 5, characterized in that in step 3, when the data processing module stores the facial image in the accompanying application, it performs feature extraction on the facial image to obtain a first matching feature, and both the first matching feature and the original facial image are stored in the data processing module;
in step 4, after obtaining the facial image of the person entering or exiting, the information processing module first extracts features from that image to obtain a second matching feature, and matches the second matching feature against the first matching feature.
7. The method for controlling the gates of each ward area of an inpatient department according to claim 6, characterized in that in step 3 the first matching feature is extracted as follows:
the facial image is subjected in turn to grayscale conversion, gray-level equalization, image smoothing and sharpening, and edge detection and feature extraction, finally yielding the first matching feature.
8. The method for controlling the gates of each ward area of an inpatient department according to claim 7, characterized in that the first matching feature is extracted by the following method:
S1: gray-process each pixel cell of the facial image, the facial image having m pixel cells, pixel cell j denoting the j-th cell, m and j being positive integers with j ≤ m;
Gray_j = 0.299R_j + 0.587G_j + 0.114B_j, where Gray_j represents the gray value of pixel cell j, and R_j, G_j, B_j represent the values of pixel cell j on the R, G and B channels respectively;
S2: gray-level equalization: nonlinearly stretch the facial image through a transformation function and reassign its gray values;
S3: image smoothing and sharpening: smooth and filter the output image with a two-dimensional Gaussian filter;
S4: extract the feature information of the edge region of the facial image and the local texture feature information of the facial image, and fuse the two to obtain the first matching feature.
9. The method for controlling the gates of each ward area of an inpatient department according to claim 8, characterized in that the second matching feature is extracted in the same way as the first matching feature.
10. A control system for the gates of each ward area of an inpatient department, characterized in that it comprises: an information acquisition module, an information monitoring module, a data processing module, a gate management module and an information processing module, the information acquisition module being in signal connection with the information monitoring module, the data processing module being in signal connection with the information monitoring module, and the data processing module and the gate management module each being in signal connection with the information processing module; the control system controls the gate of each ward area according to the control method for the gates of each ward area of an inpatient department as claimed in any one of claims 1-9.
CN202410191485.0A 2024-02-21 2024-02-21 Control method and control system for gate of each ward of inpatient department Pending CN117765656A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410191485.0A CN117765656A (en) 2024-02-21 2024-02-21 Control method and control system for gate of each ward of inpatient department


Publications (1)

Publication Number Publication Date
CN117765656A true CN117765656A (en) 2024-03-26

Family

ID=90324037

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410191485.0A Pending CN117765656A (en) 2024-02-21 2024-02-21 Control method and control system for gate of each ward of inpatient department

Country Status (1)

Country Link
CN (1) CN117765656A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104331683A (en) * 2014-10-17 2015-02-04 南京工程学院 Facial expression recognition method with noise robust
CN104408780A (en) * 2014-11-28 2015-03-11 四川浩特通信有限公司 Face recognition attendance system
CN104657947A (en) * 2015-02-06 2015-05-27 哈尔滨工业大学深圳研究生院 Noise reducing method for basic group image
CN105005765A (en) * 2015-06-29 2015-10-28 北京工业大学 Facial expression identification method based on Gabor wavelet and gray-level co-occurrence matrix
CN106529447A (en) * 2016-11-03 2017-03-22 河北工业大学 Small-sample face recognition method
CN106599870A (en) * 2016-12-22 2017-04-26 山东大学 Face recognition method based on adaptive weighting and local characteristic fusion
CN107066966A (en) * 2017-04-17 2017-08-18 宜宾学院 A kind of face identification method based on key point area image
CN110135254A (en) * 2019-04-12 2019-08-16 华南理工大学 A kind of fatigue expression recognition method
CN110188639A (en) * 2019-05-20 2019-08-30 深圳供电局有限公司 Face image processing method and system, computer equipment and readable storage medium
CN111260616A (en) * 2020-01-13 2020-06-09 三峡大学 Insulator crack detection method based on Canny operator two-dimensional threshold segmentation optimization
CN111639619A (en) * 2020-06-08 2020-09-08 金陵科技学院 Face recognition device and recognition method based on deep learning
CN114283930A (en) * 2021-11-10 2022-04-05 四川省肿瘤医院 System for realizing sharing of accompanying person based on mobile phone APP
CN115472299A (en) * 2021-07-12 2022-12-13 山东亚华电子股份有限公司 Intelligent passing method, system and medium based on epidemic situation prevention and control
CN117116438A (en) * 2022-12-26 2023-11-24 成都市青白江区妇幼保健院 Hospital accompanying personnel access management method and system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination