WO2022045562A1 - Electronic device for controlling opening and closing of a door, a door opening/closing device disposed to the door, and a method therefor - Google Patents


Info

Publication number
WO2022045562A1
Authority
WO
WIPO (PCT)
Prior art keywords
user, processor, electronic device, door, door opening
Prior art date
Application number
PCT/KR2021/008436
Other languages
French (fr)
Inventor
Youngjun Kim
Sun Mi Choi
Youngmin Park
Original Assignee
LG Electronics Inc.
Priority date
Filing date
Publication date
Application filed by LG Electronics Inc.
Publication of WO2022045562A1


Classifications

    • G06V 40/172 Classification, e.g. identification (recognition of human faces in image or video data)
    • G07C 9/00563 Electronically operated locks using personal physical data of the operator, e.g. fingerprints, retinal images, voice patterns
    • A61L 2/24 Apparatus for disinfecting or sterilising using programmed or automatic operation
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V 40/14 Vascular patterns
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G07C 9/37 Individual registration on entry or exit, in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
    • H04N 13/239 Image signal generators using stereoscopic image cameras with two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/25 Image signal generators using two or more image sensors with different characteristics, e.g. different resolutions or colour pickup characteristics
    • H04N 23/45 Cameras or camera modules generating image signals from two or more image sensors of different type or operating in different modes
    • H04N 23/51 Camera housings
    • H04N 23/71 Circuitry for evaluating the brightness variation in the scene
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • A61L 2/10 Disinfection or sterilisation by ultraviolet radiation
    • A61L 2202/11 Apparatus for generating biocidal substances, e.g. vaporisers, UV lamps
    • H04N 2013/0081 Depth or disparity estimation from stereoscopic image signals

Definitions

  • the present disclosure relates to an electronic device for controlling opening and closing of a door, a door opening/closing device disposed to the door, and a method therefor.
  • Doors (e.g., an entry door) are generally rotated and opened in one direction or in both directions to allow users to come in and out of an indoor space. Additionally, doors can be provided with a lock for controlling their opening and closing to prevent an unspecified person from coming into the indoor space.
  • the lock is usually a mechanical one that is opened only with a specific key.
  • An electronic lock or a digital lock denotes a lock capable of recognizing a password, or a semiconductor chip or a smart card instead of a key.
  • the lock can be opened only when input information matches information stored therein as a result of comparison therebetween.
  • the electronic lock or the digital lock of the related art is vulnerable to theft when a password for opening the lock is stolen by another person.
  • digital locks of the related art cause the inconvenience of carrying a key and bringing it into contact with the lock, and are susceptible to the risk of loss and manipulation.
  • the present disclosure is directed to a smart door system that may open a lock using biometric information of a person coming in and out.
  • the present disclosure is also directed to an electronic device that may be disposed at a position near a door and readily obtain biometric information from a person coming in and out.
  • the present disclosure is also directed to a door opening/closing device that may receive a signal for controlling opening and closing of a door from the electronic device having obtained biometric information of a person coming in and out, and control the opening and closing of the door.
  • the present disclosure is also directed to a smart door system that may perform authentication of a person coming in and out more than once.
  • the present disclosure is also directed to an electronic device that may include at least one IR camera capable of identifying a person coming in and out even in darkness.
  • the present disclosure is also directed to a smarter method for opening and closing a door by providing various types of information in relation to the opening and closing of the door through the electronic device.
  • the present disclosure is also directed to a door opening/closing device that may sterilize a portion contaminated by contact with unspecified multiple people.
  • a smart door system based on communication between an electronic device and a door opening/closing device may be provided to authenticate a person coming in and out, and determine whether to open or close a door.
  • an electronic device may be disposed at a position near a door, and may include an RGB camera and at least one IR camera, to obtain biometric information from a person coming in and out regardless of surrounding brightness.
  • communication between an electronic device and a door opening/closing device may be provided, to control an operation of the door opening/closing device through the electronic device.
  • an authentication based on biometric information of a person coming in and out may be performed more than once through a plurality of cameras included in an electronic device.
  • a UVC LED may be disposed in a door opening/closing device to sterilize a portion contaminated by the contact of unspecified multiple people.
  • opening and closing of a handle of a door opening/closing device may be controlled, to prevent unauthenticated people from coming in and out.
  • an electronic device may include a display, a camera assembly, a communicator, a storage, and a processor electrically connected to the display, the camera assembly, the communicator, and the storage.
  • the processor may activate a first camera of the camera assembly based on recognizing a user, and obtain a first image of the user through the activated first camera. Additionally, the processor may perform authentication of the user based on the obtained first image and a second image pre-stored in the storage, and transmit a signal for controlling an operation of a door opening/closing device disposed in a door to the door opening/closing device through the communicator based on the authentication of the user.
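The authentication flow described above (activate a camera on recognizing a user, compare the obtained first image with the pre-stored second image, then signal the door opening/closing device) can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the `match` function, the similarity threshold, and the signal payload are all hypothetical stand-ins.

```python
# Illustrative sketch only: `match`, the threshold, and the signal payload
# are hypothetical stand-ins, not part of the disclosure.

SIMILARITY_THRESHOLD = 0.8  # assumed tuning parameter


def match(first_image, second_image):
    # Stand-in similarity measure: fraction of equal pixels. A real system
    # would compare face embeddings or vein patterns instead.
    same = sum(1 for a, b in zip(first_image, second_image) if a == b)
    return same / max(len(first_image), 1)


def authenticate_and_unlock(first_image, second_image, send_signal):
    """Compare the obtained first image against the pre-stored second image;
    on success, send a control signal to the door opening/closing device."""
    if match(first_image, second_image) >= SIMILARITY_THRESHOLD:
        send_signal({"cmd": "UNLOCK"})
        return True
    return False
```

The control signal itself is only emitted after authentication succeeds, mirroring the order of operations in the claim language.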
  • a door opening/closing device may include a body attached to a first side of a door and configured to protrude, and a handle operatively connected to the door opening/closing device.
  • the body may include a communicator, a motor, and a processor electrically connected to the communicator and the motor.
  • the processor may receive, through the communicator, a signal for controlling the door opening/closing device from an electronic device disposed at a position near the door, and unlock the door opening/closing device, based on the received control signal.
  • a method for controlling opening and closing of a door may include an electronic device's activating a first camera based on recognizing a user and obtaining a first image of the user through the activated first camera.
  • the method may include the electronic device's performing authentication of the user based on the obtained first image and a pre-stored second image, and transmitting, to a door opening/closing device, a signal for controlling an operation of the door opening/closing device installed in the door based on the authentication of the user.
  • the door opening/closing device may receive a signal for controlling the door opening/closing device from the electronic device, and unlock the door opening/closing device based on the received control signal.
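The device-side half of the method above (receive a control signal from the electronic device, unlock, and let the handle protrude) can be sketched as a small state machine. The signal format and state names are illustrative assumptions only.

```python
# Illustrative sketch of the door-device side: the signal format and the
# handle behaviour are assumptions for illustration, not the disclosure.

class DoorOpeningClosingDevice:
    """Receives control signals from the electronic device and locks or
    unlocks accordingly; unlocking lets the handle protrude from the body."""

    def __init__(self):
        self.locked = True
        self.handle_protruded = False

    def on_signal(self, signal):
        cmd = signal.get("cmd")
        if cmd == "UNLOCK":
            self.locked = False
            self.handle_protruded = True   # handle separates from the body
        elif cmd == "LOCK":
            self.locked = True
            self.handle_protruded = False  # handle retracts; door secured
```

Keeping the handle flush with the body while locked is what prevents an unauthenticated person from operating the door at all.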
  • An electronic device may obtain a first image of a user through one or more cameras, perform authentication of the user based on the obtained first image and a pre-stored second image, and control an operation of a door opening/closing device disposed in a door, to open the door in a convenient manner without an additional key for opening the door.
  • the electronic device may include an RGB camera and one or more IR cameras, to authenticate a person coming in and out day and night.
  • the electronic device may display a guide line for guiding capture of biometric information of a person coming in and out, to obtain the biometric information more readily.
  • the electronic device may extract information on veins of a body part of a person coming in and out through one or more IR cameras, and perform authentication of the person coming in and out using the extracted information on veins, to provide tight security.
  • the electronic device may display at least one piece of information on one or more operation states of the electronic device and one or more operation states of the door opening/closing device, to allow a person to come in and out readily.
  • a door opening/closing device may receive a signal for controlling opening and closing of the door from the electronic device, and, based on the received signal, control the opening and closing of the door, to allow a person to come in and out conveniently.
  • the door opening/closing device may control a handle such that the handle protrudes from a body, to prevent an unauthenticated person from coming in and out.
  • the door opening/closing device may sterilize the door opening/closing device based on the closing of the door, to remove contamination caused by unspecified multiple people.
  • the door opening/closing device may emit one or more light emitting elements in different colors based on the opening and closing of the door, to allow a user to determine an operation state of the door opening/closing device.
  • FIG. 1 is an exemplary view showing a smart door system according to one embodiment.
  • FIG. 2 is a block diagram showing an electronic device and a door opening/closing device according to one embodiment.
  • FIG. 3 is a flow chart showing a process of authenticating a user in a smart door system according to one embodiment.
  • FIG. 4 is a flow chart showing a process of authenticating a user in a smart door system according to another embodiment.
  • FIG. 5a is an exemplary view showing a screen on a display when an electronic device according to one embodiment does not sense an approach of a user.
  • FIG. 5b is an exemplary view showing a user approaching an electronic device according to one embodiment.
  • FIG. 5c is an exemplary view showing a screen on a display of an electronic device sensing an approach of a user according to one embodiment.
  • FIG. 6a is an exemplary view showing a process of performing authentication using an image obtained by an electronic device according to one embodiment from a user.
  • FIG. 6b is an exemplary view indicating a result that the electronic device according to one embodiment performs an authentication of a user by using an image obtained from a user.
  • FIG. 7a is an exemplary view showing a screen where an electronic device according to one embodiment performs authentication using a face image obtained from a user.
  • FIG. 7b is an exemplary view showing a result of primary authentication performed by an electronic device according to one embodiment using an image obtained from a user.
  • FIG. 7c is an exemplary view where an electronic device according to one embodiment obtains an image of a palm of a user.
  • FIG. 7d is an exemplary view showing a preview image of a palm of a user obtained by an electronic device according to one embodiment.
  • FIG. 7e is an exemplary view showing a result of secondary authentication performed by an electronic device according to one embodiment.
  • FIG. 8a is an exemplary view showing a screen indicating a touch gesture being input from a user through an electronic device according to one embodiment.
  • FIG. 8b is an exemplary view showing a screen to which a password is input from a user in an electronic device according to one embodiment.
  • FIG. 8c is an exemplary view showing a result of authentication performed by an electronic device according to one embodiment using an input password.
  • FIG. 9a is an exemplary view showing a handle of a door opening/closing device, being separated from a body, based on authentication of a user according to one embodiment.
  • FIG. 9b is an exemplary view showing a handle of a door opening/closing device, completely separated from a body, based on authentication of a user according to one embodiment.
  • FIG. 9c is an exemplary view showing a handle of a door opening/closing device according to one embodiment, gripped by a user.
  • the terms “first”, “second” and the like are used herein only to distinguish one component from another component. Thus, the components should not be limited by the terms. For instance, a first component can be a second component unless stated to the contrary.
  • When one component is described as being “connected” or “coupled” to another component, it can be directly connected or coupled to the other component; however, it is also to be understood that an additional component can be “interposed” between the two components, or the two components can be “connected” or “coupled” through an additional component.
  • each component can be provided as a single one or a plurality of ones, unless explicitly stated to the contrary.
  • the term “A and/or B” as used herein can denote A, B, or A and B, and the term “C to D” can denote greater than C and less than D, unless stated to the contrary.
  • FIG. 1 is an exemplary view showing a smart door system according to one embodiment.
  • a smart door system 100 may include an electronic device 110, a door opening/closing device 120, and a door 130.
  • the door opening/closing device 120 may be disposed at the door 130, and the electronic device 110 may be disposed on a wall outside a home.
  • the door opening/closing device 120 and the electronic device 110 may receive and transmit a signal in a wired or wireless manner.
  • the electronic device 110 may recognize a user, and, based on recognizing the user, activate a first camera disposed on one side (e.g., an upper side, a lower side, a left side, or a right side) of a front surface of the electronic device 110. Additionally, the electronic device 110 may obtain a first image of the user through the first camera, and, based on the obtained first image and a second image pre-stored in the electronic device 110, may authenticate the user. Additionally, the electronic device 110 may transmit a signal for controlling an operation of the door opening/closing device 120 installed in the door 130 to the door opening/closing device 120, based on the authentication of the user.
  • the door opening/closing device 120 may receive a signal for controlling the door opening/closing device 120 from the electronic device 110 disposed at a position near the door 130. Additionally, the door opening/closing device 120 may unlock the door opening/closing device 120, based on the received control signal.
  • the door 130 may include a door (e.g., an entry door) through which the user freely comes in and out. Opening and closing of the door 130 may be controlled by a lock part of the door opening/closing device 120.
  • FIG. 1 shows a configuration of the smart door system 100 according to one embodiment.
  • Components of the smart door system 100 may not be limited to those of the embodiment illustrated in FIG. 1, and, when necessary, some components may be added, modified or removed.
  • FIG. 2 is a block diagram showing an electronic device and a door opening/closing device according to one embodiment.
  • the electronic device 110 may include a communicator 210, an input part 211, a display 212a, a speaker 212b, a storage 213, a processor 214, a camera assembly 216 and a sensor assembly 219.
  • FIG. 2 shows a configuration of the electronic device 110 according to one embodiment.
  • Components of the electronic device 110 may not be limited to those of the embodiment illustrated in FIG. 2, and, when necessary, some components may be added, modified or removed.
  • the communicator 210 may include one or more circuits capable of transmitting one or more signals or one or more pieces of information to one or more components (e.g., an input part 211, a display 212a, a speaker 212b, a storage 213, a processor 214, a camera assembly 216 and a sensor assembly 219) included in the electronic device 110, and receiving the same from one or more of the components, based on wired communication or wireless communication.
  • the communicator 210 may include one or more circuits capable of transmitting one or more signals or one or more pieces of information to components (e.g., a speaker 222, a battery 223, a motor 224, a lock part 225, a processor 226, a UVC sterilizer 231, a light emitter 232, and a sensor assembly 233) included in the door opening/closing device 120, and receiving the same from the components, based on wired communication or wireless communication.
  • the communicator 210 may also receive a signal or data from various types of external devices or transmit the same to the external devices.
  • the input part 211 may include an interface configured to transmit/receive data to/from an external device (e.g., a universal serial bus (USB) device, an external hard drive device (not illustrated), and the like).
  • the input part 211 may transmit various types of information, input by the user, to the processor 214.
  • the input part 211 may include a physical operation member such as a switch, a button and the like or an electric operation member such as a touch key, a touch pad, a touch screen and the like.
  • the input part 211 may further include a microphone configured to receive a voice signal of the user.
  • the display 212a may display various types of information (e.g., multimedia data or text data and the like).
  • the display 212a may display results processed, being processed or to be processed by the processor 214.
  • the display 212a may visually provide various types of information regarding an operation state of the door opening/closing device 120 through the electronic device 110, and may include a control circuit configured to display various types of information.
  • the display 212a may include a touch circuitry configured to sense a touch, or a pressure sensor configured to measure an intensity of pressure of a touch.
  • the speaker 212b may change a sound into an electric signal or vice versa.
  • the speaker 212b may output a sound through an acoustic output device (e.g., ear buds or headphones).
  • the speaker 212b may output an acoustic signal to the outside of the electronic device 110.
  • the speaker 212b may output information provided to the user who comes in and out through a door as a voice.
  • the speaker 212b may be disposed in the electronic device 110 as an additional component or may be included in an output part (not illustrated) along with the display 212a.
  • the storage 213 may include a volatile memory or a non-volatile memory.
  • the storage 213 may store information, data, programs and the like required for an operation of the electronic device 110 or the door opening/closing device 120.
  • the processor 214 may perform a control operation, described below, with reference to the information stored in the storage 213.
  • the storage 213 may also store various types of platforms.
  • the storage 213, for example, may include at least one storage medium of a flash memory type storage device, a hard disk type storage device, a multimedia card micro type storage device, a card type memory (e.g., an SD memory or an XD memory and the like), RAM, ROM (EEPROM and the like), and a USB memory.
  • the storage 213 may store various types of data (e.g., software, an application, obtained information, measured information, a control signal, and the like), and instructions in relation to data obtained or used by at least one component (e.g., a communicator 210, an input part 211, a display 212a, a speaker 212b, a storage 213, a processor 214, a camera assembly 216 and a sensor assembly 219) of the electronic device 110.
  • the storage 213 may store a control signal or data received from the door opening/closing device 120.
  • the storage 213 may store information on a preregistered user (e.g., a person coming in and out or a family member).
  • the storage 213 may store an image (e.g., a face image and the like) and a biometric information image (e.g., an image including information on veins) of the preregistered user (e.g., a person coming in and out or a family member), and information (e.g., a name, a date of birth, gender, age and the like) on each user (e.g., each family member).
  • the storage 213 may store various types of information (e.g., success in authentication, failure in authentication, a state where a door opening/closing device is being sterilized, a home address, a password for opening the door and the like) provided to a user coming in and out through the door.
  • the camera assembly 216 may include a first camera 217a, a second camera 217b, and a light part 218.
  • One or more of the cameras of the camera assembly 216 may be disposed at a position where an image of a user (or a body part of the user) coming in and out through the door is readily obtained.
  • the camera assembly 216 may include the first camera 217a (e.g., a red-green-blue (RGB) camera) capable of obtaining an image of a user despite a high illumination caused by the light part 218.
  • the camera assembly 216 may include one or more of the second cameras 217b (e.g., an RGB camera, an infrared (IR) camera and the like).
  • an artificial intelligence chip (e.g., DQ1) may be built into one or more of the cameras of the camera assembly 216. Through the artificial intelligence chip, the camera assembly 216 may recognize the face of a user and sense the presence of a user.
  • the camera assembly 216 may be a camera to which a deep learning-based algorithm operational in the artificial intelligence chip (e.g., DQ1 of LG electronics) is applied, and may provide personalized services such as registration and recognition of a user, sensing of an approach of a user, sensing of an occupant, sensing of a user returning home, self-monitoring, capturing of an image and the like.
  • the processor 214 may transmit, through the communicator 210, information (e.g., an image) collected from one or more users outside a home by one or more cameras (e.g., an image intelligence camera) provided with an artificial intelligence chip, to a server (not illustrated; e.g., the ThinQ artificial intelligence server), and the server may generate information for convenience of an occupant and then transmit the information back to the electronic device 110 or a smart mirror apparatus (not illustrated). Accordingly, the electronic device 110 may provide more convenient services to the user.
  • one or more of the first camera and the second camera may obtain an image of the user (or a body part of the user), and may deliver the obtained image to the processor 214.
  • the second camera 217b may include one or more IR cameras capable of obtaining an image of a user even in darkness (at low illumination).
  • the light part 218 may include an element (e.g., an LED, an IR pattern light) configured to emit light required for obtaining an image through one or more of the first camera 217a and the second camera 217b.
  • the camera assembly 216 may include a Compact CMOS camera demonstrator (C3D) camera.
  • the C3D camera may extract a distance (e.g., a depth) from the user according to an active stereo method.
  • the camera assembly 216 may obtain and extract depth information through the first IR camera 217b1, the IR pattern light 218b, and the second IR camera 217b2.
  • the camera assembly 216 may obtain an image of the user respectively through two or more IR cameras and may extract depth information on the user from the two obtained images. Additionally, based on the extracted depth information, the camera assembly 216 may obtain a stereoscopic image (e.g., a three-dimensional image) of the user. Additionally, the processor 214 may authenticate the user based on the obtained three-dimensional image.
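The depth extraction described above can be sketched with the standard stereo relation: a feature seen by both IR cameras appears shifted (the disparity), and depth is `focal_length * baseline / disparity`. The focal length, baseline, block size, and 1-D block-matching search below are illustrative assumptions, not values from the specification.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px=500.0, baseline_m=0.06):
    """Convert a stereo disparity (pixels) to depth (meters) using the
    pinhole stereo relation: depth = focal * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def block_match(left_row, right_row, block=3, max_disp=16):
    """Find the disparity of the center block of `left_row` by searching
    `right_row` toward the left (simplified 1-D block matching)."""
    c = len(left_row) // 2
    patch = left_row[c - block:c + block + 1]
    best_d, best_err = 0, float("inf")
    for d in range(max_disp):
        lo = c - d - block
        if lo < 0:
            break
        cand = right_row[lo:c - d + block + 1]
        err = float(np.sum((patch - cand) ** 2))
        if err < best_err:
            best_err, best_d = err, d
    return best_d
```

With the assumed 500 px focal length and 6 cm baseline, a 4-pixel disparity corresponds to a user 7.5 m away; a real active-stereo camera adds the IR pattern light so that textureless surfaces still produce matchable blocks.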
  • the camera assembly 216 may be disposed on one side (e.g., an upper side, a lower side, a left side or a right side) of the electronic device 110.
  • the electronic device 110, as described above, may extract the depth information through two or more IR cameras of the camera assembly 216 and perform authentication, thereby preventing an unauthenticated user from coming in and out through the door by presenting a flat image (e.g., a photograph) of an authenticated user.
  • the camera assembly 216 may obtain a face image of a user (or a person coming in and out) through one or more cameras, and may extract depth information from the obtained face image. Additionally, the processor 214 may generate an ID (e.g., a face ID) based on the face image (including depth information) obtained by the camera assembly 216. The processor 214, as described above, may determine that face images of various users, obtained by the camera assembly 216, differ based on the depth information.
  • the sensor assembly 219 may include one or more sensors capable of sensing a distance from a user who comes in and out of the home, movement of a user, or brightness in a place in which the door is installed and the like.
  • the sensor assembly 219 may include a distance measuring sensor 219a, a movement detecting sensor 219b and an illuminance sensor 219c.
  • the distance measuring sensor 219a may be disposed in a position (e.g., one or more of an upper, lower, left and right side of an electronic device 110) where a distance from the user may be measured.
  • the movement detecting sensor 219b may be disposed in a position (e.g., one or more of an upper, lower, left and right side of an electronic device 110) where movement of the user may be detected.
  • the illuminance sensor 219c may be disposed in a position (e.g., one or more of an upper, lower, left and right side of an electronic device 110) where brightness outside the door may be sensed.
  • the processor 214 may drive software to control one or more components (e.g., a communicator 210, an input part 211, a display 212a, a speaker 212b, a storage 213, a camera assembly 216 and a sensor assembly 219) connected to the processor 214 based on wired communication or wireless communication. Additionally, the processor 214 may process various types of data and perform calculation based on the wired communication or the wireless communication.
  • the processor 214 may load, to the storage 213, and process an instruction or data received from the communicator 210, the input part 211, the display 212a, the speaker 212b, the storage 213, the camera assembly 216, the sensor assembly 219 and the like, and may store the processed data in the storage 213.
  • the processor 214 may display the processed data through the display 212a, or output the same through the speaker 212b.
  • the processor 214 may store an identifier (e.g., a phone number) of the mobile device, and determine that the mobile device is an authenticated device. Additionally, the processor 214 may set a communication channel, through the mobile device (e.g., a smartphone) of one or more of the users (e.g., a family member), optionally (e.g., at the user's request) authenticated, and the communicator 210. Additionally, the processor 214 may receive an instruction from the mobile device remotely placed through the communication channel and perform a function corresponding to the received instruction.
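The authenticated-device check above can be sketched as a small whitelist: a mobile device is trusted only if its identifier (e.g., a phone number) was previously stored, and only trusted devices may issue remote instructions over the communication channel. The class, method names, and status strings are illustrative assumptions.

```python
class RemoteAccess:
    """Sketch of remote control via an authenticated mobile device:
    identifiers are registered once, then checked on every instruction."""

    def __init__(self):
        self.trusted = set()  # stored identifiers (e.g., phone numbers)

    def register(self, identifier):
        """Store the identifier and treat the device as authenticated."""
        self.trusted.add(identifier)

    def handle(self, identifier, instruction):
        """Perform the instruction only for an authenticated device."""
        if identifier not in self.trusted:
            return "rejected"
        return f"executed:{instruction}"
```

A production implementation would pair this with a secured channel (e.g., an encrypted session) rather than trusting the identifier alone.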
  • the processor 214 may activate the first camera 217a of the camera assembly 216 based on recognizing a user, and obtain a first image of the user through the activated first camera.
  • the processor 214 may measure a distance between a user (e.g., a person coming in and out) adjacent to the door 130 and the electronic device 110 through the distance measuring sensor 219a. When the measured distance is within a predetermined distance (e.g., 2 m), the processor 214 may determine that the user is to come in and out through the door 130. When determining the user is to come in and out through the door 130, the processor 214 may execute an operation of the smart door system 100.
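The trigger condition above (start the smart door flow when the measured distance is within 2 m) can be sketched as follows; the debounce count requiring a few consecutive in-range readings is an illustrative assumption added to avoid false triggers from a single noisy measurement.

```python
class ApproachDetector:
    """Debounced approach detection for the distance measuring sensor:
    the entry flow starts only after `needed` consecutive readings fall
    within the predetermined distance (2 m in the description above)."""

    def __init__(self, threshold_m=2.0, needed=3):
        self.threshold_m = threshold_m
        self.needed = needed
        self.streak = 0

    def update(self, distance_m):
        """Feed one distance reading; True means 'user is coming in and
        out through the door', so the smart door system should start."""
        if 0.0 < distance_m <= self.threshold_m:
            self.streak += 1
        else:
            self.streak = 0  # reading out of range resets the streak
        return self.streak >= self.needed
```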
  • the processor 214 may authenticate a user based on the obtained first image and the second image pre-stored in the storage 213.
  • the processor 214 may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through the first camera 217a (e.g., an RGB camera), compare the obtained first image and a second image (e.g., a second face image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and authenticate the user (e.g., Hong Gildong).
  • the processor 214 may extract at least one feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) from the first image, compare the extracted feature point with a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) of the second image, and perform authentication (e.g., primary authentication) of the user (e.g., Hong Gildong).
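The feature-point comparison above can be sketched as follows. As described, the measurements are taken with respect to the length of the face, which makes them scale-invariant (the same face closer to the camera gives the same ratios). The measurement names, units, and tolerance are illustrative assumptions.

```python
def extract_features(face):
    """Normalize raw measurements by face length, as in 'lengths of
    brows with respect to a length of the face'. `face` is a dict of
    raw measurements (illustrative keys)."""
    length = face["face_length"]
    return {
        "brow_length": face["brow_length"] / length,
        "brow_gap": face["brow_gap"] / length,
        "mouth_size": face["mouth_size"] / length,
        "nose_size": face["nose_size"] / length,
        "eye_size": face["eye_size"] / length,
    }

def primary_auth(first_image, second_image, tolerance=0.05):
    """Primary (face) authentication: every feature point of the live
    image must match the pre-stored image within `tolerance`."""
    a = extract_features(first_image)
    b = extract_features(second_image)
    return all(abs(a[k] - b[k]) <= tolerance for k in a)
```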
  • the processor 214 may display a message indicating success of the authentication through the display 212a or output the same as a voice through the speaker 212b.
  • the processor 214 may transmit an identifier of the user to a smart mirror apparatus (not illustrated) in a home.
  • the identifier may include various types of information such as an image of the user, the name of the user, time at which the user comes in and out, and the like.
  • the processor 214 may display a message indicating the authentication fails through the display 212a, or output the same as a voice through the speaker 212b.
  • the processor 214 may display a guide message (e.g., Stand in front of the camera.) encouraging the user to be positioned in the right position with respect to the first camera 217a through the display 212a, or output the guide message as a voice through the speaker 212b.
  • the processor 214 may perform authentication procedures with predetermined frequency (e.g., three times).
  • the processor 214 may display a message indicating the authentication cannot be performed through the display 212a. Alternatively, the processor 214 may output the message as a voice through the speaker 212b.
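The retry behavior above (repeat the procedure with a predetermined frequency, then report that authentication cannot be performed) can be sketched as a bounded loop; the `capture`/`verify` hooks and status strings are illustrative assumptions.

```python
MAX_ATTEMPTS = 3  # 'predetermined frequency (e.g., three times)'

def authenticate_with_retries(capture, verify, max_attempts=MAX_ATTEMPTS):
    """Run the authentication procedure up to `max_attempts` times and
    return a status the caller can show on the display 212a or output
    as a voice through the speaker 212b.  `capture` obtains an image;
    `verify` compares it against the pre-stored image."""
    for _ in range(max_attempts):
        if verify(capture()):
            return "success"
        # between attempts a guide message (e.g., 'Stand in front of
        # the camera.') would be displayed or spoken
    return "cannot_be_performed"
```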
  • the processor 214 may transmit a signal for controlling an operation of the door opening/closing device 120 mounted onto the door 130 to the door opening/closing device 120 through the communicator 210, based on the authentication (e.g., primary authentication) of the user.
  • the processor 214 may transmit a signal for unlocking (or locking) the door opening/closing device 120 to the door opening/closing device 120 (e.g., a communicator 221) through the communicator 210, based on a success of the authentication (e.g., primary authentication) of the user (e.g., Hong Gildong).
  • the processor 214 may transmit a user authentication (e.g., primary authentication)-based identifier (e.g., a user image, a user ID, and the like) to the smart mirror apparatus (not illustrated) in the home.
  • the processor 214 may activate the second camera 217b of the camera assembly 216. Additionally, the processor 214 may display a message indicating movement of a body part (e.g., the palm) of the user (e.g., Hong Gildong) to a position of the activated second camera 217b through the display 212a, or output the same as a voice through the speaker 212b.
  • the second camera 217b may include two IR cameras.
  • the processor 214 may display, through the display 212a, a guide line for guiding the body part to an area where the body part needs to be placed, while outputting the message.
  • the processor 214 may display the guide line through the display 212a such that the body part (e.g., the palm) of the user (e.g., Hong Gildong) is placed in a position adequate for the second camera 217b to capture an image of the body part.
  • the processor 214 may obtain a preview image of the body part through the first camera 217a, and display the preview image on the display 212a, while displaying the guide line through the display 212a.
  • the processor 214 may obtain a preview image of the body part in real time and display the preview image obtained in real time on the display 212a.
  • the processor 214 may display the preview image obtained in real time and the guide line together on the display 212a.
  • the processor 214 may calculate a gap between the displayed guide line and the displayed preview image in real time. Additionally, based on the gap calculated in real time, the processor 214 may make an alert sound at a higher volume level through the speaker 212b as the gap is larger, and make an alert sound at a lower volume level as the gap is smaller.
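The alert behavior above (louder when the gap between the guide line and the preview image is large, quieter as the body part aligns) can be sketched as a clamped linear mapping; the pixel range and volume bounds are illustrative assumptions.

```python
def alert_volume(gap_px, max_gap_px=100, min_vol=0.1, max_vol=1.0):
    """Map the guide-line/preview gap (pixels) to an alert volume in
    [min_vol, max_vol]: larger gap -> higher volume, smaller gap ->
    lower volume, clamped at the ends."""
    ratio = min(max(gap_px, 0), max_gap_px) / max_gap_px
    return min_vol + (max_vol - min_vol) * ratio
```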
  • the processor 214 may display visual information, depending on whether the preview image and the displayed guide line are matched, through the display 212a.
  • the visual information may include information (e.g., a color, an arrow, a gradation effect and the like) that encourages the user to move the body part at a position corresponding to the guide line.
  • the user may move the body part based on the visual information such that the second camera 217b easily obtains an image of the body part (e.g., the palm).
  • the second camera 217b may include two or more IR cameras.
  • the processor 214 may obtain information on veins of the body part (e.g., the palm) through the second camera 217b.
  • the processor 214 may obtain an image (e.g., two or more images) of the body part of the user respectively through the second camera 217b (e.g., two or more IR cameras), and extract depth information on the user using the two obtained images.
  • the processor 214 may obtain a stereoscopic image (e.g., a three-dimensional image) of the user based on the extracted depth information.
  • the processor 214 may also obtain information on veins (e.g., a thickness, a direction and the like of a vein) of the body part (e.g., the palm) through the second camera 217b.
  • the processor 214 may perform authentication (e.g., secondary authentication) of the user based on a third image obtained by the second camera 217b and a fourth image pre-stored in the storage 213.
  • the processor 214 may obtain the third image (e.g., a first palm image) of the user (e.g., Hong Gildong) through the second camera 217b (e.g., two or more IR cameras), compare the obtained third image and the fourth image (e.g., a second palm image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and perform authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong).
  • the processor 214 may extract a feature point (e.g., a thickness, a direction, a shape, a flexural degree and the like of a vein) from the third image, compare the extracted feature point with a feature point (e.g., a thickness, a direction, a shape, a flexural degree and the like of a vein) of the fourth image, and perform authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong).
  • the processor 214 may transmit an identifier of the user to a smart mirror apparatus (not illustrated) in a home.
  • the identifier may include various types of information such as an image of the user, the name of the user, time at which the user comes in and out, and the like.
  • the processor 214 may transmit a signal for controlling an operation of the door opening/closing device 120 installed in the door 130 to the door opening/closing device 120 through the communicator 210, based on the authentication (e.g., secondary authentication) of the user.
  • the processor 214 may transmit a signal for unlocking (or locking) the door opening/closing device 120 to the door opening/closing device 120 (e.g., a communicator 221) through the communicator 210, based on a success of the authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong).
  • the processor 214 may identify whether a signal in relation to closing of the door 130 is received from the door opening/closing device 120 through the communicator 210.
  • the door 130 may be opened and closed.
  • the sensor assembly 233 of the door opening/closing device 120 may sense the opening and closing of the door, and the door opening/closing device 120 may transmit a signal in relation to opening and closing of the door to the electronic device 110 through the communicator 221.
  • the electronic device 110 having received the signal of opening and closing of the door may sense the opening and closing of the door 130.
  • the processor 214 may transmit a signal for controlling sterilization of the door opening/closing device 120 to the door opening/closing device 120 through the communicator 210, based on receipt of the signal in relation to closing of the door 130.
  • the processor 214 may identify the opening and/or closing of the door 130 based on the signal received from the door opening/closing device 120.
  • the processor 214 may transmit a control signal for sterilizing the door opening/closing device 120 (e.g., a handle 230) to the door opening/closing device 120.
  • the door opening/closing device 120 (e.g., a processor 226) itself may sterilize the handle 230 through a UVC sterilizer 231.
  • the processor 214 may identify brightness around one or more of the electronic device 110 and the door opening/closing device 120 through the illuminance sensor 219c.
  • the processor 214 may select and operate one or more of the first camera 217a and the second camera 217b, based on the sensed surrounding brightness. For example, when the brightness is a reference value or greater, the processor 214 may operate the first camera 217a, and when the brightness is less than the reference value, the processor 214 may operate the second camera 217b. Alternatively, the processor 214 may operate the first camera 217a and the second camera 217b to obtain an image of a user regardless of brightness.
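The camera selection described above can be sketched as follows; the lux threshold is an illustrative assumption standing in for the unspecified "reference value".

```python
def select_cameras(illuminance_lux, threshold_lux=50.0, both=False):
    """Choose which camera(s) to activate from the ambient brightness
    reported by the illuminance sensor 219c: the RGB camera at or above
    the reference value, the IR camera below it, or both cameras
    regardless of brightness when `both` is set."""
    if both:
        return ["rgb", "ir"]
    return ["rgb"] if illuminance_lux >= threshold_lux else ["ir"]
```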
  • the processor 214 may display one or more operation states of the electronic device 110, one or more operation states of the door opening/closing device 120, and at least one piece of information on the opening and closing of the door 130 on the display 212a.
  • the processor 214 may display various types of information on access through the smart door such as sensing of a user, an operation state of the cameras, whether to obtain an image, whether to perform authentication, whether to control opening and closing of the door and the like, on the display 212a.
  • the door opening/closing device 120 may include a body 220 and a handle 230.
  • the body 220 may include a communicator 221, a speaker 222, a battery 223, a motor 224 and a lock part 225 and a processor 226.
  • the handle 230 may include a UVC sterilizer 231, a light emitter 232, and a sensor assembly 233.
  • FIG. 2 shows components included in the body 220 and the handle 230 of the door opening/closing device 120 according to one embodiment; the components are not limited to those of the embodiment in FIG. 2. When necessary, some components may be added, modified or removed.
  • the communicator 221 may include one or more circuits capable of transmitting one or more signals or at least one piece of information to one or more components (e.g., a speaker 222, a battery 223, a motor 224, a lock part 225, a processor 226, a UVC sterilizer 231, a light emitter 232, and a sensor assembly 233) included in the door opening/closing device 120 and receiving the same from one or more of the components, based on wired communication or wireless communication.
  • the communicator 221 may include one or more circuits capable of transmitting one or more signals or at least one piece of information to components (e.g., an input part 211, a display 212a, a speaker 212b, a storage 213, a processor 214, a camera assembly 216 and a sensor assembly 219) included in the electronic device 110 and receiving the same from the components, based on wired communication or wireless communication.
  • the communicator 221 may also receive a signal or data from various types of external devices or transmit the same to various types of external devices.
  • the speaker 222 may change a sound into an electric signal or vice versa.
  • the speaker 222 may output a sound through an acoustic output device (e.g., ear buds or headphones).
  • the speaker 222 may output an acoustic signal to the outside of the door opening/closing device 120.
  • the speaker 222 may output information provided to a user coming in and out through the door as a voice.
  • the speaker 222 may be included in the door opening/closing device 120 or disposed as an additional component.
  • the battery 223 may supply power to one or more components of the door opening/closing device 120.
  • the battery 223, for example, may include a primary battery that is not rechargeable and a secondary battery that is rechargeable.
  • the battery 223 may include a USB terminal (not illustrated) configured to receive power from an external power source or to supply power to the door opening/closing device 120, on one side thereof.
  • the motor 224 may control movement of the handle 230 such that the handle 230 is spaced from the body 220.
  • the processor 226 may control an operation of the handle 230 such that the handle 230 separates from the body 220 while rotating clockwise or counterclockwise through the motor 224.
  • the handle 230 may separate from the body 220 to allow the user to easily grip the handle 230 under the control of the motor 224.
  • a member (not illustrated) for preventing opening of the door 130 may protrude toward a frame of the door 130, to prevent the opening of the door 130.
  • the lock part 225 may be disposed in the frame, and the member (not illustrated) for preventing the opening of the door 130 may protrude toward the door to prevent the opening of the door 130.
  • the UVC sterilizer 231 may radiate ultraviolet rays having short wavelengths (e.g., about 100 nm to 280 nm) to sterilize (or disinfect) the door opening/closing device 120.
  • the UVC sterilizer 231 may sterilize (or disinfect) the handle 230 of the door opening/closing device 120.
  • the UVC sterilizer 231 may be disposed on one side of the door opening/closing device 120.
  • the UVC sterilizer 231 may be disposed in a portion where the body 220 contacts the handle 230, to sterilize a portion of the handle 230 of the door opening/closing device 120, gripped by the user.
  • the light emitter 232 may include at least one light emitting element (e.g., a light emitting diode (LED)) emitting light in different colors.
  • the light emitter 232 may emit light under the control of the processor 226 to visually show an operation state of the body 220 and/or the handle 230.
  • the light emitter 232 may emit light in different colors depending on different operations of the body 220 and/or the handle 230.
  • an LED disposed on one side (e.g., an edge) of the handle 230 may emit a predetermined color (e.g., white) of light under the control of the processor 226.
  • the LED on one side (e.g., an edge) of the handle 230 may emit a predetermined color (e.g., violet) of light.
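The per-state color selection described above can be sketched as a simple lookup. The specification only gives white and violet as example colors; which state maps to which color, and the remaining entries, are illustrative assumptions.

```python
# Illustrative mapping from door/handle operation state to LED color.
LED_COLORS = {
    "idle": "white",          # example color given in the description
    "sterilizing": "violet",  # example color given in the description
    "unlocked": "green",      # assumption
    "locked": "red",          # assumption
}

def led_color(state):
    """Return the color the handle-edge LED should emit for `state`,
    falling back to white for unknown states."""
    return LED_COLORS.get(state, "white")
```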
  • the sensor assembly 233 may include one or more sensors capable of sensing a distance from a user trying to come in and out of a home, movement of the user or brightness in a place where the door is installed and the like.
  • the sensor assembly 233 may include a distance measuring sensor, a movement detecting sensor, and an illuminance sensor.
  • the sensor assembly 233 may include a proximity sensor 233a.
  • the sensor assembly 233 (e.g., a proximity sensor 233a) may be disposed in a gap between the body 220 and the handle 230 to protect the hand of the user trying to grip the handle 230.
  • when an object (e.g., the hand of the user) is present in the gap, the sensor assembly 233 may sense the object and provide a signal based on the sensing of the object to the processor 226.
  • the processor 226 may stop an operation of the door opening/closing device 120 temporarily.
  • the processor 226 may drive software to control one or more components (e.g., a communicator 221, a speaker 222, a battery 223, a motor 224, a lock part 225, a UVC sterilizer 231, a light emitter 232, and a sensor assembly 233) connected to the processor 226, based on wired communication or wireless communication. Additionally, the processor 226 may process various types of data and perform calculation based on the wired communication or wireless communication.
  • the processor 226 may process instructions or data received from the communicator 221, the speaker 222, the battery 223, the motor 224, the lock part 225, the UVC sterilizer 231, the light emitter 232, the sensor assembly 233 and the like, and transmit the processed data to the electronic device 110.
  • the processor 226 may transmit the processed data to the electronic device 110, or output the processed data through the speaker 222.
  • the processor 226 may receive a control signal for controlling the door opening/closing device 120 through the communicator 221 from the electronic device 110 disposed near the door 130.
  • the control signal may include various types of signals for controlling operations (e.g., opening or closing of the door) of the door opening/closing device 120.
  • the control signal may include various types of signals for controlling operations (e.g., opening or closing of the handle) of the handle 230 of the door opening/closing device 120.
  • the processor 226 may unlock (or lock) the door opening/closing device 120 based on the received control signal.
  • the processor 226 may obtain the signal, transmitted through the communicator 210 of the electronic device 110, through the communicator 221, and, based on the obtained signal, control the locking or unlocking of the door opening/closing device 120.
  • the communicator 210 of the electronic device 110 and the communicator 221 of the door opening/closing device 120 may connect in a wired or wireless manner.
  • the processor 226 may control the motor 224 based on the received control signal such that the handle 230 protrudes from the body 220 while rotating.
  • the processor 226 may obtain a signal in relation to closing or opening of the door 130 through the sensor assembly 233 (e.g., an opening/closing detecting sensor (not illustrated)), and identify the closing or opening of the door 130.
  • the processor 226 may activate the lock part 225 to lock the door opening/closing device 120.
  • the processor 226 may sterilize the door opening/closing device 120 (e.g., a handle 230) through the light emitter 232 (e.g., at least one ultraviolet-C light emitting diode (UVC LED)).
  • the handle 230 may include at least one UVC LED, and the at least one UVC LED may be disposed at a position adequate to disinfect (e.g., sterilize) an area where the hand of the user gripping the handle 230 is positioned.
  • the processor 226 may control at least one light emitting element such that the at least one light emitting element emits light in different colors, based on the opening and closing of the door 130.
  • the processor 226 may allow the at least one light emitting element to emit light in different colors based on the opening or closing of the door 130.
  • the processor 226 may drive a timer 227 and, unless the user grips the handle 230 within a predetermined period (e.g., 10 seconds), output a message through the speaker 222 indicating that the handle 230 is not gripped.
  • the processor 226 may allow the light emitter 232 (e.g., an LED) to emit light indicating that the handle 230 is not gripped.
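The grip timeout above can be sketched as a check against a monotonic clock: once the predetermined period has elapsed without a grip, the processor outputs the "not gripped" message and matching LED indication. Using seconds from a monotonic clock is an illustrative assumption.

```python
GRIP_TIMEOUT_S = 10.0  # 'predetermined period (e.g., 10 seconds)'

def grip_timeout_elapsed(started_at_s, now_s, timeout_s=GRIP_TIMEOUT_S):
    """True once the predetermined period has passed without the user
    gripping the handle 230, i.e. the processor 226 should output the
    'handle is not gripped' message and light the corresponding LED.
    Timestamps are seconds from a monotonic clock (e.g. time.monotonic)."""
    return (now_s - started_at_s) >= timeout_s
```

In firmware this would typically be polled on each loop iteration, or implemented with a hardware timer interrupt, with the timer cancelled as soon as the proximity sensor reports a grip.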
  • FIG. 3 is a flow chart showing a process of authenticating a user in a smart door system according to one embodiment.
  • the electronic device 110 may sense an approaching user (S310).
  • the electronic device 110 (e.g., a processor 214) may sense whether a user approaches the door 130.
  • the electronic device 110 (e.g., a processor 214) may sense a user (e.g., a person coming in and out) approaching the door 130 through the sensor assembly 219 (e.g., a distance measuring sensor 219a or the movement detecting sensor 219b).
  • the electronic device 110 may obtain a first image of the user (S312).
  • the electronic device 110 (e.g., a processor 214) may obtain a first image of a body part (e.g., the face) of the user (e.g., Hong Gildong) approaching the door 130 through one or more of the cameras 217a, 217b of the camera assembly 216 disposed on one surface (e.g., a front surface) of the electronic device 110.
  • the electronic device 110 may compare the obtained first image with a pre-stored second image to perform authentication of the user (S314).
  • the electronic device 110 (e.g., a processor 214) may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through a first camera 217a (e.g., an RGB camera), compare the obtained first image with a second image (e.g., a second face image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and perform authentication of the user (e.g., Hong Gildong).
  • the electronic device 110 may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through a second camera 217b (e.g., at least one IR camera), compare the obtained first image with a second image (e.g., a second face image) of the user (e.g., Hong Gildong), pre-stored in the storage 213, and perform authentication of the user (e.g., Hong Gildong).
  • the electronic device 110 may identify success in authentication (S316).
  • the electronic device 110 (e.g., a processor 214) may extract a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) from the first image, compare the extracted feature point with a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) of the second image, perform authentication (e.g., primary authentication) of the user (e.g., Hong Gildong), and determine whether the authentication succeeds.
  • the electronic device 110 may determine that the two feature points are the same when a difference between the feature points is within a predetermined range (e.g., 5 mm).
  • the predetermined range may be variably adjusted.
  • the electronic device 110 may output a message indicating failure in authentication (S318).
  • the electronic device 110 (e.g., a processor 214) may determine that the first image and the second image differ and identify failure in authentication, when a difference between the feature points is out of the predetermined range (e.g., 5 mm).
  • the electronic device 110 (e.g., a processor 214) may display a message indicating the failure in the authentication through the display 212a, or output the message as a voice through the speaker 212b.
  • the electronic device 110 may transmit a signal for controlling an operation of the door opening/closing device 120 to the door opening/closing device 120 (S320).
  • the electronic device 110 (e.g., a processor 214) may transmit a signal for controlling an operation of the door opening/closing device 120 to the door opening/closing device 120 (e.g., a communicator 221).
  • the electronic device 110 (e.g., a processor 214) may transmit different signals to the door opening/closing device 120 (e.g., a communicator 221) depending on whether the authentication (e.g., primary authentication) succeeds.
  • the door opening/closing device 120 may unlock the door and open the handle (S322).
  • the door opening/closing device 120 (e.g., a processor 226) may control and open the lock part 225 and start to open the handle 230 when receiving a signal indicating that the authentication (e.g., primary authentication) succeeds through the communicator 221 from the electronic device 110 (e.g., a communicator 210).
  • the door opening/closing device 120 (e.g., a processor 226) may drive the motor 224 such that the handle 230 protrudes from the body 220 while rotating clockwise or counterclockwise.
  • the door opening/closing device 120 may be kept locked by the lock part 225, when receiving a signal indicating that the authentication (e.g., primary authentication) fails through the communicator 221 from the electronic device 110 (e.g., a communicator 210).
  • the door opening/closing device 120 may sense the closing of the door (S324).
  • the door opening/closing device 120 (e.g., a processor 226) may sense that the door 130 is closed after the door 130 is opened.
  • the door opening/closing device 120 (e.g., a processor 226) may receive a signal in relation to closing of the door 130 from the electronic device 110 and determine the closing of the door 130.
  • the door opening/closing device 120 may sterilize the handle (S326).
  • the door opening/closing device 120 (e.g., a processor 226) may sterilize the door opening/closing device 120 (e.g., a handle 230) through the UVC sterilizer 231.
  • the door opening/closing device 120 (e.g., a processor 226) may receive a signal based on the opening and closing of the door 130 from the electronic device 110, and, based on the received signal, sterilize the door opening/closing device 120 (e.g., a handle 230) through the UVC sterilizer 231.
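  • The door-side sequence above (unlock and open the handle in S322, sense the closing of the door in S324, sterilize the handle in S326) can be sketched as a small state machine. The class and method names below are illustrative assumptions for this sketch, not part of the disclosed device.

```python
class DoorDevice:
    """Illustrative sketch of the door opening/closing device 120 flow:
    unlock and extend the handle on successful authentication, keep the
    lock engaged on failure, and sterilize the handle once the door closes."""

    def __init__(self):
        self.locked = True
        self.handle_out = False
        self.sterilizing = False

    def on_auth_signal(self, success: bool):
        # S322: open the lock part and drive the motor so the handle
        # escapes from the body; on a failure signal the door stays locked.
        if success:
            self.locked = False
            self.handle_out = True

    def on_door_closed(self):
        # S324 -> S326: once closing is sensed, re-engage the lock,
        # retract the handle, and run the UVC sterilizer over the handle.
        self.locked = True
        self.handle_out = False
        self.sterilizing = True


device = DoorDevice()
device.on_auth_signal(success=True)
print(device.locked, device.handle_out)   # False True
device.on_door_closed()
print(device.sterilizing)                 # True
```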
  • FIG. 4 is a flow chart showing a process of authenticating a user in a smart door system according to another embodiment.
  • the electronic device 110 may sense a user approaching the door (S410).
  • the electronic device 110 (e.g., a processor 214) may sense whether a user approaches the door 130.
  • the electronic device 110 (e.g., a processor 214) may sense a user (e.g., a person coming in and out) approaching the door 130 through the sensor assembly 219 (e.g., one or more of a distance measuring sensor 219a and a movement detecting sensor 219b).
  • the electronic device 110 may obtain a first image of the user through a first camera 217a (S412).
  • the electronic device 110 (e.g., a processor 214) may obtain an image of a body part (e.g., the face) of the user (e.g., Hong Gildong) approaching the door 130 through at least one camera (e.g., an RGB camera, and one or more IR cameras) of the camera assembly 216 disposed on the front surface of the electronic device 110.
  • the electronic device 110 may compare the obtained first image with a pre-stored second image and perform primary authentication of the user (S414).
  • the electronic device 110 (e.g., a processor 214) may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through a first camera 217a (e.g., an RGB camera), compare the obtained first image with a second image (e.g., a second face image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and perform authentication of the user (e.g., Hong Gildong).
  • the electronic device 110 may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through a second camera 217b (e.g., one or more IR cameras), compare the obtained first image with a second image (e.g., a second face image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and determine whether an authentication (e.g., primary authentication) of the user (e.g., Hong Gildong) is performed based on the results of the comparison.
  • the electronic device 110 may identify success in the primary authentication (S416).
  • the electronic device 110 (e.g., a processor 214) may extract a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) from the first image, and compare the extracted feature point with a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) of the second image.
  • the electronic device 110 may determine that the two feature points are the same when a difference between the feature points is within a predetermined range (e.g., 5 mm), and determine that the authentication of the user succeeded.
  • the predetermined range may be variably adjusted.
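  • The comparison above can be illustrated as a tolerance check over extracted facial measurements; the feature names and the helper function below are assumptions for illustration only, not the disclosed matching algorithm.

```python
def primary_auth(first_features, second_features, tolerance_mm=5.0):
    """Return True when every corresponding facial measurement (e.g.,
    brow lengths, distance between the brows, sizes of the mouth, nose,
    and eyes) differs by no more than the predetermined range
    (e.g., 5 mm). The tolerance is a parameter because the text notes
    the range may be variably adjusted."""
    return all(
        abs(first_features[key] - second_features[key]) <= tolerance_mm
        for key in second_features
    )


# Hypothetical measurements (mm) for a stored second image and two
# candidate first images captured at the door.
stored = {"brow_len": 52.0, "brow_gap": 31.0, "mouth": 48.0}
live_ok = {"brow_len": 54.0, "brow_gap": 29.5, "mouth": 50.0}
live_bad = {"brow_len": 60.0, "brow_gap": 29.5, "mouth": 50.0}
print(primary_auth(live_ok, stored))   # True  (all diffs within 5 mm)
print(primary_auth(live_bad, stored))  # False (brow_len differs by 8 mm)
```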
  • the electronic device 110 may output a message indicating failure in authentication (S418).
  • the electronic device 110 (e.g., a processor 214) may determine that the first image and the second image differ, and identify that the authentication failed, when a difference between the feature points is out of the predetermined range (e.g., 5 mm).
  • the electronic device 110 (e.g., a processor 214) may display a message indicating the failure in the authentication through the display 212a, or output the message as a voice through the speaker 212b.
  • the electronic device 110 may activate the second camera 217b and display a guide message (S420).
  • the electronic device 110 (e.g., a processor 214) may compare the feature point of the first image and the feature point of the second image, and then when determining that the authentication (e.g., primary authentication) succeeded, activate one or more of the IR cameras 217b to perform additional authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong).
  • the electronic device 110 may display a guide message (e.g., Spread your palm before the camera.) encouraging the user (e.g., Hong Gildong) to place a body part (e.g., the palm) in the right position with respect to the second camera 217b through the display 212a, or output the guide message as a voice through the speaker 212b.
  • the electronic device 110 may obtain a third image (S422).
  • the electronic device 110 (e.g., a processor 214) may obtain a third image of a body part (e.g., the palm) of the user (e.g., Hong Gildong).
  • the obtained third image may include information on veins (e.g., a thickness, direction and the like of the veins) in the palm of the user (e.g., Hong Gildong).
  • the electronic device 110 (e.g., a processor 214) may reoutput the guide message (e.g., Spread your palm before the camera.).
  • the electronic device 110 may recapture an image of a body part (e.g., the palm) of the user (e.g., Hong Gildong) through the second camera 217b.
  • the electronic device 110 may compare the obtained third image and a pre-stored fourth image, and perform secondary authentication of the user (S424).
  • the electronic device 110 (e.g., a processor 214) may obtain a third image (e.g., a first palm image) of the user (e.g., Hong Gildong) through the second camera 217b (e.g., one or more IR cameras), compare the obtained third image and a fourth image (e.g., a second palm image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and perform authentication of the user (e.g., Hong Gildong).
  • the electronic device 110 may identify success in the secondary authentication (S426).
  • the electronic device 110 (e.g., a processor 214) may extract a feature point (e.g., a length of the veins, a distance between the veins, a thickness of the veins and the like) from the third image, and compare the extracted feature point and a feature point (e.g., a length of the veins, a distance between the veins, a thickness of the veins and the like) of the fourth image.
  • the electronic device 110 may determine that the two feature points are the same and that the authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong) succeeded, when a difference between the feature points is within a predetermined range (e.g., 2 mm).
  • the electronic device 110 may transmit a signal for controlling an operation of the door opening/closing device to the door opening/closing device (S428).
  • the electronic device 110 (e.g., a processor 214) may transmit a signal for controlling an operation of the door opening/closing device 120 to the door opening/closing device 120 (e.g., a communicator 221).
  • the electronic device 110 (e.g., a processor 214) may transmit different signals to the door opening/closing device 120 (e.g., a communicator 221) depending on whether the authentication (e.g., secondary authentication) succeeds.
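  • The two-stage flow (face comparison within a range such as 5 mm, then palm-vein comparison within a range such as 2 mm) ultimately reduces to deciding which control signal to transmit to the door opening/closing device 120. A minimal sketch, with hypothetical function and signal names:

```python
def decide_door_signal(face_diff_mm, vein_diff_mm,
                       face_tol_mm=5.0, vein_tol_mm=2.0):
    """Map the primary (face, e.g., 5 mm range) and secondary (palm vein,
    e.g., 2 mm range) feature-point differences to the signal transmitted
    to the door opening/closing device 120: unlock only when both the
    primary and the secondary authentication succeed."""
    primary_ok = face_diff_mm <= face_tol_mm
    secondary_ok = vein_diff_mm <= vein_tol_mm
    return "unlock" if (primary_ok and secondary_ok) else "keep_locked"


print(decide_door_signal(3.0, 1.0))  # unlock
print(decide_door_signal(3.0, 4.0))  # keep_locked (vein diff too large)
```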
  • the door opening/closing device 120 may unlock the door and open the handle (S430).
  • the door opening/closing device 120 may control and open the lock part 225 and start to open the handle 230.
  • the door opening/closing device 120 (e.g., a processor 226) may drive the motor 224 such that the handle 230 escapes from the body 220 while rotating.
  • the door opening/closing device 120 may be kept locked by the lock part 225.
  • the door opening/closing device 120 may sense the closing of the door (S432).
  • the door opening/closing device 120 (e.g., a processor 226) may sense the opening and closing of the door 130.
  • the door opening/closing device 120 (e.g., a processor 226) may receive a signal in relation to closing of the door 130 from the electronic device 110, and determine that the door 130 is closed.
  • the door opening/closing device 120 may sterilize the handle (S434).
  • the door opening/closing device 120 (e.g., a processor 226) may sterilize the door opening/closing device 120 (e.g., a handle 230) through the UVC sterilizer 231.
  • the door opening/closing device 120 (e.g., a processor 226) may receive a signal based on the opening and closing of the door 130 from the electronic device 110, and, based on the received signal, sterilize the door opening/closing device 120 (e.g., a handle 230) through the UVC sterilizer 231.
  • FIG. 5a is an exemplary view showing a screen on a display when an electronic device according to one embodiment does not sense an approach of a user.
  • FIG. 5b is an exemplary view showing a user approaching an electronic device according to one embodiment.
  • FIG. 5c is an exemplary view showing a screen on a display of an electronic device sensing an approach of a user according to one embodiment.
  • the electronic device 110 may configure a screen 510 including various types of information and display the screen 510 on the display 212a.
  • the screen 510 may include a first area 511 and a second area 512.
  • the electronic device 110 may display, in the first area 511 of the screen 510, one or more of a state message (e.g., Do not disturb.) set by a user, a message (e.g., Ring the door bell.) that is displayed when face recognition continues to fail with predetermined frequency (e.g., three times) or more, a message (e.g., Door opening/closing) in relation to the closing or opening of the door 130, and a message (e.g., In security mode) indicating an operation in a security mode.
  • the electronic device 110 may display position information (e.g., unit No. 1302) which is set by the user, and a security mode icon in case a security mode is set, in the second area 512 of the screen 510.
  • one or more of the camera assembly 216 and the sensor assembly 219 of the electronic device 110 may be disposed in a third area (e.g., an upper side, a lower side, a left side, or a right side) of the electronic device 110.
  • the first area 511 and the second area 512 may be areas on the display 212a, and the third area 513 may be an area on one side (e.g., the lower side) of the electronic device 110.
  • the camera assembly 216 of the electronic device 110 may be disposed on the lower side of the electronic device 110.
  • the camera assembly 216 may include a compact CMOS camera demonstrator (C3D) camera.
  • the C3D camera may extract a distance (e.g., a depth) from a user based on an active stereo method. For example, if the camera assembly 216 includes a C3D camera, an LED light 218a, a first IR camera 217b1, an RGB camera 217a, an IR pattern light 218b, and a second IR camera 217b2 may be consecutively disposed in the third area 513 from left to right. The disposition is described as an example, and, apparently, may vary in the present disclosure. The camera assembly 216 may obtain and extract depth information through the first IR camera 217b1, the IR pattern light 218b, and the second IR camera 217b2.
  • the first IR camera 217b1, which is used as a reference camera in extracting depth information from an image obtained from a user, may be disposed near the RGB camera 217a to facilitate matching with an image obtained by the RGB camera 217a, for example.
  • the IR pattern light 218b may be disposed between the two IR cameras 217b1, 217b2 to ensure depth performance, and may emit light when the two IR cameras 217b1, 217b2 obtain an image, for example. Additionally, a distance between the two IR cameras 217b1, 217b2 may be determined considering the depth performance (e.g., a distance resolution capability).
  • the two IR cameras 217b1, 217b2 are spaced apart from each other by a predetermined distance (e.g., 50 mm), and the two IR cameras 217b1, 217b2 may obtain high-quality depth results in a range of 30 cm to 2 m.
  • the predetermined distance may be variably adjusted.
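  • With two IR cameras separated by a fixed baseline (e.g., 50 mm), active stereo recovers depth from the disparity of the projected IR pattern between the two images via the standard relation Z = f·B/d. The focal length and disparity values below are illustrative assumptions, not figures from the disclosure.

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Classic stereo depth: Z = f * B / d.
    focal_px     - focal length in pixels (camera intrinsic)
    baseline_mm  - distance between the two IR cameras (e.g., 50 mm)
    disparity_px - horizontal shift of the same IR-pattern feature
                   between the first and second IR camera images."""
    if disparity_px <= 0:
        raise ValueError("feature not matched or at infinity")
    return focal_px * baseline_mm / disparity_px


# A nearer feature produces a larger disparity, hence a smaller depth.
near = depth_from_disparity(focal_px=700, baseline_mm=50, disparity_px=70)
far = depth_from_disparity(focal_px=700, baseline_mm=50, disparity_px=25)
print(near, far)  # 500.0 1400.0 (both within the 30 cm to 2 m range)
```

A longer baseline increases disparity for the same depth, which is why the text notes the inter-camera distance is chosen for distance resolution capability.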
  • the third area 513 may further include a distance measuring sensor 219a, a movement detecting sensor 219b, an illuminance sensor 219c, and a speaker 212b.
  • the distance measuring sensor 219a, the movement detecting sensor 219b, the illuminance sensor 219c, and the speaker 212b may be disposed in an area different from the third area 513 on the front surface of the electronic device 110.
  • the electronic device 110 may configure the screen 530 in FIG. 5c and display the screen 530 on the display 212a.
  • the screen 530 may include a first icon 531, a first area 532, a second area 533, and a second icon 534.
  • the first icon 531 (e.g., a lock) may denote success/failure in authentication in relation to face recognition. For example, when the authentication fails three times or more, the first icon 531 may be kept displayed. For example, after the authentication fails five times, the first icon 531 may not be displayed for 30 seconds. Additionally, when the first icon 531 is pressed after 30 seconds, a reauthentication procedure may be performed.
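  • The retry behavior described for the first icon 531 (kept displayed after three failures, hidden for 30 seconds after five, reauthentication available afterwards) can be sketched as a failure counter. The thresholds mirror the examples in the text; the class and state names are assumptions for illustration.

```python
class AuthRetryTracker:
    """Track consecutive face-recognition failures and derive the icon
    state: shown persistently after 3 failures, hidden during a
    30-second cooldown after 5 failures, then reauthentication allowed."""

    SHOW_THRESHOLD = 3
    LOCKOUT_THRESHOLD = 5
    LOCKOUT_SECONDS = 30

    def __init__(self):
        self.failures = 0

    def record_failure(self):
        self.failures += 1

    def record_success(self):
        self.failures = 0

    def icon_state(self, seconds_since_last_failure=0):
        if self.failures >= self.LOCKOUT_THRESHOLD:
            if seconds_since_last_failure < self.LOCKOUT_SECONDS:
                return "hidden"           # cooldown in progress
            return "reauth_available"     # pressing the icon restarts auth
        if self.failures >= self.SHOW_THRESHOLD:
            return "shown"
        return "default"


tracker = AuthRetryTracker()
for _ in range(3):
    tracker.record_failure()
print(tracker.icon_state())                               # shown
tracker.record_failure()
tracker.record_failure()
print(tracker.icon_state(seconds_since_last_failure=10))  # hidden
print(tracker.icon_state(seconds_since_last_failure=31))  # reauth_available
```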
  • various types of information, such as a guide message (e.g., Welcome., Your face will be captured., and the like), may be displayed in the first area 532 and the second area 533.
  • the second icon 534 may perform the function of a door bell.
  • the electronic device 110 may be configured to output a sound indicating a visit through a speaker (not illustrated) in a home.
  • FIG. 6a is an exemplary view showing a process of performing authentication using an image obtained by an electronic device according to one embodiment from a user.
  • FIG. 6b is an exemplary view indicating a result that the electronic device according to one embodiment performs an authentication of a user by using an image obtained from a user.
  • the electronic device may configure a screen 610 for performing authentication by using an image obtained from a user and display the configured screen 610 through the display 212a.
  • the screen 610 may include a first area 611, a second area 612, a third area 613, and a settings menu 614.
  • when an image (e.g., a first image) of a body part (e.g., the face) of a user 520 is obtained, the electronic device 110 (e.g., a processor 214) may compare the obtained first image and an image (e.g., a second image) pre-stored in the storage 213 and perform authentication (e.g., primary authentication).
  • the electronic device 110 (e.g., a processor 214) may display a rate (e.g., 50 % of authentication performed) at which an authentication procedure is currently being performed in the first area 611 of the screen 610.
  • the electronic device 110 may display a message indicating failure in the authentication (e.g., primary authentication) in the first area 611.
  • the electronic device 110 (e.g., a processor 214) may display information on closing/opening of the door 130 in the first area 611.
  • the electronic device 110 may display an authentication guide image or various types of information in relation to authentication in a second area 612.
  • the electronic device 110 (e.g., a processor 214) may display a guide in the second area 612 for a user unfamiliar with the smart door system 100.
  • the electronic device 110 (e.g., a processor 214) may display a preview image obtained from a user or information on an authentication process in the second area 612.
  • the electronic device 110 may display various types of guide phrases in relation to a current authentication procedure in a third area 613.
  • when sensing that the settings menu 614 is selected, the electronic device 110 (e.g., a processor 214) may go into a setting mode in relation to the electronic device 110. Instructions in relation to one or more functions of the electronic device 110, such as a change in a password, storage or renewal of a user's image and the like, may be input in the setting mode.
  • the electronic device may configure a screen 620 where authentication is performed by using an image obtained from a user, and display the screen through the display 212a.
  • the screen 620 may include a first area 621, a second area 622, a third area 623, and a settings menu 614.
  • when authentication (e.g., primary authentication) is completed, the electronic device 110 (e.g., a processor 214) may display a message (e.g., Authentication completed), indicating that the authentication (e.g., primary authentication) is completed, in the first area 621.
  • the electronic device 110 may display an image (e.g., an image preset by a user) of an authenticated user (e.g., an authenticated person who comes in and out) in the second area 622.
  • the electronic device 110 (e.g., a processor 214) may display an authentication reattempt button (not illustrated) or an authentication cancelation button (not illustrated) in the second area 622.
  • the electronic device 110 (e.g., a processor 214) may display a profile image of a user in the second area 622.
  • the electronic device 110 may display information indicating that predetermined time (e.g., 30 seconds) is being counted in the second area 622.
  • the electronic device 110 may display various types of information in relation to success in the authentication, such as the name of a user (e.g., Hong Gildong), a welcome message (e.g., Welcome.) and the like, in the third area 623.
  • FIG. 7a is an exemplary view showing a screen where an electronic device according to one embodiment performs authentication using a face image obtained from a user.
  • FIG. 7b is an exemplary view showing a result of primary authentication performed by an electronic device according to one embodiment using an image obtained from a user.
  • FIG. 7c is an exemplary view where an electronic device according to one embodiment obtains an image of a palm of a user.
  • FIG. 7d is an exemplary view showing a preview image of a palm of a user obtained by an electronic device according to one embodiment.
  • FIG. 7e is an exemplary view showing a result of secondary authentication performed by an electronic device according to one embodiment.
  • the electronic device 110 may configure a screen 710 performing an authentication by using a face image obtained from a user, and may display the configured screen 710 through the display 212a.
  • the screen 710 may include a first area 711, a second area 712, a third area 713 and an icon 714.
  • Information displayed in the first area 711 may be identical or similar to information displayed in the first area 611 of the screen 610 in FIG. 6a.
  • the electronic device 110 may display an authentication guide image or information in the second area 712.
  • the electronic device 110 (e.g., a processor 214) may display a guide for a user unfamiliar with the smart door system 100 in the second area 712.
  • the electronic device 110 (e.g., a processor 214) may display a preview image 712a obtained from the user 520 and information in relation to an authentication procedure in the second area 712.
  • the electronic device 110 may display various types of guiding phrases in relation to current authentication in the third area 713.
  • the icon 714 may perform a function as a door bell.
  • the electronic device 110 may configure a screen 720 indicating that primary authentication is being performed by using the face image obtained from the user, and display the configured screen 720 through the display 212a.
  • the screen 720 may include a first area 721, a second area 722, a third area 623 and a settings menu 614.
  • the electronic device 110 may display a message (e.g., Face recognition completed) indicating that the authentication is completed in the first area 721.
  • the electronic device 110 may display an image (e.g., an image preset by the user) of the authenticated user (e.g., primarily authenticated user) in the second area 722.
  • the electronic device 110 (e.g., a processor 214) may display information indicating that additional authentication (e.g., secondary authentication) is to be performed in the second area 722.
  • the electronic device 110 (e.g., a processor 214) may display a guide for authentication of veins in the second area 722.
  • when authentication fails, the electronic device 110 (e.g., a processor 214) may display an authentication reattempt button or an authentication cancelation button in the second area 722.
  • when authentication fails with predetermined frequency (e.g., three times) or more, the electronic device 110 (e.g., a processor 214) may indicate that predetermined time (e.g., 30 seconds) is being counted, in the second area 722.
  • the electronic device 110 may display a message (e.g., Align the palm onto the camera below the screen.) encouraging the user to place a body part (e.g., the palm) to one or more of the first IR camera 217b1, the RGB camera 217a and the second IR camera 217b2 of the camera assembly 216, in the third area 723, for secondary authentication.
  • two or more cameras may be used for the electronic device according to one embodiment to obtain an image of the body part 734 of the user.
  • the electronic device 110 (e.g., a processor 214) may obtain an image of the body part 734 of the user using one or more of the two IR cameras 217b1, 217b2 and the RGB camera 217a.
  • the electronic device 110 (e.g., a processor 214) may obtain an image of the body part 734 (e.g., the palm 735) of the user depending on surrounding brightness through one or more of the two IR cameras 217b1, 217b2 and the RGB camera 217a.
  • the electronic device 110 may display a message (e.g., Authentication is underway.) indicating authentication (e.g., secondary authentication) of the user is being performed in the first area 731. Additionally, the electronic device 110 (e.g., a processor 214) may display a preview image 742 of the obtained body part 734 of the user in the second area 732. Further, the electronic device 110 (e.g., a processor 214) may display a guide line 741 along which the preview image 742 is moved, along with the preview image 742, in the second area 732.
  • the guide line 741 may denote a guide that encourages the user to move the body part (e.g., the palm).
  • the electronic device may configure a screen 750 regarding the authentication (e.g., secondary authentication) being performed by using an image obtained from the user, and display the screen 750 through the display 212a.
  • the screen 750 may include a first area 751, a second area 752, a third area 753 and a settings menu 754.
  • the electronic device 110 (e.g., a processor 214) may display a message (e.g., Authentication completed) indicating the authentication is completed, in the first area 751.
  • the electronic device 110 may display an image (e.g., an image preset by a user) of the authenticated user, in the second area 752.
  • the electronic device 110 (e.g., a processor 214) may display an authentication reattempt button or an authentication cancelation button in the second area 752.
  • the electronic device 110 (e.g., a processor 214) may display a profile image of a user in the second area 752.
  • the electronic device 110 may display a message indicating that predetermined time (e.g., 30 seconds) is being counted, in the second area.
  • the electronic device 110 may display various types of information such as the name (e.g., Hong Gildong) of a user, a welcome message (e.g., Welcome.) and the like in relation to success in authentication in the third area 753.
  • when sensing the settings menu is selected, the electronic device 110 (e.g., a processor 214) may go into a setting mode in relation to the electronic device 110.
  • the setting mode may denote a mode in which an instruction for one or more functions of the electronic device 110, such as a change in the password, storage or renewal of an image of the user and the like, is input.
  • FIG. 8a is an exemplary view showing a screen indicating a touch gesture being input from a user through an electronic device according to one embodiment.
  • FIG. 8b is an exemplary view showing a screen to which a password is input from a user in an electronic device according to one embodiment.
  • FIG. 8c is an exemplary view showing a result of authentication performed by an electronic device according to one embodiment using an input password.
  • the electronic device may configure a screen 710 to which a touch gesture is input from the user, and display the screen 710 through the display 212a.
  • the screen 710 may be partially similar to the screen 710 in FIG. 7a.
  • the second area 712 of the screen 710 in FIG. 7a and the second area 712 of the screen 710 in FIG. 8a may be partially different in that, while the second area 712 of the screen 710 in FIG. 7a displays a preview image (e.g., a face image) obtained from the user, the second area 712 of the screen 710 in FIG. 8a may receive a touch and drag of the user.
  • when receiving an input based on the touch and drag where a finger touches and moves from a first position 801 to a second position 802 on the second area 712, the electronic device 110 (e.g., a processor 214) may recognize the input as an instruction for inputting a password.
  • the electronic device 110 (e.g., a processor 214) may display a screen to which a password for opening the door 130 is input, on the display 212a.
  • the input based on the touch on the left and the drag to the right is provided only as an example, and the touch and drag may vary depending on settings of the user in the disclosure.
  • the electronic device 110 may configure a screen 810 to which a password for opening the door 130 is input, and display the screen 810 through the display 212a, based on the touch and drag.
  • the screen 810 may include a first area 811 in which a message for guiding an input of a password is displayed, a second area 812 in which the input password is displayed, and a third area 813 in which a virtual keypad 814 is displayed to receive the password by the user.
  • the electronic device 110 (e.g., a processor 214) may encode numbers, symbols, characters or special characters, input through the virtual keypad 814, and display the encoded numbers, symbols, characters or special characters in the second area 812.
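  • The encoding of keypad input before display (so the password is not shown in plain text in the second area 812) can be as simple as masking each entered character. The function below is an illustrative assumption, not the disclosed encoding scheme.

```python
def mask_password(entered: str, mask_char: str = "*") -> str:
    """Replace each number, symbol, or character entered on the virtual
    keypad 814 with a mask character before it is shown on screen,
    preserving only the count of characters typed so far."""
    return mask_char * len(entered)


print(mask_password("1302#"))  # *****
```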
  • the electronic device may configure a screen 820 where authentication has been performed using the password input from the user, and display the screen 820 through the display 212a.
  • the screen 820 may include a first area 821, a second area 822, a third area 823 and a settings menu 824.
  • when authentication based on the input of the password is completed, the electronic device 110 (e.g., a processor 214) may display a message (e.g., Authentication completed) indicating the authentication is completed, in the first area 821.
  • the electronic device 110 may display an image (e.g., an image preset by the user) of the authenticated user, in the second area 822.
  • the electronic device 110 (e.g., a processor 214) may display an authentication reattempt button (not illustrated) or an authentication cancelation button (not illustrated), in the second area 822.
  • the electronic device 110 (e.g., a processor 214) may display a profile image of the user in the second area 822.
  • the electronic device 110 may indicate that predetermined time (e.g., 30 seconds) is being counted, in the second area 822.
  • the electronic device 110 may display various types of information in relation to success in authentication such as a name of the user (e.g., Hong Gildong), a welcome message (e.g., Welcome.) and the like, in the third area 823.
  • the method of authentication with a password may be applied when an image (e.g., a face image, a biometric information image and the like) for authentication is not pre-stored (e.g., when the door is opened for a guest who makes a visit while the resident is absent).
  • when sensing that the settings menu 754 is selected, the electronic device 110 (e.g., a processor 214) may go into a setting mode in relation to the electronic device 110.
  • the setting mode may denote a mode in which an instruction for one or more functions of the electronic device 110, such as a change in the password, storage or renewal of an image of the user and the like, is input.
  • FIG. 9a is an exemplary view showing a handle of a door opening/closing device, being separated from a body, based on authentication of a user according to one embodiment.
  • FIG. 9b is an exemplary view showing a handle of a door opening/closing device, completely separated from a body, based on authentication of a user according to one embodiment.
  • FIG. 9c is an exemplary view showing a handle of a door opening/closing device according to one embodiment, gripped by a user.
  • the door opening/closing device 120 may receive a signal for undoing the closing of the door opening/closing device 120 from the electronic device 110 (e.g., a communicator 210).
  • the door opening/closing device 120 (e.g., a processor 226) may control the motor 224 such that the handle 230 is separated from the body 220.
  • the door opening/closing device 120 may control the lock part 225 to unlock the door opening/closing device 120.
  • the door opening/closing device 120 (e.g., a processor 226) may control the motor 224 such that the handle 230 is separated from the body 220 after unlocking the door opening/closing device 120 as a result of control over the lock part 225.
  • the door opening/closing device 120 (e.g., a processor 226) may emit light through an LED of the light emitter 232 of the handle 230, to allow the user to recognize the unlocking of the door 130.
  • the door opening/closing device 120 may sense a touch of the body part (e.g., the hand 910) of the user on the handle 230 through the sensor assembly 233.
  • the door opening/closing device 120 (e.g., a processor 226) may sense that the user grips the handle 230 with the hand 910, through the sensor assembly 233. For example, when the user pulls the handle 230 in a direction (i.e., a direction where the user is positioned) opposite to the body 220 after gripping the handle 230, the door 130 may be opened by force of the user.
  • the door opening/closing device 120 may sense the closing of the door 130 through the sensor assembly 233. Additionally, the door opening/closing device 120 (e.g., a processor 226) may start to sterilize the handle 230 through the UVC sterilizer 231.
  • FIG. 10(a) is an exemplary view showing a screen, indicating that a handle is currently being sterilized and displayed by an electronic device according to one embodiment.
  • FIG. 10(b) is an exemplary view showing a door opening/closing device being sterilized according to one embodiment.
  • the electronic device 110 may display a screen 1010 indicating the handle 230 is currently being sterilized through the display 212a.
  • the screen 1010 may include a first area 1011 in which a message indicating the handle 230 is currently being sterilized is displayed, a second area 1012 in which position information (e.g., unit No. 1302) set by the user and a security mode icon are displayed, and a third area 1013 indicating a door bell.
  • the electronic device 110 may cause an LED disposed on one side (e.g., an edge) of the handle 230 to emit light in a predetermined color (e.g., violet), to show the handle 230 is currently being sterilized.

Abstract

The present disclosure relates to an electronic device controlling opening and closing of a door, a door opening/closing device attached to the door, and a method therefor. To this end, the electronic device may obtain an image of a user, compare the obtained image with a pre-stored image, perform an authentication of the user, and, based on the authentication, transmit, to the door opening/closing device, a signal for controlling an operation of the door opening/closing device disposed in the door. Thus, the present disclosure may provide the convenience of opening the door without having an additional key for opening the door.

Description

ELECTRONIC DEVICE FOR CONTROLLING OPENING AND CLOSING OF A DOOR, A DOOR OPENING/CLOSING DEVICE DISPOSED TO THE DOOR, AND A METHOD THEREFOR
The present disclosure relates to an electronic device for controlling opening and closing of a door, a door opening/closing device disposed to the door, and a method therefor.
Doors (e.g., an entry door) are generally rotated and opened in one direction or in both directions to allow users to come in and out of an indoor space. Additionally, doors can be provided with a lock for controlling their opening and closing to prevent an unspecified person from coming into the indoor space. The lock is usually a mechanical one that is opened only with a specific key.
However, users often find it cumbersome to carry the key and are likely to lose it, and the lock is susceptible to manipulation. Accordingly, in recent years, electronic locks or digital locks have been developed and widely used.
An electronic lock or a digital lock denotes a lock capable of recognizing a password, or a semiconductor chip or a smart card instead of a key. The lock can be opened only when input information matches information stored therein as a result of comparison therebetween.
Although an electronic lock or a digital lock of the related art does not require a separate key, various types of means for opening the lock, including a card-shaped key, still need to contact the lock to open it.
Additionally, the electronic lock or the digital lock of the related art is vulnerable to theft when a password for opening the lock is stolen by another person.
Thus, there is a growing need for a smart door system that can open a lock using biometric information of a person coming in and out, thereby easing the inconvenience of carrying a key and bringing the key into contact with the lock, and reducing the risk of loss and manipulation.
A digital lock of the related art causes the inconvenience of carrying a key and bringing the key into contact with the lock, and is susceptible to the risk of loss and manipulation.
The present disclosure is directed to a smart door system that may open a lock using biometric information of a person coming in and out.
The present disclosure is also directed to an electronic device that may be disposed at a position near a door and readily obtain biometric information from a person coming in and out.
The present disclosure is also directed to a door opening/closing device that may receive a signal for controlling opening and closing of a door from the electronic device having obtained biometric information of a person coming in and out, and control the opening and closing of the door.
The present disclosure is also directed to a smart door system that may perform authentication of a person coming in and out more than once.
The present disclosure is also directed to an electronic device that may include at least one IR camera capable of identifying a person coming in and out even in darkness.
The present disclosure is also directed to a smarter method for opening and closing a door by providing various types of information in relation to the opening and closing of the door through the electronic device.
The present disclosure is also directed to a door opening/closing device that may sterilize a contaminated portion contacted by unspecified multiple people.
Objectives are not limited to the above ones, and other objectives and advantages that have not been mentioned can be clearly understood from the following description and can be more clearly understood from the embodiments set forth herein. Further, the objectives and advantages can be realized via means and combinations thereof in the appended claims.
According to the present disclosure, a smart door system based on communication between an electronic device and a door opening/closing device may be provided to authenticate a person coming in and out, and determine whether to open or close a door.
According to the disclosure, an electronic device may be disposed at a position near a door, and may include an RGB camera and at least one IR camera, to obtain biometric information from a person coming in and out regardless of surrounding brightness.
According to the disclosure, communication between an electronic device and a door opening/closing device may be provided, to control an operation of the door opening/closing device through the electronic device.
According to the disclosure, an authentication based on biometric information of a person coming in and out may be performed more than once through a plurality of cameras included in an electronic device.
According to the disclosure, a UVC LED may be disposed in a door opening/closing device to sterilize a portion contaminated by contact from unspecified multiple people.
According to the disclosure, opening and closing of a handle of a door opening/closing device may be controlled, to prevent unauthenticated people from coming in and out.
To this end, an electronic device according to the disclosure may include a display, a camera assembly, a communicator, a storage, and a processor electrically connected to the display, the camera assembly, the communicator, and the storage. The processor may activate a first camera of the camera assembly based on recognizing a user, and obtain a first image of the user through the activated first camera. Additionally, the processor may perform authentication of the user based on the obtained first image and a second image pre-stored in the storage, and transmit a signal for controlling an operation of a door opening/closing device disposed in a door to the door opening/closing device through the communicator based on the authentication of the user.
A door opening/closing device according to the disclosure may include a body attached to a first side of a door and configured to protrude, and a handle operatively connected to the door opening/closing device. The body may include a communicator, a motor, and a processor electrically connected to the communicator and the motor. The processor may receive, through the communicator, a signal for controlling the door opening/closing device from an electronic device disposed at a position near the door, and unlock the door opening/closing device, based on the received control signal.
A method for controlling opening and closing of a door according to the disclosure may include an electronic device's activating a first camera based on recognizing a user and obtaining a first image of the user through the activated first camera. The method may include the electronic device's performing authentication of the user based on the obtained first image and a pre-stored second image, and transmitting, to a door opening/closing device, a signal for controlling an operation of the door opening/closing device installed in the door based on the authentication of the user.
Additionally, the door opening/closing device may receive a signal for controlling the door opening/closing device from the electronic device, and unlock the door opening/closing device based on the received control signal.
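The end-to-end sequence summarized above (obtain a first image, compare it with a pre-stored second image, and, on successful authentication, transmit a control signal to the door opening/closing device) may be sketched as follows. This is an illustrative sketch only; the function names, the stand-in similarity check, and the `"UNLOCK"` signal value are hypothetical, and the disclosure does not prescribe any particular matching algorithm.

```python
# Illustrative sketch (hypothetical names) of the claimed flow: compare a
# newly obtained first image with a pre-stored second image, and on success
# send an unlock signal to the door opening/closing device.

def match_images(first, second, threshold=0.9):
    # Stand-in similarity check; a real system would use face or vein matching.
    common = sum(1 for a, b in zip(first, second) if a == b)
    return common / max(len(first), 1) >= threshold

def authenticate_and_unlock(first_image, stored_image, send_signal):
    # Authentication of the user based on the two images.
    if match_images(first_image, stored_image):
        send_signal("UNLOCK")  # signal for controlling the door device
        return True
    return False

sent = []
ok = authenticate_and_unlock([1, 2, 3, 4], [1, 2, 3, 4], sent.append)
print(ok, sent)  # prints: True ['UNLOCK']
```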
An electronic device according to the present disclosure may obtain a first image of a user through one or more cameras, perform authentication of the user based on the obtained first image and a pre-stored second image, and control an operation of a door opening/closing device disposed in a door, to open the door in a convenient manner without an additional key for opening the door.
The electronic device may include an RGB camera and one or more IR cameras, to authenticate a person coming in and out day and night.
The electronic device may display a guide line for guiding biometric information of a person coming in and out, to obtain biometric information from a person coming in and out more readily.
The electronic device may extract information on veins of a body part of a person coming in and out through one or more IR cameras, and perform authentication of the person coming in and out using the extracted information on veins, to provide tight security.
The electronic device may display at least one piece of information on one or more operation states of the electronic device and one or more operation states of the door opening/closing device, to allow a person to come in and out readily.
Additionally, a door opening/closing device according to one embodiment may receive a signal for controlling opening and closing of the door from the electronic device, and, based on the received signal, control the opening and closing of the door, to allow a person to come in and out conveniently.
The door opening/closing device may control a handle such that the handle protrudes from a body, to prevent an unauthenticated person from coming in and out.
The door opening/closing device may sterilize the door opening/closing device based on the closing of the door, to remove contamination caused by unspecified multiple people.
The door opening/closing device may emit one or more light emitting elements in different colors based on the opening and closing of the door, to allow a user to determine an operation state of the door opening/closing device.
Specific effects are described together with the above-described effects in the section of "Detailed Description".
FIG. 1 is an exemplary view showing a smart door system according to one embodiment.
FIG. 2 is a block diagram showing an electronic device and a door opening/closing device according to one embodiment.
FIG. 3 is a flow chart showing a process of authenticating a user in a smart door system according to one embodiment.
FIG. 4 is a flow chart showing a process of authenticating a user in a smart door system according to another embodiment.
FIG. 5a is an exemplary view showing a screen on a display when an electronic device according to one embodiment does not sense an approach of a user.
FIG. 5b is an exemplary view showing a user approaching an electronic device according to one embodiment.
FIG. 5c is an exemplary view showing a screen on a display of an electronic device sensing an approach of a user according to one embodiment.
FIG. 6a is an exemplary view showing a process of performing authentication using an image obtained by an electronic device according to one embodiment from a user.
FIG. 6b is an exemplary view indicating a result that the electronic device according to one embodiment performs an authentication of a user by using an image obtained from a user.
FIG. 7a is an exemplary view showing a screen where an electronic device according to one embodiment performs authentication using a face image obtained from a user.
FIG. 7b is an exemplary view showing a result of primary authentication performed by an electronic device according to one embodiment using an image obtained from a user.
FIG. 7c is an exemplary view where an electronic device according to one embodiment obtains an image of a palm of a user.
FIG. 7d is an exemplary view showing a preview image of a palm of a user obtained by an electronic device according to one embodiment.
FIG. 7e is an exemplary view showing a result of secondary authentication performed by an electronic device according to one embodiment.
FIG. 8a is an exemplary view showing a screen indicating a touch gesture being input from a user through an electronic device according to one embodiment.
FIG. 8b is an exemplary view showing a screen to which a password is input from a user in an electronic device according to one embodiment.
FIG. 8c is an exemplary view showing a result of authentication performed by an electronic device according to one embodiment using an input password.
FIG. 9a is an exemplary view showing a handle of a door opening/closing device, being separated from a body, based on authentication of a user according to one embodiment.
FIG. 9b is an exemplary view showing a handle of a door opening/closing device, completely separated from a body, based on authentication of a user according to one embodiment.
FIG. 9c is an exemplary view showing a handle of a door opening/closing device according to one embodiment, gripped by a user.
The above-described aspects, features and advantages are specifically described hereunder with reference to the accompanying drawings such that one having ordinary skill in the art to which the present disclosure pertains can easily implement the technical spirit in the disclosure. In the disclosure, detailed description of known technologies in relation to the disclosure is omitted if it is deemed to make the gist of the disclosure unnecessarily vague. Below, preferred embodiments according to the disclosure are specifically described with reference to the attached drawings. Throughout the disclosure, identical reference numerals can denote identical or similar components.
It should be understood that the terms "first", "second" and the like, are used herein only to distinguish one component from another component. Thus, the components should not be limited by the terms. For instance, a first component can be a second component unless stated to the contrary.
When one component is described as being "connected" or "coupled" to another component, one component can be directly connected or coupled to another component; however, it is also to be understood that an additional component can be "interposed" between the two components, or the two components can be "connected" or "coupled" through an additional component.
Throughout the disclosure, each component can be provided as a single one or a plurality of ones, unless explicitly stated to the contrary.
The singular forms "a", "an" and "the" are intended to include the plural forms as well, unless explicitly indicated otherwise. It should be further understood that the terms "comprise" or "have," set forth herein, are not interpreted as necessarily including all the stated components or steps but can be interpreted as including some of the stated components or steps or can be interpreted as further including additional components or steps.
Throughout the disclosure, the terms "A and/or B" as used herein can denote A, B or A and B, and the terms "C to D" can denote greater than C and less than D, unless stated to the contrary.
Below, an electronic device authenticating a user and a door opening/closing device, and a smart door system therefor are described according to some embodiments.
FIG. 1 is an exemplary view showing a smart door system according to one embodiment.
Referring to FIG. 1, a smart door system 100 according to one embodiment may include an electronic device 110, a door opening/closing device 120, and a door 130. The door opening/closing device 120 may be disposed at the door 130, and the electronic device 110 may be disposed on a wall outside a home. The door opening/closing device 120 and the electronic device 110 may receive and transmit a signal in a wired or wireless manner.
According to one embodiment, the electronic device 110 may recognize a user, and, based on recognizing the user, activate a first camera disposed on one side (e.g., an upper side, a lower side, a left side, or a right side) of a front surface of the electronic device 110. Additionally, the electronic device 110 may obtain a first image of the user through the first camera, and, based on the obtained first image and a second image pre-stored in the electronic device 110, may authenticate the user. Additionally, the electronic device 110 may transmit a signal for controlling an operation of the door opening/closing device 120 installed in the door 130 to the door opening/closing device 120, based on the authentication of the user.
According to one embodiment, the door opening/closing device 120 may receive a signal for controlling the door opening/closing device 120 from the electronic device 110 disposed at a position near the door 130. Additionally, the door opening/closing device 120 may unlock the door opening/closing device 120, based on the received control signal.
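The device-side behavior described above, receiving a control signal from the electronic device 110 and unlocking in response, may be sketched as below. The class, attribute, and signal names are hypothetical; the disclosure specifies only the functional roles of the lock part 225 and the motor 224.

```python
# Hypothetical sketch of the device-side sequence: receive a control signal,
# release the lock, then drive the motor so the handle separates from the body.

class DoorDevice:
    def __init__(self):
        self.locked = True
        self.handle_separated = False

    def on_signal(self, signal):
        if signal == "UNLOCK" and self.locked:
            self.locked = False            # lock part releases the door
            self.handle_separated = True   # motor moves the handle outward
        return not self.locked             # True once the device is unlocked

device = DoorDevice()
device.on_signal("UNLOCK")
print(device.locked, device.handle_separated)  # prints: False True
```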
According to one embodiment, the door 130 may include a door (e.g., an entry door) through which the user freely comes in and out. Opening and closing of the door 130 may be controlled by a lock part of the door opening/closing device 120.
FIG. 1 shows a configuration of the smart door system 100 according to one embodiment. Components of the smart door system 100 may not be limited to those of the embodiment illustrated in FIG. 1, and, when necessary, some components may be added, modified or removed.
FIG. 2 is a block diagram showing an electronic device and a door opening/closing device according to one embodiment.
Referring to FIG. 2, the electronic device 110 according to one embodiment may include a communicator 210, an input part 211, a display 212a, a speaker 212b, a storage 213, a processor 214, a camera assembly 216 and a sensor assembly 219.
FIG. 2 shows a configuration of the electronic device 110 according to one embodiment. Components of the electronic device 110 may not be limited to those of the embodiment illustrated in FIG. 2, and, when necessary, some components may be added, modified or removed.
According to one embodiment, the communicator 210 may include one or more circuits capable of transmitting one or more signals or one or more pieces of information to one or more components (e.g., an input part 211, a display 212a, a speaker 212b, a storage 213, a processor 214, a camera assembly 216 and a sensor assembly 219) included in the electronic device 110, and receiving the same from one or more of the components, based on wired communication or wireless communication.
According to one embodiment, the communicator 210 may include one or more circuits capable of transmitting one or more signals or one or more pieces of information to components (e.g., a speaker 222, a battery 223, a motor 224, a lock part 225, a processor 226, a UVC sterilizer 231, a light emitter 232, and a sensor assembly 233) included in the door opening/closing device 120, and receiving the same from the components, based on wired communication or wireless communication.
According to one embodiment, the communicator 210 may also receive a signal or data from various types of external devices or transmit the same to the external devices.
According to one embodiment, the input part 211 may include an interface configured to transmit/receive data to/from an external device (e.g., a universal serial bus (USB) device, an external hard drive device (not illustrated), and the like). The input part 211 may deliver various types of information, input by the user, to the processor 214. To this end, the input part 211 may include a physical operation member such as a switch, a button and the like or an electric operation member such as a touch key, a touch pad, a touch screen and the like. Alternatively, the input part 211 may further include a microphone configured to receive a voice signal of the user.
According to one embodiment, the display 212a may display various types of information (e.g., multimedia data or text data and the like). The display 212a may display results processed, being processed or to be processed by the processor 214. The display 212a may visually provide various types of information regarding an operation state of the door opening/closing device 120 through the electronic device 110, and may include a control circuit configured to display various types of information.
According to one embodiment, the display 212a may include a touch circuitry configured to sense a touch, or a pressure sensor configured to measure an intensity of pressure of a touch.
According to one embodiment, the speaker 212b may change a sound into an electric signal or vice versa. The speaker 212b may output a sound through an acoustic output device (e.g., ear buds or headphones). The speaker 212b may output an acoustic signal to the outside of the electronic device 110. The speaker 212b may output information provided to the user who comes in and out through a door as a voice.
According to one embodiment, the speaker 212b may be disposed in the electronic device 110 as an additional component or may be included in an output part (not illustrated) along with the display 212a.
According to one embodiment, the storage 213 may include a volatile memory or a non-volatile memory. For example, the storage 213 may store information, data, programs and the like required for an operation of the electronic device 110 or the door opening/closing device 120. Accordingly, the processor 214 may perform a control operation, described below, with reference to the information stored in the storage 213. The storage 213 may also store various types of platforms. The storage 213, for example, may include at least one storage media of a flash memory type storage device, a hard disk type storage device, a multimedia card micro type storage device, a card type memory (e.g., an SD memory or an XD memory and the like), RAM, ROM (EEPROM and the like), and a USB memory.
According to one embodiment, the storage 213 may store various types of data (e.g., software, an application, obtained information, measured information, a control signal, and the like), and instructions in relation to data obtained or used by at least one component (e.g., a communicator 210, an input part 211, a display 212a, a speaker 212b, a storage 213, a processor 214, a camera assembly 216 and a sensor assembly 219) of the electronic device 110. For example, the storage 213 may store a control signal or data received from the door opening/closing device 120.
According to one embodiment, the storage 213 may store information on a preregistered user (e.g., a person coming in and out or a family member). The storage 213 may store an image (e.g., a face image and the like) and a biometric information image (e.g., an image including information on veins) of the preregistered user (e.g., a person coming in and out or a family member), and information (e.g., a name, a date of birth, gender, age and the like) on each user (e.g., each family member). The storage 213 may store various types of information (e.g., success in authentication, failure in authentication, a state where a door opening/closing device is being sterilized, a home address, a password for opening the door and the like) provided to a user coming in and out through the door.
According to one embodiment, the camera assembly 216 may include a first camera 217a, a second camera 217b, and a light part 218. One or more of the cameras of the camera assembly 216 may be disposed at a position where an image of a user (or a body part of the user) coming in and out through the door is readily obtained. The camera assembly 216 may include the first camera 217a (e.g., a red-green-blue (RGB) camera) capable of obtaining an image of a user despite a high illumination caused by the light part 218. Further, the camera assembly 216 may include one or more of the second cameras 217b (e.g., an RGB camera, an infrared (IR) camera and the like).
According to one embodiment, an artificial intelligence chip (e.g., DQ1) may be built in one or more of the cameras of the camera assembly 216. Through the artificial intelligence chip, the camera assembly 216 may recognize the face of a user and sense the presence of a user.
According to one embodiment, the camera assembly 216 may be a camera to which a deep learning-based algorithm operational in the artificial intelligence chip (e.g., DQ1 of LG electronics) is applied, and may provide personalized services such as registration and recognition of a user, sensing of an approach of a user, sensing of an occupant, sensing of a user returning home, self-monitoring, capturing of an image and the like. Additionally, the processor 214 may transmit, through the communicator 210, to a server (not illustrated; e.g., the ThinQ artificial intelligence server), information (e.g., an image) collected from one or more users outside a home through one or more cameras (e.g., an image intelligence camera) provided with an artificial intelligence chip, and the server (not illustrated) may generate information for convenience of an occupant and then transmit the information back to the electronic device 110 or a smart mirror apparatus (not illustrated). Accordingly, the electronic device 110 may provide more convenient services to the user.
According to one embodiment, one or more of the first camera and the second camera may obtain an image of the user (or a body part of the user), and may deliver the obtained image to the processor 214. The second camera 217b may include one or more IR cameras capable of obtaining an image of a user even in darkness (at low illumination). The light part 218 may include an element (e.g., an LED, an IR pattern light) configured to emit light required for obtaining an image through one or more of the first camera 217a and the second camera 217b. The camera assembly 216 may include a Compact CMOS camera demonstrator (C3D) camera. The C3D camera may extract a distance (e.g., a depth) from the user according to an active stereo method. The camera assembly 216 may obtain and extract depth information through the first IR camera 217b1, the IR pattern light 218b, and the second IR camera 217b2.
According to one embodiment, the camera assembly 216 may obtain an image of the user through each of two or more IR cameras and may extract depth information on the user from the two obtained images. Additionally, based on the extracted depth information, the camera assembly 216 may obtain a stereoscopic image (e.g., a three-dimensional image) of the user. Additionally, the processor 214 may authenticate the user based on the obtained three-dimensional image. The camera assembly 216 may be disposed on one side (e.g., an upper side, a lower side, a left side or a right side) of the electronic device 110. The electronic device 110, as described above, may extract the depth information through two or more IR cameras of the camera assembly 216, and perform authentication, thereby preventing an unauthenticated user from coming in and out through the door using a flat image.
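The depth extraction attributed above to the two IR cameras and the IR pattern light can be illustrated with the standard pinhole-stereo relation Z = f * B / d (focal length f in pixels, baseline B in meters, disparity d in pixels). The function name and the numeric values below are illustrative assumptions, not values taken from the disclosure.

```python
# Illustrative active-stereo depth computation: a point matched between the
# first and second IR images at disparity d lies at depth Z = f * B / d.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical example: 600 px focal length, 5 cm baseline, 15 px disparity.
z = depth_from_disparity(focal_px=600.0, baseline_m=0.05, disparity_px=15.0)
print(round(z, 3))  # prints: 2.0
```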
According to one embodiment, the camera assembly 216 may obtain a face image of a user (or a person coming in and out) through one or more cameras, and may extract depth information from the obtained face image. Additionally, the processor 214 may generate an ID (e.g., a face ID) based on the face image (including depth information) obtained by the camera assembly 216. The processor 214, as described above, may determine that face images of various users, obtained by the camera assembly 216, differ based on the depth information.
According to one embodiment, the sensor assembly 219 may include one or more sensors capable of sensing a distance from a user who comes in and out of the home, movement of a user, or brightness in a place in which the door is installed and the like. The sensor assembly 219, for example, may include a distance measuring sensor 219a, a movement detecting sensor 219b and an illuminance sensor 219c.
According to one embodiment, the distance measuring sensor 219a may be disposed in a position (e.g., one or more of an upper, lower, left and right side of an electronic device 110) where a distance from the user may be measured. Additionally, the movement detecting sensor 219b may be disposed in a position (e.g., one or more of an upper, lower, left and right side of an electronic device 110) where movement of the user may be detected. Additionally, the illuminance sensor 219c may be disposed in a position (e.g., one or more of an upper, lower, left and right side of an electronic device 110) where brightness outside the door may be sensed.
According to one embodiment, the processor 214 may drive software to control one or more components (e.g., a communicator 210, an input part 211, a display 212a, a speaker 212b, a storage 213, a camera assembly 216 and a sensor assembly 219) connected to the processor 214 based on wired communication or wireless communication. Additionally, the processor 214 may process various types of data and perform calculation based on the wired communication or the wireless communication.
According to one embodiment, the processor 214 may load, to the storage 213, and process an instruction or data received from the communicator 210, the input part 211, the display 212a, the speaker 212b, the storage 213, the camera assembly 216, the sensor assembly 219 and the like, and may store the processed data in the storage 213. Alternatively, the processor 214 may display the processed data through the display 212a, or output the same through the speaker 212b.
According to one embodiment, when a user (e.g., a family member) performs authentication (e.g., authentication with a QR code) using a mobile device (e.g., a smartphone), the processor 214 may store an identifier (e.g., a phone number) of the mobile device and determine that the mobile device is an authenticated device. Additionally, the processor 214 may establish a communication channel between the communicator 210 and the mobile device (e.g., a smartphone) of one or more of the users (e.g., a family member) who have been authenticated, optionally (e.g., at the user's request). Additionally, the processor 214 may receive an instruction, through the communication channel, from the remotely located mobile device and perform a function corresponding to the received instruction.
According to one embodiment, the processor 214 may activate the first camera 217a of the camera assembly 216 based on recognizing a user, and obtain a first image of the user through the activated first camera. The processor 214 may measure a distance between a user (e.g., a person coming in and out) adjacent to the door 130 and the electronic device 110 through the distance measuring sensor 219a. When the measured distance is within a predetermined distance (e.g., 2 m), the processor 214 may determine that the user is to come in and out through the door 130. When determining the user is to come in and out through the door 130, the processor 214 may execute an operation of the smart door system 100.
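By way of illustration only, the distance-gating decision described above (execute the smart door system operation when the measured distance is within a predetermined distance, e.g., 2 m) may be sketched as follows; the function and threshold names are hypothetical and not part of the claimed implementation:

```python
PROXIMITY_THRESHOLD_M = 2.0  # example of the predetermined distance from the text


def should_start_door_system(measured_distance_m: float,
                             threshold_m: float = PROXIMITY_THRESHOLD_M) -> bool:
    """Return True when the sensed user is close enough to the door 130
    that the authentication flow of the smart door system should begin."""
    return measured_distance_m <= threshold_m
```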
According to one embodiment, the processor 214 may authenticate a user based on the obtained first image and the second image pre-stored in the storage 213. The processor 214 may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through the first camera 217a (e.g., an RGB camera), compare the obtained first image and a second image (e.g., a second face image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and authenticate the user (e.g., Hong Gildong). The processor 214 may extract at least one feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) from the first image, compare the extracted feature point with a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) of the second image, and perform authentication (e.g., primary authentication) of the user (e.g., Hong Gildong).
According to one embodiment, when the authentication of the user succeeds, the processor 214 may display a message indicating success of the authentication through the display 212a or output the same as a voice through the speaker 212b.
According to one embodiment, when the authentication (e.g., primary authentication) of the user succeeds, the processor 214 may transmit an identifier of the user to a smart mirror apparatus (not illustrated) in a home. The identifier may include various types of information such as an image of the user, the name of the user, time at which the user comes in and out, and the like.
According to one embodiment, when the authentication (e.g., primary authentication) of the user fails, the processor 214 may display a message indicating that the authentication has failed through the display 212a, or output the same as a voice through the speaker 212b. For example, when the authentication fails, the processor 214 may display, through the display 212a, a guide message (e.g., "Stand in front of the camera.") encouraging the user to be positioned correctly with respect to the first camera 217a, or output the guide message as a voice through the speaker 212b. The processor 214 may perform the authentication procedure a predetermined number of times (e.g., three times). When information (e.g., a face image or a biometric information image) on the user is not obtained, or authentication is not performed within a predetermined time (e.g., 10 seconds) measured by driving a timer 215, the processor 214 may display a message indicating that the authentication cannot be performed through the display 212a. Alternatively, the processor 214 may output the message as a voice through the speaker 212b.
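The retry and timeout behavior described above (a predetermined number of attempts, e.g., three, within a predetermined time, e.g., 10 seconds) can be sketched as follows; the helper name, return values and clock parameter are illustrative assumptions, not the claimed procedure:

```python
import time

MAX_ATTEMPTS = 3   # predetermined number of attempts from the text
TIMEOUT_S = 10.0   # predetermined time from the text


def authenticate_with_retries(capture_and_match,
                              max_attempts: int = MAX_ATTEMPTS,
                              timeout_s: float = TIMEOUT_S,
                              clock=time.monotonic) -> str:
    """Run up to max_attempts authentication tries before the timer expires.

    capture_and_match() is a callable returning True on a successful match.
    Returns "success", "failure" (all attempts used), or "timeout".
    """
    deadline = clock() + timeout_s
    for _ in range(max_attempts):
        if clock() > deadline:
            return "timeout"   # timer 215 expired before authentication finished
        if capture_and_match():
            return "success"
    return "failure"           # all attempts consumed without a match
```

The injectable `clock` argument makes the timeout path testable without waiting.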
According to one embodiment, the processor 214 may transmit a signal for controlling an operation of the door opening/closing device 120 mounted onto the door 130 to the door opening/closing device 120 through the communicator 210, based on the authentication (e.g., primary authentication) of the user. The processor 214 may transmit a signal for unlocking (or locking) the door opening/closing device 120 to the door opening/closing device 120 (e.g., a communicator 221) through the communicator 210, based on a success of the authentication (e.g., primary authentication) of the user (e.g., Hong Gildong). Further, the processor 214 may transmit a user authentication (e.g., primary authentication)-based identifier (e.g., a user image, a user ID, and the like) to the smart mirror apparatus (not illustrated) in the home.
According to one embodiment, when the authentication (e.g., primary authentication) of the user (e.g., Hong Gildong) succeeds using the first image and the second image, the processor 214 may activate the second camera 217b of the camera assembly 216. Additionally, the processor 214 may display a message indicating movement of a body part (e.g., the palm) of the user (e.g., Hong Gildong) to a position of the activated second camera 217b through the display 212a, or output the same as a voice through the speaker 212b. The second camera 217b may include two IR cameras.
According to one embodiment, the processor 214 may display a guide line for guiding the body part to an area where the body part needs to be placed through the display 212a, while outputting the message. The processor 214 may display the guide line through the display 212a such that the body part (e.g., the palm) of the user (e.g., Hong Gildong) is placed in a position adequate for the second camera 217b to capture an image of the body part.
According to one embodiment, the processor 214 may obtain a preview image of the body part through the first camera 217a, and display the preview image on the display 212a, while displaying the guide line through the display 212a. The processor 214 may obtain a preview image of the body part in real time and display the preview image obtained in real time on the display 212a.
According to one embodiment, the processor 214 may display the preview image obtained in real time and the guide line together on the display 212a.
According to one embodiment, the processor 214 may calculate a gap between the displayed guide line and the displayed preview image in real time. Additionally, based on the gap calculated in real time, the processor 214 may make an alert sound at a higher volume level through the speaker 212b as the gap is larger, and make an alert sound at a lower volume level as the gap is smaller.
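The gap-dependent alert volume described above can be sketched as a simple linear mapping in which a larger gap yields a louder alert; the pixel units, clamping range and function name are illustrative assumptions:

```python
def alert_volume(gap_px: float, max_gap_px: float = 200.0,
                 min_vol: float = 0.1, max_vol: float = 1.0) -> float:
    """Map the gap between the guide line and the preview image to a speaker
    volume level: the larger the gap, the higher the volume."""
    ratio = min(max(gap_px / max_gap_px, 0.0), 1.0)  # clamp to [0, 1]
    return min_vol + ratio * (max_vol - min_vol)
```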
According to one embodiment, the processor 214 may display visual information, depending on whether the preview image and the displayed guide line are matched, through the display 212a. The visual information may include information (e.g., a color, an arrow, a gradation effect and the like) that encourages the user to move the body part at a position corresponding to the guide line. The user may move the body part based on the visual information such that the second camera 217b easily obtains an image of the body part (e.g., the palm). The second camera 217b may include two or more IR cameras.
According to one embodiment, the processor 214 may obtain information on veins of the body part (e.g., the palm) through the second camera 217b. The processor 214 may obtain two or more images of the body part of the user through the second camera 217b (e.g., two or more IR cameras), and extract depth information on the user using the obtained images. Additionally, the processor 214 may obtain a stereoscopic image (e.g., a three-dimensional image) of the user based on the extracted depth information. The processor 214 may also obtain information on the veins (e.g., a thickness, a direction and the like of a vein) of the body part (e.g., the palm) through the second camera 217b.
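Extracting depth from two IR cameras rests on the classic stereo relation Z = f * B / d (depth equals focal length times baseline divided by disparity). A minimal sketch with illustrative parameter names, not a description of the claimed depth pipeline:

```python
def depth_from_disparity(disparity_px: float, focal_px: float,
                         baseline_m: float) -> float:
    """Return the depth Z (meters) of a point seen by two horizontally
    separated cameras: Z = f * B / d, where f is the focal length in pixels,
    B is the camera baseline in meters, and d is the disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```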
According to one embodiment, the processor 214 may perform authentication (e.g., secondary authentication) of the user based on a third image obtained by the second camera 217b and a fourth image pre-stored in the storage 213. The processor 214 may obtain the third image (e.g., a first palm image) of the user (e.g., Hong Gildong) through the second camera 217b (e.g., two or more IR cameras), compare the obtained third image and the fourth image (e.g., a second palm image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and perform authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong).
According to one embodiment, the processor 214 may extract a feature point (e.g., a thickness, a direction, a shape, a flexural degree and the like of a vein) from the third image, compare the extracted feature point with a feature point (e.g., a thickness, a direction, a shape, a flexural degree and the like of a vein) of the fourth image, and perform authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong).
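The vein feature comparison for the secondary authentication can be sketched as a per-segment tolerance check; the descriptor format (thickness, direction) and the tolerance values are illustrative assumptions, not the claimed matching algorithm:

```python
def vein_match(live_veins, stored_veins,
               thickness_tol_mm: float = 0.5,
               angle_tol_deg: float = 10.0) -> bool:
    """Compare lists of (thickness_mm, direction_deg) tuples, one tuple per
    detected vein segment, between the live third image and the stored
    fourth image. Match when every corresponding segment agrees within
    the given tolerances."""
    if len(live_veins) != len(stored_veins):
        return False
    return all(abs(t1 - t2) <= thickness_tol_mm and abs(d1 - d2) <= angle_tol_deg
               for (t1, d1), (t2, d2) in zip(live_veins, stored_veins))
```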
According to one embodiment, when succeeding in the authentication (e.g., secondary authentication) of the user, the processor 214 may transmit an identifier of the user to a smart mirror apparatus (not illustrated) in a home. The identifier may include various types of information such as an image of the user, the name of the user, time at which the user comes in and out, and the like.
According to one embodiment, the processor 214 may transmit a signal for controlling an operation of the door opening/closing device 120 installed in the door 130 to the door opening/closing device 120 through the communicator 210, based on the authentication (e.g., secondary authentication) of the user. The processor 214 may transmit a signal for unlocking (or locking) the door opening/closing device 120 to the door opening/closing device 120 (e.g., a communicator 221) through the communicator 210, based on a success of the authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong).
According to one embodiment, the processor 214 may identify whether a signal in relation to closing of the door 130 is received from the door opening/closing device 120 through the communicator 210. When the user comes in and out through the door 130, the door 130 may be opened and closed. The sensor assembly 233 of the door opening/closing device 120 may sense the opening and closing of the door, and the door opening/closing device 120 may transmit a signal in relation to opening and closing of the door to the electronic device 110 through the communicator 221. The electronic device 110 having received the signal of opening and closing of the door may sense the opening and closing of the door 130.
According to one embodiment, the processor 214 may transmit a signal for controlling sterilization of the door opening/closing device 120 to the door opening/closing device 120 through the communicator 210, based on receipt of the signal in relation to closing of the door 130. The processor 214 may identify the opening and/or closing of the door 130 based on the signal received from the door opening/closing device 120. When identifying the opening and closing of the door 130, the processor 214 may transmit a control signal for sterilizing the door opening/closing device 120 (e.g., a handle 230) to the door opening/closing device 120. Alternatively, when the door 130 is opened and closed, the door opening/closing device 120 (e.g., a processor 226) itself may sterilize the handle 230 through a UVC sterilizer 231.
According to one embodiment, the processor 214 may identify brightness around one or more of the electronic device 110 and the door opening/closing device 120 through the illuminance sensor 219c. The processor 214 may select and operate one or more of the first camera 217a and the second camera 217b, based on the sensed surrounding brightness. For example, when the brightness is a reference value or greater, the processor 214 may operate the first camera 217a, and when the brightness is less than the reference value, the processor 214 may operate the second camera 217b. Alternatively, the processor 214 may operate the first camera 217a and the second camera 217b to obtain an image of a user regardless of brightness.
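The illuminance-based camera selection can be sketched as follows; the reference value in lux, the camera labels and the `both` option are illustrative assumptions:

```python
LUX_REFERENCE = 50.0  # hypothetical reference brightness value


def select_cameras(lux: float, reference: float = LUX_REFERENCE,
                   both: bool = False) -> list:
    """Choose which camera(s) to operate based on sensed brightness:
    at or above the reference value, the first (RGB) camera; below it,
    the second (IR) camera; or both regardless of brightness."""
    if both:
        return ["rgb", "ir"]
    return ["rgb"] if lux >= reference else ["ir"]
```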
According to one embodiment, the processor 214 may display, on the display 212a, one or more operation states of the electronic device 110, one or more operation states of the door opening/closing device 120, and at least one piece of information on the opening and closing of the door 130. The processor 214 may display, on the display 212a, various types of information on access through the smart door, such as sensing of a user, an operation state of the cameras, whether an image is obtained, whether authentication is performed, whether opening and closing of the door is controlled, and the like.
Referring to FIG. 2, the door opening/closing device 120 according to one embodiment may include a body 220 and a handle 230. The body 220 may include a communicator 221, a speaker 222, a battery 223, a motor 224, a lock part 225 and a processor 226. The handle 230 may include a UVC sterilizer 231, a light emitter 232, and a sensor assembly 233.
FIG. 2 shows components included in the body 220 and the handle 230 of the door opening/closing device 120 according to one embodiment; the components are not limited to those of the embodiment in FIG. 2. When necessary, some components may be added, modified or removed.
According to one embodiment, the communicator 221 may include one or more circuits capable of transmitting one or more signals or at least one piece of information to one or more components (e.g., a speaker 222, a battery 223, a motor 224, a lock part 225, a processor 226, a UVC sterilizer 231, a light emitter 232, and a sensor assembly 233) included in the door opening/closing device 120 and receiving the same from one or more of the components, based on wired communication or wireless communication.
According to one embodiment, the communicator 221 may include one or more circuits capable of transmitting one or more signals or at least one piece of information to components (e.g., an input part 211, a display 212a, a speaker 212b, a storage 213, a processor 214, a camera assembly 216 and a sensor assembly 219) included in the electronic device 110 and receiving the same from the components, based on wired communication or wireless communication.
According to one embodiment, the communicator 221 may also receive a signal or data from various types of external devices or transmit the same to various types of external devices.
According to one embodiment, the speaker 222 may convert an electric signal into a sound. The speaker 222 may output a sound through an acoustic output device (e.g., ear buds or headphones). The speaker 222 may output an acoustic signal to the outside of the door opening/closing device 120. The speaker 222 may output, as a voice, information provided to a user coming in and out through the door.
According to one embodiment, the speaker 222 may be included in the door opening/closing device 120 or disposed as an additional component.
According to one embodiment, the battery 223 may supply power to one or more components of the door opening/closing device 120. The battery 223, for example, may include a primary battery that is not rechargeable or a secondary battery that is rechargeable. The battery 223 may include, on one side thereof, a USB terminal (not illustrated) configured to receive power from an external power source or to supply power to the door opening/closing device 120.
According to one embodiment, the motor 224 may control movement of the handle 230 such that the handle 230 is spaced from the body 220. For example, when receiving a signal based on a success of authentication of a user from the electronic device 110 through the communicator 221, the processor 226 may control an operation of the handle 230 such that the handle 230 separates from the body 220 while rotating clockwise or counterclockwise through the motor 224. The handle 230 may separate from the body 220 under the control of the motor 224 to allow the user to easily grip the handle 230.
According to one embodiment, for the lock part 225, a member (not illustrated) for preventing opening of the door 130 may protrude toward a frame of the door 130, to prevent the opening of the door 130. Alternatively, the lock part 225 may be disposed in the frame, and the member (not illustrated) for preventing the opening of the door 130 may protrude toward the door to prevent the opening of the door 130.
According to one embodiment, the UVC sterilizer 231 may radiate ultraviolet rays having short wavelengths (e.g., about 100 nm to 280 nm) to sterilize (or disinfect) the door opening/closing device 120. Alternatively, the UVC sterilizer 231 may sterilize (or disinfect) the handle 230 of the door opening/closing device 120. The UVC sterilizer 231 may be disposed on one side of the door opening/closing device 120. Alternatively, the UVC sterilizer 231 may be disposed in a portion where the body 220 contacts the handle 230, to sterilize a portion of the handle 230 of the door opening/closing device 120, gripped by the user.
According to one embodiment, the light emitter 232 may include at least one light emitting element (e.g., a light emitting diode (LED)) emitting light in different colors. The light emitter 232 may emit light under the control of the processor 226 to visually show an operation state of the body 220 and/or the handle 230. The light emitter 232 may emit light in different colors depending on different operations of the body 220 and/or the handle 230.
According to one embodiment, when the handle 230 is spaced from the body 220 to allow the user to easily grip the handle 230 (e.g., when the handle 230 is spaced from the body 220 at a predetermined angle (e.g., 30 degrees)), an LED disposed on one side (e.g., an edge) of the handle 230 may emit a predetermined color (e.g., white) of light under the control of the processor 226. For example, while the handle 230 is being disinfected, the LED on one side (e.g., an edge) of the handle 230 may emit a predetermined color (e.g., violet) of light.
According to one embodiment, the sensor assembly 233 may include one or more sensors capable of sensing a distance from a user trying to come in and out of a home, movement of the user or brightness in a place where the door is installed and the like. For example, the sensor assembly 233 may include a distance measuring sensor, a movement detecting sensor, and an illuminance sensor. Alternatively, the sensor assembly 233 may include a proximity sensor 233a.
According to one embodiment, the sensor assembly 233 (e.g., a proximity sensor 233a) may be disposed in a gap between the body 220 and the handle 230 to protect the hand of the user trying to grip the handle 230.
For example, in case an object (e.g., the hand of a user) is sensed before the handle 230 is completely spaced from the body 220, the sensor assembly 233 (e.g., a proximity sensor 233a) may sense the object and provide a signal based on the sensing of the object to the processor 226.
For example, in case an object (e.g., the hand of a user) is sensed before the handle 230 is completely coupled to the body 220 in the state where the handle 230 is spaced from the body 220, the sensor assembly 233 (e.g., a proximity sensor 233a) may sense the object and provide a signal based on the sensing of the object to the processor 226. For example, when receiving the signal based on the sensing of the object, the processor 226 may stop an operation of the door opening/closing device 120 temporarily.
According to one embodiment, the processor 226 may drive software to control one or more components (e.g., a communicator 221, a speaker 222, a battery 223, a motor 224, a lock part 225, a UVC sterilizer 231, a light emitter 232, and a sensor assembly 233) connected to the processor 226, based on wired communication or wireless communication. Additionally, the processor 226 may process various types of data and perform calculation based on the wired communication or wireless communication.
According to one embodiment, the processor 226 may process instructions or data received from the communicator 221, the speaker 222, the battery 223, the motor 224, the lock part 225, the UVC sterilizer 231, the light emitter 232, the sensor assembly 233 and the like, and transmit the processed data to the electronic device 110. Alternatively, the processor 226 may transmit the processed data to the electronic device 110, or output the processed data through the speaker 222.
According to one embodiment, the processor 226 may receive a control signal for controlling the door opening/closing device 120 through the communicator 221 from the electronic device 110 disposed near the door 130. The control signal may include various types of signals for controlling operations (e.g., opening or closing of the door) of the door opening/closing device 120. Alternatively, the control signal may include various types of signals for controlling operations (e.g., opening or closing of the handle) of the handle 230 of the door opening/closing device 120.
According to one embodiment, the processor 226 may unlock (or lock) the door opening/closing device 120 based on the received control signal. The processor 226 may obtain the signal, transmitted through the communicator 210 of the electronic device 110, through the communicator 221, and, based on the obtained signal, control the locking or unlocking of the door opening/closing device 120. The communicator 210 of the electronic device 110 and the communicator 221 of the door opening/closing device 120 may connect in a wired or wireless manner.
According to one embodiment, the processor 226 may control the motor 224 based on the received control signal such that the handle 230 protrudes from the body 220 while rotating.
According to one embodiment, the processor 226 may obtain a signal in relation to closing or opening of the door 130 through the sensor assembly 233 (e.g., an opening/closing detecting sensor (not illustrated)), and identify the closing or opening of the door 130. When sensing the closing of the door 130, the processor 226 may activate the lock part 225 to lock the door opening/closing device 120. Additionally, when the door opening/closing device 120 is closed by the lock part 225, the processor 226 may sterilize the door opening/closing device 120 (e.g., a handle 230) through the light emitter 232 (e.g., at least one ultraviolet-C light emitting diode (UVC LED)). The handle 230 may include at least one UVC LED, and the at least one UVC LED may be disposed at a position adequate to disinfect (e.g., sterilize) an area where the hand of the user gripping the handle 230 is positioned.
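The close-then-lock-then-sterilize sequence described above can be sketched as a small controller in which the hardware calls are stubbed as recorded actions; all names are illustrative, not the claimed control logic:

```python
class DoorLockController:
    """Illustrative sketch: when closing of the door is sensed, activate
    the lock part, and once locked, run UVC sterilization of the handle.
    Hardware operations are recorded as strings for demonstration."""

    def __init__(self):
        self.locked = False
        self.actions = []

    def on_door_event(self, event: str):
        if event == "closed":
            self.locked = True                           # activate lock part 225
            self.actions.append("lock")
            self.actions.append("uvc_sterilize_handle")  # disinfect gripped area
        elif event == "opened":
            self.locked = False                          # door in use; no sterilization
```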
According to one embodiment, the processor 226 may control at least one light emitting element such that the at least one light emitting element emits light in different colors, based on the opening and closing of the door 130.
According to one embodiment, the processor 226 may drive a timer 227 and output, through the speaker 222, a message indicating that the handle 230 has not been gripped when the user does not grip the handle 230 within a predetermined period (e.g., 10 seconds). Alternatively, the processor 226 may allow the light emitter 232 (e.g., an LED) to emit light indicating that the handle 230 has not been gripped.
FIG. 3 is a flow chart showing a process of authenticating a user in a smart door system according to one embodiment.
Below, the process of authenticating a user in the smart door system according to one embodiment is described with reference to FIG. 3.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may sense an approaching user (S310). The electronic device 110 (e.g., a processor 214) may sense whether a user is approaching the door 130. The electronic device 110 (e.g., a processor 214) may sense a user (e.g., a person coming in and out) approaching the door 130 through the sensor assembly 219 (e.g., a distance measuring sensor 219a or the movement detecting sensor 219b).
According to one embodiment, the electronic device 110 (e.g., a processor 214) may obtain a first image of the user (S312). The electronic device 110 (e.g., a processor 214) may obtain a first image of a body part (e.g., the face) of the user (e.g., Hong Gildong) approaching the door 130 through one or more of the cameras 217a, 217b of the camera assembly 216 disposed on one surface (e.g., a front surface) of the electronic device 110.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may compare the obtained first image with a pre-stored second image to perform authentication of the user (S314). The electronic device 110 (e.g., a processor 214) may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through a first camera 217a (e.g., an RGB camera), compare the obtained first image with a second image (e.g., a second face image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and perform authentication of the user (e.g., Hong Gildong).
According to one embodiment, the electronic device 110 (e.g., a processor 214) may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through a second camera 217b (e.g., at least one IR camera), compare the obtained first image with a second image (e.g., a second face image) of the user (e.g., Hong Gildong), pre-stored in the storage 213, and perform authentication of the user (e.g., Hong Gildong).
According to one embodiment, the electronic device 110 (e.g., a processor 214) may identify success in authentication (S316). The electronic device 110 (e.g., a processor 214) may extract a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) from the first image, compare the extracted feature point with a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) of the second image, perform authentication (e.g., primary authentication) of the user (e.g., Hong Gildong), and determine whether the authentication succeeds. The electronic device 110 (e.g., a processor 214) may determine that the two feature points are the same when a difference between the feature points is within a predetermined range (e.g., 5 mm). The predetermined range (e.g., 5 mm) may be variably adjusted.
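The feature point comparison with the predetermined range (e.g., 5 mm) can be sketched as follows; the feature dictionary keys and measurement units are illustrative assumptions, not the claimed recognition method:

```python
TOLERANCE_MM = 5.0  # predetermined range from the text; variably adjustable


def features_match(live: dict, stored: dict,
                   tol_mm: float = TOLERANCE_MM) -> bool:
    """Compare corresponding facial feature measurements (in mm) from the
    first and second images; treat the feature points as the same when
    every per-feature difference is within tol_mm."""
    if live.keys() != stored.keys():
        return False
    return all(abs(live[k] - stored[k]) <= tol_mm for k in live)
```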
According to one embodiment, the electronic device 110 (e.g., a processor 214) may output a message indicating failure in authentication (S318). The electronic device 110 (e.g., a processor 214) may determine that the first image and the second image differ and identify failure in authentication, when a difference between the feature points is out of the predetermined range (e.g., 5 mm). When the authentication (e.g., primary authentication) of the user fails, the electronic device 110 (e.g., a processor 214) may display a message indicating the failure in the authentication through the display 212a, or output the message as a voice through the speaker 212b.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may transmit a signal for controlling an operation of the door opening/closing device 120 to the door opening/closing device 120 (S320). When determining that the authentication (e.g., primary authentication) succeeded in the above step (S316), the electronic device 110 (e.g., a processor 214) may transmit a signal for controlling an operation of the door opening/closing device 120 to the door opening/closing device 120 (e.g., a communicator 221). The electronic device 110 (e.g., a processor 214) may transmit different signals to the door opening/closing device 120 (e.g., a communicator 221) depending on whether the authentication (e.g., primary authentication) succeeds.
According to one embodiment, the door opening/closing device 120 (e.g., a processor 226) may unlock the door and open the handle (S322). The door opening/closing device 120 (e.g., a processor 226) may control and open the lock part 225 and start to open the handle 230 when receiving a signal indicating that the authentication (e.g., primary authentication) succeeds through the communicator 221 from the electronic device 110 (e.g., a communicator 210). For example, the door opening/closing device 120 (e.g., a processor 226) may drive the motor 224 such that the handle 230 escapes from the body 220 while rotating clockwise or counterclockwise. Alternatively, the door opening/closing device 120 (e.g., a processor 226) may be kept locked by the lock part 225, when receiving a signal indicating that the authentication (e.g., primary authentication) fails through the communicator 221 from the electronic device 110 (e.g., a communicator 210).
According to one embodiment, the door opening/closing device 120 (e.g., a processor 226) may sense the closing of the door (S324). The door opening/closing device 120 (e.g., a processor 226) may sense that the door 130 is closed after the door 130 is opened. Alternatively, the door opening/closing device 120 (e.g., a processor 226) may receive a signal in relation to closing of the door 130 from the electronic device 110 and determine the closing of the door 130.
According to one embodiment, the door opening/closing device 120 (e.g., a processor 226) may sterilize the handle (S326). When determining that the door 130 is closed after the door 130 is opened, the door opening/closing device 120 (e.g., a processor 226) may sterilize the door opening/closing device 120 (e.g., a handle 230) through the UVC sterilizer 231. Alternatively, the door opening/closing device 120 (e.g., a processor 226) may receive a signal based on the opening and closing of the door 130 from the electronic device 110, and, based on the received signal, sterilize the door opening/closing device 120 (e.g., a handle 230) through the UVC sterilizer 231.
FIG. 4 is a flow chart showing a process of authenticating a user in a smart door system according to another embodiment.
Below, the process of authenticating a user in the smart door system according to another embodiment is described with reference to FIG. 4.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may sense a user approaching the door (S410). The electronic device 110 (e.g., a processor 214) may sense whether a user approaches the door 130. The electronic device 110 (e.g., a processor 214) may sense a user (e.g., a person coming in and out) approaching the door 130 through the sensor assembly 219 (e.g., one or more of a distance measuring sensor 219a and a movement detecting sensor 219b).
According to one embodiment, the electronic device 110 (e.g., a processor 214) may obtain a first image of the user through a first camera 217a (S412). The electronic device 110 (e.g., a processor 214) may obtain an image of a body part (e.g., the face) of the user (e.g., Hong Gildong) approaching the door 130 through at least one camera (e.g., an RGB camera, and one or more IR cameras) of the camera assembly 216 disposed on the front surface of the electronic device 110.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may compare the obtained first image with a pre-stored second image and perform primary authentication of the user (S414). The electronic device 110 (e.g., a processor 214) may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through a first camera 217a (e.g., an RGB camera), compare the obtained first image with a second image (e.g., a second face image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and perform authentication of the user (e.g., Hong Gildong). Alternatively, the electronic device 110 (e.g., a processor 214) may obtain a first image (e.g., a first face image) of the user (e.g., Hong Gildong) through a second camera 217b (e.g., one or more IR cameras), compare the obtained first image with a second image (e.g., a second face image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and determine whether an authentication (e.g., primary authentication) of the user (e.g., Hong Gildong) is performed based on the results of the comparison.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may identify success in the primary authentication (S416). The electronic device 110 (e.g., a processor 214) may extract a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) from the first image, and compare the extracted feature point with a feature point (e.g., lengths of brows with respect to a length of the face, a distance between the brows, a size of the mouth, a size of the nose, sizes of the eyes and the like) of the second image. Additionally, the electronic device 110 (e.g., a processor 214) may determine that the two feature points are the same when a difference between the feature points is within a predetermined range (e.g., 5 mm), and determine that the authentication of the user succeeded. The predetermined range (e.g., 5 mm) may be variably adjusted.
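The threshold-based comparison in steps S414 and S416 can be illustrated with a short sketch. The per-feature tolerance model and the feature names below are assumptions for illustration; the description only states that a difference within a predetermined range (e.g., 5 mm) counts as a match.

```python
# Illustrative sketch of the feature-point comparison in S414-S416.
# Feature names and the per-feature tolerance are assumed, not device specs.

def primary_auth(first_features: dict, second_features: dict,
                 tolerance_mm: float = 5.0) -> bool:
    """Authenticate when every stored facial feature matches within tolerance."""
    return all(
        abs(first_features[name] - second_features[name]) <= tolerance_mm
        for name in second_features
    )

# Pre-stored (second image) and freshly extracted (first image) features, in mm:
stored = {"brow_length": 41.0, "brow_distance": 32.0,
          "mouth_size": 50.0, "nose_size": 38.0, "eye_size": 24.0}
captured = {"brow_length": 43.5, "brow_distance": 30.0,
            "mouth_size": 52.0, "nose_size": 37.0, "eye_size": 25.5}

print(primary_auth(captured, stored))                    # True: all diffs <= 5 mm
print(primary_auth(captured, stored, tolerance_mm=1.0))  # False: tighter range
```

Passing the tolerance as a parameter mirrors the statement that the predetermined range (e.g., 5 mm) may be variably adjusted.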
According to one embodiment, the electronic device 110 (e.g., a processor 214) may output a message indicating failure in authentication (S418). The electronic device 110 (e.g., a processor 214) may determine that the first image and the second image differ, and identify that the authentication failed, when a difference between the feature points is out of the predetermined range (e.g., 5 mm). When the authentication (e.g., primary authentication) of the user fails, the electronic device 110 (e.g., a processor 214) may display a message indicating the failure in the authentication through the display 212a, or output the message as a voice through the speaker 212b.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may activate the second camera 217b and display a guide message (S420). The electronic device 110 (e.g., a processor 214) may compare the feature point of the first image and the feature point of the second image, and then when determining that the authentication (e.g., primary authentication) succeeded, activate one or more of the IR cameras 217b to perform additional authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong). Additionally, the electronic device 110 (e.g., a processor 214) may display a guide message (e.g., Spread your palm before the camera.) encouraging the user (e.g., Hong Gildong) to place a body part (e.g., the palm) in the right position with respect to the second camera 217b through the display 212a, or output the guide message as a voice through the speaker 212b.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may obtain a third image (S422). The electronic device 110 (e.g., a processor 214) may obtain a third image of a body part (e.g., the palm) of the user (e.g., Hong Gildong). The obtained third image may include information on veins (e.g., a thickness, direction and the like of the veins) in the palm of the user (e.g., Hong Gildong). For example, unless the third image includes the information on veins, the electronic device 110 (e.g., a processor 214) may output the guide message (e.g., Spread your palm before the camera.) again. Further, unless the third image includes the information on veins, the electronic device 110 (e.g., a processor 214) may recapture an image of a body part (e.g., the palm) of the user (e.g., Hong Gildong) through the second camera 217b.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may compare the obtained third image and a pre-stored fourth image, and perform secondary authentication of the user (S424). The electronic device 110 (e.g., a processor 214) may obtain a third image (e.g., a first palm image) of the user (e.g., Hong Gildong) through the second camera 217b (e.g., one or more IR cameras), compare the obtained third image and a fourth image (e.g., a second palm image) of the user (e.g., Hong Gildong) pre-stored in the storage 213, and perform authentication of the user (e.g., Hong Gildong).
According to one embodiment, the electronic device 110 (e.g., a processor 214) may identify success in the secondary authentication (S426). The electronic device 110 (e.g., a processor 214) may extract a feature point (e.g., a length of the veins, a distance between the veins, a thickness of the veins and the like) from the third image, and compare the extracted feature point and a feature point (e.g., a length of the veins, a distance between the veins, a thickness of the veins and the like) of the fourth image. Additionally, the electronic device 110 (e.g., a processor 214) may determine that the two feature points are the same and that the authentication (e.g., secondary authentication) of the user (e.g., Hong Gildong) succeeded, when a difference between the feature points is within a predetermined range (e.g., 2 mm).
According to one embodiment, the electronic device 110 (e.g., a processor 214) may transmit a signal for controlling an operation of the door opening/closing device to the door opening/closing device (S428). When determining that the authentication (e.g., secondary authentication) succeeded in the above step (S426), the electronic device 110 (e.g., a processor 214) may transmit a signal for controlling an operation of the door opening/closing device 120 to the door opening/closing device 120 (e.g., a communicator 221). The electronic device 110 (e.g., a processor 214) may transmit different signals to the door opening/closing device 120 (e.g., a communicator 221) depending on whether the authentication (e.g., secondary authentication) succeeds.
According to one embodiment, the door opening/closing device 120 (e.g., a processor 226) may unlock the door and open the handle (S430). When receiving a signal indicating the authentication (e.g., secondary authentication) succeeded from the electronic device 110 (e.g., a communicator 210) through the communicator 221, the door opening/closing device 120 (e.g., a processor 226) may control and open the lock part 225 and start to open the handle 230. For example, the door opening/closing device 120 (e.g., a processor 226) may drive the motor 224 such that the handle 230 escapes from the body 220 while rotating. Additionally, when receiving a signal indicating that the authentication (e.g., secondary authentication) failed from the electronic device 110 (e.g., a communicator 210) through the communicator 221, the door opening/closing device 120 (e.g., a processor 226) may be kept locked by the lock part 225.
According to one embodiment, the door opening/closing device 120 (e.g., a processor 226) may sense the closing of the door (S432). The door opening/closing device 120 (e.g., a processor 226) may sense the opening and closing of the door 130. Alternatively, the door opening/closing device 120 (e.g., a processor 226) may receive a signal in relation to closing of the door 130 from the electronic device 110, and determine that the door 130 is closed.
According to one embodiment, the door opening/closing device 120 (e.g., a processor 226) may sterilize the handle (S434). When determining that the door is closed after the door is opened, the door opening/closing device 120 (e.g., a processor 226) may sterilize the door opening/closing device 120 (e.g., a handle 230) through the UVC sterilizer 231. Alternatively, the door opening/closing device 120 (e.g., a processor 226) may receive a signal based on the opening and closing of the door 130 from the electronic device 110, and, based on the received signal, sterilize the door opening/closing device 120 (e.g., a handle 230) through the UVC sterilizer 231.
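The ordering of the FIG. 4 flow just described can be summarized in a short sketch: the second camera 217b is activated and a palm image captured (S420 to S422) only after primary face authentication succeeds (S416), and the unlock signal (S428, S430) is sent only when both stages pass. The function names below are hypothetical placeholders, not part of the device software.

```python
# Hedged sketch of the two-stage authentication ordering in FIG. 4.
# capture_and_match_veins is a zero-argument callable so the palm image is
# only captured when primary authentication has already succeeded.

def smart_door_flow(face_match: bool, capture_and_match_veins) -> str:
    if not face_match:
        return "fail"            # S418: output authentication-failure message
    if not capture_and_match_veins():
        return "fail"            # secondary (vein) authentication failed
    return "unlock"              # S428/S430: signal the door opening/closing device

print(smart_door_flow(True, lambda: True))    # unlock
print(smart_door_flow(False, lambda: True))   # fail (palm image never captured)
print(smart_door_flow(True, lambda: False))   # fail
```

Deferring the vein capture behind a callable reflects the description's point that the IR cameras are activated only for users who pass primary authentication.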
FIG. 5a is an exemplary view showing a screen on a display when an electronic device according to one embodiment does not sense an approach of a user. FIG. 5b is an exemplary view showing a user approaching an electronic device according to one embodiment. FIG. 5c is an exemplary view showing a screen on a display of an electronic device sensing an approach of a user according to one embodiment.
Referring to FIG. 5a, when the electronic device 110 according to one embodiment does not sense an approach of a user, the electronic device 110 may configure a screen 510 including various types of information and display the screen 510 on the display 212a. The screen 510 may include a first area 511 and a second area 512.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display, in the first area 511 of the screen 510, one or more of a state message (e.g., Do not disturb.) set by a user, a message (e.g., Ring the door bell.) that is displayed when face recognition continues to fail with predetermined frequency (e.g., three times) or more, a message (e.g., Door opening/closing) in relation to the closing or opening of the door 130, and a message (e.g., In security mode) indicating an operation in a security mode.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display position information (e.g., unit No. 1302) which is set by the user, and a security mode icon in case a security mode is set, in the second area 512 of the screen 510.
According to one embodiment, one or more of the camera assembly 216 and the sensor assembly 219 of the electronic device 110 may be disposed in a third area (e.g., an upper side, a lower side, a left side, or a right side) of the electronic device 110. The first area 511 and the second area 512 may be areas on the display 212a, and the third area 513 may be an area on one side (e.g., the lower side) of the electronic device 110. For example, the camera assembly 216 of the electronic device 110 may be disposed on the lower side of the electronic device 110. The camera assembly 216 may include a compact CMOS camera demonstrator (C3D) camera. The C3D camera may extract a distance (e.g., a depth) from a user based on an active stereo method. For example, if the camera assembly 216 includes a C3D camera, a LED light 218a, a first IR camera 217b1, an RGB camera 217a, an IR pattern light 218b, and a second IR camera 217b2 may be consecutively disposed in the third area 513 from left to right. This disposition is only an example and may vary in the present disclosure. The camera assembly 216 may obtain and extract depth information through the first IR camera 217b1, the IR pattern light 218b, and the second IR camera 217b2.
According to one embodiment, the first IR camera 217b1, which is used as a reference camera in extracting depth information from an image obtained from a user, may be disposed near the RGB camera 217a for matching with an image obtained by the RGB camera 217a, for example.
According to one embodiment, the IR pattern light 218b may be disposed between the two IR cameras 217b1, 217b2 to ensure depth performance, and may emit light when the two IR cameras 217b1, 217b2 obtain an image, for example. Additionally, a distance between the two IR cameras 217b1, 217b2 may be determined considering the depth performance (e.g., a distance resolution capability). The two IR cameras 217b1, 217b2 are spaced apart from each other by a predetermined distance (e.g., 50 mm), and the two IR cameras 217b1, 217b2 may obtain high-quality depth results in a range of 30 cm to 2 m. The predetermined distance (e.g., 50 mm) may be variably adjusted.
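The relation between the camera baseline and recoverable depth can be sketched with the classic pinhole stereo formula, depth = focal length x baseline / disparity. The focal length and disparity values below are illustrative assumptions, not device specifications for the IR cameras 217b1, 217b2.

```python
# Minimal sketch of how a stereo pair with a fixed baseline recovers depth,
# as with the two IR cameras spaced about 50 mm apart. All numeric values
# are assumed for illustration.

def stereo_depth_mm(focal_px: float, baseline_mm: float,
                    disparity_px: float) -> float:
    """Pinhole stereo relation: depth = focal_length * baseline / disparity."""
    return focal_px * baseline_mm / disparity_px

# With a 50 mm baseline, nearer objects produce larger disparities:
print(stereo_depth_mm(focal_px=600.0, baseline_mm=50.0, disparity_px=100.0))  # 300.0 mm
print(stereo_depth_mm(focal_px=600.0, baseline_mm=50.0, disparity_px=15.0))   # 2000.0 mm
```

The example values span roughly the 30 cm to 2 m working range mentioned above; a wider baseline improves distance resolution at far range at the cost of a larger minimum distance.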
According to one embodiment, the third area 513 may further include a distance measuring sensor 219a, a movement detecting sensor 219b, an illuminance sensor 219c, and a speaker 212b. Alternatively, the distance measuring sensor 219a, the movement detecting sensor 219b, the illuminance sensor 219c, and the speaker 212b may be disposed in an area different from the third area 513 on the front surface of the electronic device 110.
Referring to FIGS. 5b and 5c, when sensing a user 520 in a state where the electronic device 110 displays the screen 510 illustrated in FIG. 5a, the electronic device 110 may configure the screen 530 in FIG. 5c and display the screen 530 on the display 212a.
According to one embodiment, the screen 530 may include a first icon 531, a first area 532, a second area 533, and a second icon 534. The first icon 531 (e.g., a lock) may denote success/failure in authentication in relation to face recognition. For example, when the authentication fails three times or more, the first icon 531 may be kept displayed. For example, after the authentication fails five times, the first icon 531 may not be displayed for 30 seconds. Additionally, when the first icon 531 is pressed after 30 seconds, a reauthentication procedure may be performed.
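The retry and lockout behavior around the first icon 531 can be sketched as a small counter with a simulated clock. This is a hypothetical illustration of the stated behavior (icon kept displayed from the third failure, hidden for 30 seconds after the fifth failure, reauthentication possible afterwards); the class and method names are assumptions.

```python
# Hypothetical sketch of the failure counting and 30-second lockout around
# the first icon 531. The clock is injected so the example is testable.

class AuthLockout:
    def __init__(self, now_fn, lockout_s: float = 30.0):
        self.now = now_fn            # zero-arg callable returning seconds
        self.lockout_s = lockout_s
        self.failures = 0
        self.locked_at = None

    def record_failure(self):
        self.failures += 1
        if self.failures >= 5:
            self.locked_at = self.now()   # fifth failure starts the lockout

    def icon_visible(self) -> bool:
        if self.locked_at is not None:
            # Hidden during lockout; shown again once 30 s have elapsed.
            return self.now() - self.locked_at >= self.lockout_s
        return self.failures >= 3         # kept displayed from third failure

t = [0.0]
lk = AuthLockout(now_fn=lambda: t[0])
for _ in range(3):
    lk.record_failure()
print(lk.icon_visible())   # True: icon kept displayed after three failures
lk.record_failure(); lk.record_failure()
print(lk.icon_visible())   # False: hidden during the 30 s lockout
t[0] = 31.0
print(lk.icon_visible())   # True: reauthentication possible after 30 s
```

The same counter shape would back the "30 seconds is being counted" messages described later for the second areas of the authentication screens.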
According to one embodiment, various types of information, based on recognizing a user, may be displayed in the first area 532 and the second area 533. For example, when a user is recognized, a guide message (e.g., Welcome., Your face will be captured., and the like) may be displayed in the first area 532 and the second area 533.
According to one embodiment, the second icon 534 may perform the function of a door bell. For example, when sensing a touch on the second icon 534, the electronic device 110 may be configured to output a sound indicating a visit through a speaker (not illustrated) in a home.
FIG. 6a is an exemplary view showing a process of performing authentication using an image obtained by an electronic device according to one embodiment from a user. FIG. 6b is an exemplary view indicating a result that the electronic device according to one embodiment performs an authentication of a user by using an image obtained from a user.
Referring to FIG. 6a, the electronic device according to one embodiment may configure a screen 610 performing an authentication by using an image obtained from a user and display the configured screen 610 through the display 212a. The screen 610 may include a first area 611, a second area 612, a third area 613, and a settings menu 614.
According to one embodiment, when an image (e.g., a first image) is obtained from a body part (e.g., the face) of a user 520, the electronic device 110 (e.g., a processor 214) may compare the obtained first image and an image (e.g., a second image) pre-stored in the storage 213 and perform authentication (e.g., primary authentication). The electronic device 110 (e.g., a processor 214) may display the rate at which the authentication procedure is currently progressing (e.g., 50 % of authentication performed) in the first area 611 of the screen 610. If authentication (e.g., primary authentication) fails, the electronic device 110 (e.g., a processor 214) may display a message indicating failure in the authentication (e.g., primary authentication) in the first area 611. Alternatively, the electronic device 110 (e.g., a processor 214) may display information on closing/opening of the door 130 in the first area 611.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display an authentication guide image or various types of information in relation to authentication in a second area 612. The electronic device 110 (e.g., a processor 214) may display a guide in the second area 612 for a user unfamiliar with the smart door system 100. The electronic device 110 (e.g., a processor 214) may display a preview image obtained from a user or information on an authentication process in the second area 612.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display various types of guide phrases in relation to a current authentication procedure in a third area 613.
According to one embodiment, when the electronic device 110 (e.g., a processor 214) senses that the settings menu 614 is selected, the electronic device 110 (e.g., a processor 214) may go into a setting mode in relation to the electronic device 110. Instructions in relation to one or more functions of the electronic device 110, such as a change in a password, storage or renewal of a user's image and the like may be input in the setting mode.
Referring to FIG. 6b, the electronic device according to one embodiment may configure a screen 620 where authentication is performed by using an image obtained from a user, and display the screen through the display 212a. The screen 620 may include a first area 621, a second area 622, a third area 623, and a settings menu 614.
According to one embodiment, when authentication (e.g., primary authentication) is completed, the electronic device 110 (e.g., a processor 214) may display a message (e.g., Authentication completed), indicating that the authentication (e.g., primary authentication) is completed, in the first area 621.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display an image (e.g., an image preset by a user) of an authenticated user (e.g., an authenticated person who comes in and out) in the second area 622. For example, when authentication fails, the electronic device 110 (e.g., a processor 214) may display an authentication reattempt button (not illustrated) or an authentication cancelation button (not illustrated) in the second area 622. The electronic device 110 (e.g., a processor 214) may display a profile image of a user in the second area 622. For example, when authentication fails with predetermined frequency (e.g., three times) or more, the electronic device 110 (e.g., a processor 214) may display information indicating that predetermined time (e.g., 30 seconds) is being counted in the second area 622.
According to one embodiment, when authentication succeeds, the electronic device 110 (e.g., a processor 214) may display various types of information in relation to success in the authentication, such as the name of a user (e.g., Hong Gildong), a welcome message (e.g., Welcome.) and the like, in the third area 623.
FIG. 7a is an exemplary view showing a screen where an electronic device according to one embodiment performs authentication using a face image obtained from a user. FIG. 7b is an exemplary view showing a result of primary authentication performed by an electronic device according to one embodiment using an image obtained from a user. FIG. 7c is an exemplary view where an electronic device according to one embodiment obtains an image of a palm of a user. FIG. 7d is an exemplary view showing a preview image of a palm of a user obtained by an electronic device according to one embodiment. FIG. 7e is an exemplary view showing a result of secondary authentication performed by an electronic device according to one embodiment.
Referring to FIG. 7a, the electronic device 110 according to one embodiment may configure a screen 710 performing an authentication by using a face image obtained from a user, and may display the configured screen 710 through the display 212a. The screen 710 may include a first area 711, a second area 712, a third area 713 and an icon 714. Information displayed in the first area 711 may be identical or similar to information displayed in the first area 611 of the screen 610 in FIG. 6a.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display an authentication guide image or information in the second area 712. The electronic device 110 (e.g., a processor 214) may display a guide for a user unfamiliar with the smart door system 100 in the second area 712. Alternatively, the electronic device 110 (e.g., a processor 214) may display a preview image 712a obtained from the user 520 and information in relation to an authentication procedure in the second area 712.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display various types of guiding phrases in relation to current authentication in the third area 713.
According to one embodiment, the icon 714 may perform a function as a door bell. For example, when sensing a touch on the icon 714 functioning as a door bell, the electronic device 110 (e.g., a processor 214) may transmit a control signal to a speaker (not illustrated) in the home to output a sound indicating a visit through the speaker (not illustrated).
Referring to FIG. 7b, the electronic device 110 according to one embodiment may configure a screen 720 showing primary authentication being performed using the face image obtained from the user, and display the configured screen 720 through the display 212a. The screen 720 may include a first area 721, a second area 722, a third area 723 and a settings menu 614.
According to one embodiment, when the authentication (e.g., primary authentication) is completed, the electronic device 110 (e.g., a processor 214) may display a message (e.g., Face recognition completed) indicating that the authentication is completed in the first area 721.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display an image (e.g., an image preset by the user) of the authenticated user (e.g., primarily authenticated user) in the second area 722. The electronic device 110 (e.g., a processor 214) may display information indicating that additional authentication (e.g., secondary authentication) is to be performed in the second area 722. The electronic device 110 (e.g., a processor 214) may display a guide for authentication of veins in the second area 722.
According to one embodiment, when authentication fails, the electronic device 110 (e.g., a processor 214) may display an authentication reattempt button or an authentication cancelation button in the second area 722. For example, when authentication fails with predetermined frequency (e.g., three times) or more, the electronic device 110 (e.g., a processor 214) may indicate predetermined time (e.g., 30 seconds) is being counted, in the second area 722. The predetermined frequency (e.g., three times) may be variably adjusted.
According to one embodiment, when authentication succeeds, the electronic device 110 (e.g., a processor 214) may display a message (e.g., Align the palm onto the camera below the screen.) encouraging the user to place a body part (e.g., the palm) over one or more of the first IR camera 217b1, the RGB camera 217a and the second IR camera 217b2 of the camera assembly 216, in the third area 723, for secondary authentication.
Referring to FIGS. 7c and 7d, the electronic device according to one embodiment may use two or more cameras to obtain an image of the body part 734 of the user. For example, the electronic device 110 (e.g., a processor 214) may obtain an image of the body part 734 of the user using one or more of the two IR cameras 217b1, 217b2 and the RGB camera 217a. The electronic device 110 (e.g., a processor 214) may obtain an image of the body part 734 (e.g., the palm 735) of the user depending on surrounding brightness through one or more of the two IR cameras 217b1, 217b2 and the RGB camera 217a.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display a message (e.g., Authentication is underway.) indicating authentication (e.g., secondary authentication) of the user is being performed in the first area 731. Additionally, the electronic device 110 (e.g., a processor 214) may display a preview image 742 of the obtained body part 734 of the user in the second area 732. Further, the electronic device 110 (e.g., a processor 214) may display a guide line 741 along which the preview image 742 is moved, along with the preview image 742, in the second area 732. The guide line 741 may denote a guide that encourages the user to move the body part (e.g., the palm).
Referring to FIG. 7e, the electronic device according to one embodiment may configure a screen 750 regarding the authentication (e.g., secondary authentication) being performed by using an image obtained from the user, and display the screen 750 through the display 212a. The screen 750 may include a first area 751, a second area 752, a third area 753 and a settings menu 754.
According to one embodiment, when authentication (e.g., secondary authentication) is completed, the electronic device 110 (e.g., a processor 214) may display a message (e.g., Authentication completed) indicating the authentication is completed, in the first area 751.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display an image (e.g., an image preset by a user) of the authenticated user, in the second area 752. For example, when authentication fails, the electronic device 110 (e.g., a processor 214) may display an authentication reattempt button or an authentication cancelation button in the second area 752. The electronic device 110 (e.g., a processor 214) may display a profile image of a user in the second area 752. For example, when authentication fails with predetermined frequency (e.g., three times) or more, the electronic device 110 (e.g., a processor 214) may display a message indicating that predetermined time (e.g., 30 seconds) is being counted, in the second area 752. The predetermined frequency (e.g., three times) and the predetermined time (e.g., 30 seconds) may be variably adjusted.
According to one embodiment, when authentication (e.g., secondary authentication) succeeds, the electronic device 110 (e.g., a processor 214) may display various types of information such as the name (e.g., Hong Gildong) of a user, a welcome message (e.g., Welcome.) and the like in relation to success in authentication in the third area 753.
According to one embodiment, when sensing the settings menu is selected, the electronic device 110 (e.g., a processor 214) may go into a setting mode in relation to the electronic device 110. The setting mode may denote a mode in which an instruction for one or more functions of the electronic device 110, such as a change in the password, storage or renewal of an image of the user and the like, is input.
FIG. 8a is an exemplary view showing a screen indicating a touch gesture being input from a user through an electronic device according to one embodiment. FIG. 8b is an exemplary view showing a screen to which a password is input from a user in an electronic device according to one embodiment. FIG. 8c is an exemplary view showing a result of authentication performed by an electronic device according to one embodiment using an input password.
Referring to FIG. 8a, the electronic device according to one embodiment may configure a screen 710 to which a touch gesture is input from the user, and display the screen 710 through the display 212a. The screen 710 may be partially similar to the screen 710 in FIG. 7a. The second area 712 of the screen 710 in FIG. 7a and the second area 712 of the screen 710 in FIG. 8a may be partially different in that, while the second area 712 of the screen 710 in FIG. 7a displays a preview image (e.g., a face image) obtained from the user, the second area 712 of the screen 710 in FIG. 8a may receive a touch and drag of the user.
According to one embodiment, when receiving an input based on the touch and drag where a finger touches and moves from a first position 801 to a second position 802 on the second area 712, the electronic device 110 (e.g., a processor 214) may recognize the input as an instruction for inputting a password. When receiving the instruction based on the touch and drag, the electronic device 110 (e.g., a processor 214) may display a screen to which a password for opening the door 130 is input, on the display 212a. An input based on a touch on the left and a drag to the right is provided only as an example, and the touch and drag may vary depending on settings of the user in the present disclosure.
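Recognizing the left-to-right touch and drag as a request to open the password screen can be sketched as below. The minimum drag distance is an assumed parameter; the description only states that the gesture may vary depending on settings of the user.

```python
# Hedged sketch of recognizing the left-to-right touch and drag in FIG. 8a
# as a password-entry instruction. The 100-pixel threshold is an assumption.

def is_password_gesture(start_xy, end_xy, min_dx: float = 100.0) -> bool:
    """True for a predominantly horizontal, left-to-right drag."""
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    return dx >= min_dx and abs(dx) > abs(dy)

print(is_password_gesture((50, 200), (300, 210)))   # True: left-to-right drag
print(is_password_gesture((300, 200), (50, 210)))   # False: wrong direction
print(is_password_gesture((50, 200), (80, 400)))    # False: mostly vertical
```

Keeping the direction and distance configurable mirrors the point that the accepted gesture may vary with user settings.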
Referring to FIG. 8b, the electronic device 110 (e.g., a processor 214) may configure a screen 810 to which a password for opening the door 130 is input, and display the screen 810 through the display 212a, based on the touch and drag.
According to one embodiment, the screen 810 may include a first area 811 in which a message for guiding an input of a password is displayed, a second area 812 in which the input password is displayed, and a third area 813 in which a virtual keypad 814 is displayed to receive the password from the user. The electronic device 110 (e.g., a processor 214) may encode numbers, symbols, characters or special characters, input through the virtual keypad 814, and display the encoded numbers, symbols, characters or special characters in the second area 812.
Referring to FIG. 8c, the electronic device according to one embodiment may configure a screen 820 where authentication has been performed using the password input from the user, and display the screen 820 through the display 212a. The screen 820 may include a first area 821, a second area 822, a third area 823 and a settings menu 824.
According to one embodiment, when authentication based on the input of the password is completed, the electronic device 110 (e.g., a processor 214) may display a message (e.g., Authentication completed) indicating the authentication is completed, in the first area 821.
According to one embodiment, the electronic device 110 (e.g., a processor 214) may display an image (e.g., an image preset by the user) of the authenticated user, in the second area 822. For example, when the authentication fails, the electronic device 110 (e.g., a processor 214) may display an authentication reattempt button (not illustrated) or an authentication cancelation button (not illustrated), in the second area 822. The electronic device 110 (e.g., a processor 214) may display a profile image of the user in the second area 822. For example, when the authentication fails with predetermined frequency (e.g., three times) or more, the electronic device 110 (e.g., a processor 214) may indicate that predetermined time (e.g., 30 seconds) is being counted, in the second area 822. The predetermined frequency (e.g., three times) and the predetermined time (e.g., 30 seconds) may be variably adjusted.
According to one embodiment, when the authentication based on the input of the password succeeds, the electronic device 110 (e.g., a processor 214) may display various types of information in relation to success in authentication such as a name of the user (e.g., Hong Gildong), a welcome message (e.g., Welcome.) and the like, in the third area 823. The method of authentication with a password may be applied when an image (e.g., a face image, biometric information image and the like) for authentication is not pre-stored (e.g., when the door is opened for a guest who makes a visit to an absent house).
According to one embodiment, when sensing that the settings menu 824 is selected, the electronic device 110 (e.g., a processor 214) may enter a settings mode of the electronic device 110. The settings mode may denote a mode in which an instruction for one or more functions of the electronic device 110, such as a change of the password, storage or renewal of an image of the user and the like, is input.
FIG. 9a is an exemplary view showing a handle of a door opening/closing device, being separated from a body, based on authentication of a user according to one embodiment. FIG. 9b is an exemplary view showing a handle of a door opening/closing device, completely separated from a body, based on authentication of a user according to one embodiment. FIG. 9c is an exemplary view showing a handle of a door opening/closing device according to one embodiment, gripped by a user.
Referring to FIGS. 9a to 9c, when authentication (e.g., primary authentication, secondary authentication or authentication based on a password) of the user is completed by the electronic device 110 (e.g., a processor 214), the door opening/closing device 120 (e.g., a communicator 221) may receive a signal for releasing the lock of the door opening/closing device 120 from the electronic device 110 (e.g., a communicator 210). When the signal is received through the communicator 221, the door opening/closing device 120 (e.g., a processor 226) may control the motor 224 such that the handle 230 is separated from the body 220.
According to one embodiment, when the handle 230 is completely separated from the body 220, the door opening/closing device 120 (e.g., a processor 226) may control the lock part 225 to unlock the door opening/closing device 120. Alternatively, the door opening/closing device 120 (e.g., a processor 226) may control the motor 224 such that the handle 230 is separated from the body 220 after unlocking the door opening/closing device 120 as a result of control over the lock part 225. Additionally, the door opening/closing device 120 (e.g., a processor 226) may emit light through an LED of the light emitter 232 of the handle 230, to allow the user to recognize the unlocking of the door 130.
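The unlock sequence above can be sketched as a small state model. This is illustrative only; the class and attribute names are hypothetical stand-ins for the motor 224, lock part 225 and light emitter 232, and the order of the separate/unlock steps may be swapped, as the paragraph notes.

```python
# Illustrative sketch of the release sequence (not the actual firmware):
# on the release signal, drive the motor to separate the handle, release
# the lock part, and light the handle LED so the user can recognize the
# unlocking of the door.
class DoorOpeningClosingDevice:
    def __init__(self):
        self.handle_separated = False  # handle 230 flush with body 220
        self.locked = True             # lock part 225 engaged
        self.led_on = False            # light emitter 232 off

    def on_release_signal(self) -> None:
        self.handle_separated = True   # processor 226 controls motor 224
        self.locked = False            # processor 226 releases lock part 225
        self.led_on = True             # LED indicates unlocking to the user
```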
According to one embodiment, the door opening/closing device 120 (e.g., a processor 226) may sense a touch of the body part (e.g., the hand 910) of the user on the handle 230 through the sensor assembly 233. The door opening/closing device 120 (e.g., a processor 226) may sense that the user grips the handle 230 with the hand 910, through the sensor assembly 233. For example, when the user pulls the handle 230 in a direction (i.e., a direction where the user is positioned) opposite to the body 220 after gripping the handle 230, the door 130 may be opened by the force applied by the user. When the door 130 is closed after the user comes in and out through the door 130, the door opening/closing device 120 (e.g., a processor 226) may sense the closing of the door 130 through the sensor assembly 223. Additionally, the door opening/closing device 120 (e.g., a processor 226) may start to sterilize the handle 230 through the UVC sterilizer 231.
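The close-and-sterilize behavior described above (together with the re-locking recited in claim 14) can be sketched as follows. The class and attribute names are hypothetical and not part of the disclosure.

```python
# Illustrative sketch (not the actual firmware): when the sensor assembly
# reports that the door is closed, the device re-locks and starts UVC
# sterilization of the handle via the UVC sterilizer 231.
class HandleSterilizationController:
    def __init__(self):
        self.locked = False        # door open and unlocked after user entry
        self.sterilizing = False

    def on_door_closed(self) -> None:
        self.locked = True         # lock on sensing the closing of the door
        self.sterilizing = True    # then sterilize the handle with UVC LEDs

    def on_sterilization_done(self) -> None:
        self.sterilizing = False   # sterilization cycle complete
```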
FIG. 10(a) is an exemplary view showing a screen, indicating that a handle is currently being sterilized and displayed by an electronic device according to one embodiment. FIG. 10(b) is an exemplary view showing a door opening/closing device being sterilized according to one embodiment.
Referring to FIG. 10(a), the electronic device 110 (e.g., a processor 214) may display a screen 1010 indicating that the handle 230 is currently being sterilized, through the display 212a. The screen 1010 may include a first area 1011 in which a message indicating that the handle 230 is currently being sterilized is displayed, a second area 1012 in which position information (e.g., unit No. 1302) set by the user and a security mode icon are displayed, and a third area 1013 indicating a door bell.
Referring to FIG. 10(b), the electronic device 110 (e.g., a processor 214) may cause an LED disposed on one side (e.g., an edge) of the handle 230 to emit light in a predetermined color (e.g., violet), to show that the handle 230 is currently being sterilized.
The steps in each of the flow charts described above may be performed in an order different from that illustrated, or may be performed at the same time. Further, one or more components, and one or more operations performed by one or more of the components in the disclosure, may be implemented as hardware and/or software.
The embodiments are described above with reference to a number of illustrative embodiments thereof. However, the present disclosure is not limited to the embodiments and drawings set forth herein, and numerous other modifications and embodiments can be devised by one skilled in the art without departing from the technical spirit of the disclosure. Further, effects predictable from the configurations in the disclosure are to be included within the scope of the disclosure even though not explicitly described in the description of the embodiments.

Claims (15)

  1. An electronic device, comprising:
    a display;
    a camera assembly;
    a communicator;
    a storage; and
    a processor electrically connected to the display, the camera assembly, the communicator, and the storage,
    wherein the processor is configured to:
    activate a first camera of the camera assembly based on recognizing a user,
    obtain a first image of the user through the activated first camera,
    perform authentication of the user based on the obtained first image and a second image being pre-stored in the storage, and
    transmit a signal for controlling an operation of a door opening/closing device disposed in a door to the door opening/closing device through the communicator, based on the authentication of the user.
  2. The electronic device of claim 1, wherein the processor is configured to transmit a signal for unlocking the door opening/closing device to the door opening/closing device through the communicator, based on a success of the authentication of the user.
  3. The electronic device of claim 1, wherein the processor is configured to:
    identify whether a signal in relation to closing of the door is received from the door opening/closing device through the communicator, and
    transmit a signal for controlling sterilization of the door opening/closing device to the door opening/closing device through the communicator, based on receiving the signal in relation to closing of the door.
  4. The electronic device of claim 1, wherein the processor is configured to:
    activate a second camera of the camera assembly, based on a success of the authentication of the user, and
    output a message indicating movement of a body part of the user to a position of the activated second camera through the display.
  5. The electronic device of claim 4, wherein the processor is configured to:
    display, through the display, a guide line for guiding the body part to an area where the body part is supposed to be positioned, while the message is being output,
    obtain a preview image of the body part through the first camera, and
    display visual information through the display based on whether the obtained preview image and the displayed guide line match.
  6. The electronic device of claim 5, wherein the processor is configured to obtain information on veins of the body part through the activated second camera, based on matching the obtained preview image and the displayed guide line.
  7. The electronic device of claim 5, wherein the processor is configured to:
    obtain a third image of the body part through the activated second camera, and
    perform authentication of the user based on the obtained third image and a fourth image pre-stored in the storage.
  8. The electronic device of claim 7, wherein the processor is configured to:
    identify brightness around the electronic device, and
    select at least one of the first camera and the second camera based on the identified surrounding brightness, and
    wherein the first camera or the second camera is a red-green-blue (RGB) camera or an infrared camera.
  9. The electronic device of claim 1, wherein the processor is configured to output a message in relation to success in the authentication of the user through the display when the authentication of the user succeeds.
  10. The electronic device of claim 1, wherein the processor is configured to output a message in relation to failure in the authentication of the user through the display when the authentication of the user fails.
  11. The electronic device of claim 1, wherein the processor is configured to display, on the display, at least one piece of information on one or more operation states of the electronic device and one or more operation states of the door opening/closing device.
  12. A door opening/closing device, comprising:
    a body attached to a first side of a door and configured to protrude; and
    a handle operatively connected to the door opening/closing device,
    wherein the body comprises:
    a communicator;
    a motor; and
    a processor electrically connected to the communicator and the motor, and
    wherein the processor is configured to:
    receive, through the communicator, a signal for controlling the door opening/closing device from an electronic device disposed near the door, and
    unlock the door opening/closing device based on the received control signal.
  13. The door opening/closing device of claim 12, wherein the processor is configured to control the motor such that the handle protrudes from the body based on the received control signal.
  14. The door opening/closing device of claim 12, wherein the processor is configured to:
    lock the door opening/closing device when sensing closing of the door, and
    sterilize the door opening/closing device through at least one ultraviolet C light emitting diode (UVC LED).
  15. A method for controlling opening and closing of a door, comprising:
    by an electronic device,
    activating a first camera based on recognizing a user;
    obtaining a first image of the user through the activated first camera;
    performing authentication of the user based on the obtained first image and a pre-stored second image;
    transmitting, to a door opening/closing device, a signal for controlling an operation of the door opening/closing device disposed in the door based on the authentication of the user; and
    by the door opening/closing device,
    receiving a signal for controlling the door opening/closing device from the electronic device; and
    unlocking the door opening/closing device based on the received control signal.
PCT/KR2021/008436 2020-08-28 2021-07-02 Electronic device for controlling opening and closing of a door, a door opening/closing device disposed to the door, and a method therefor WO2022045562A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200109643A KR20220028528A (en) 2020-08-28 2020-08-28 An electronic device for controlling opening and closing of a door and a door opening/closing device disposed in the door, and method therefor
KR10-2020-0109643 2020-08-28

Publications (1)

Publication Number Publication Date
WO2022045562A1 true WO2022045562A1 (en) 2022-03-03

Family

ID=80353414

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/008436 WO2022045562A1 (en) 2020-08-28 2021-07-02 Electronic device for controlling opening and closing of a door, a door opening/closing device disposed to the door, and a method therefor

Country Status (2)

Country Link
KR (1) KR20220028528A (en)
WO (1) WO2022045562A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101316805B1 (en) * 2013-05-22 2013-10-11 주식회사 파이브지티 Automatic face tracing and recognition method and system
KR101550206B1 (en) * 2014-11-25 2015-09-07 (주)코리센 Finger Vein Recognition System having a Double Security Function and Controlling Method for the Same
KR102019342B1 (en) * 2018-03-19 2019-09-06 숭실대학교산학협력단 Digital Door-lock Management server and Opening/Closing Controlling System, Method for Controlling Digital Door-lock Communicating Smartphone
JP2020022675A (en) * 2018-08-08 2020-02-13 株式会社サンライズプロジェクト Door handle disinfection apparatus
KR20200070601A (en) * 2018-12-10 2020-06-18 엘지전자 주식회사 Door Rock

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102472122B1 (en) * 2022-03-04 2022-11-29 김정태 Door and the driving method thereof
WO2023242871A1 (en) * 2022-06-17 2023-12-21 Matdun Labs India Private Limited Device, system and method for 3d face recognition and access management door lock
WO2023242875A1 (en) * 2022-06-18 2023-12-21 Matdun Labs India Private Limited Device, system and method for 3d face recognition and authentication door lock
CN115359589A (en) * 2022-08-08 2022-11-18 珠海格力电器股份有限公司 Control method and device of intelligent door lock, electronic device and storage medium
CN115359589B (en) * 2022-08-08 2023-10-10 珠海格力电器股份有限公司 Control method and device of intelligent door lock, electronic device and storage medium

Also Published As

Publication number Publication date
KR20220028528A (en) 2022-03-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21861863; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21861863; Country of ref document: EP; Kind code of ref document: A1)