CN111704012A - User detection system of elevator - Google Patents

User detection system of elevator

Info

Publication number
CN111704012A
CN111704012A (Application CN201911164317.8A)
Authority
CN
China
Prior art keywords
detection
car
door
user
elevator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911164317.8A
Other languages
Chinese (zh)
Inventor
野田周平
横井谦太朗
木村纱由美
田村聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Elevator and Building Systems Corp
Original Assignee
Toshiba Elevator Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Elevator Co Ltd filed Critical Toshiba Elevator Co Ltd
Publication of CN111704012A publication Critical patent/CN111704012A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B13/00 Doors, gates, or other apparatus controlling access to, or exit from, cages or lift well landings
    • B66B13/24 Safety devices in passenger lifts, not otherwise provided for, for preventing trapping of passengers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66B ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B5/00 Applications of checking, fault-correcting, or safety devices in elevators
    • B66B5/0006 Monitoring devices or performance analysers
    • B66B5/0012 Devices monitoring the users of the elevator system

Landscapes

  • Indicating And Signalling Devices For Elevators (AREA)
  • Elevator Door Apparatuses (AREA)

Abstract

The present invention relates to a user detection system for an elevator, which can accurately detect a user or an object near a door without a plurality of sensors, and prevent accidents when the door is opened or closed. A user detection system for an elevator according to one embodiment includes an imaging unit, a detection area setting unit, and a detection processing unit. The imaging unit images a predetermined range including the vicinity of an entrance where a door opens and closes from inside the car. The detection region setting unit sets a 1 st detection region on a front pillar provided on at least one of both sides of the doorway on the captured image obtained by the imaging unit, and sets a 2 nd detection region on a car sill side of a floor surface of the car. The detection processing unit detects the presence or absence of a user or an object based on the images in the 1 st detection region and the 2 nd detection region set by the detection region setting unit.

Description

User detection system of elevator
The present application is based on Japanese Patent Application No. 2019-049876 (filing date: March 18, 2019) and claims priority from that application. The entire contents of that application are incorporated herein by reference.
Technical Field
Embodiments of the present invention relate to a user detection system for an elevator.
Background
When the doors of an elevator car open, a user's finger or the like may be pulled into the door pocket. To prevent such an accident, a known method is, for example, to provide a photoelectric sensor near the door pocket and issue a warning when a user or the like is detected near the door pocket.
However, with a sensor such as a photoelectric sensor, the sensor may react even when a user merely passes near the door pocket, so warnings can be issued too frequently. Fine adjustment of the installation position is therefore required to prevent erroneous detection. In addition, when the doors are of the double-opening type, door pockets exist on both sides of the car doorway, so sensors must be provided for each of them.
Further, when the doors are of the double-opening type, a catching accident may occur in which a hand or the like is caught between the door panels as the doors close. When the doors are of the single-opening type, a pulled-into-gap accident may occur in which a hand or the like is pulled into the gap between the door panels as the doors open. Detecting a user or an object near the door to prevent all of these accidents would normally require a plurality of sensors.
Disclosure of Invention
The present invention provides an elevator user detection system that can accurately detect a user or an object near the door without requiring a plurality of sensors, and can prevent accidents when the door opens and closes.
A user detection system for an elevator according to one embodiment includes an imaging unit, a detection area setting unit, and a detection processing unit.
The imaging unit images, from inside the car, a predetermined range including the vicinity of the doorway where the door opens and closes. On the captured image obtained by the imaging unit, the detection region setting unit sets a 1 st detection region on a front pillar provided on at least one of both sides of the doorway, and sets a 2 nd detection region on the car sill side of the floor surface of the car. The detection processing unit detects the presence or absence of a user or an object based on the images in the 1 st detection region and the 2 nd detection region set by the detection region setting unit.
According to the user detection system configured as described above, it is possible to accurately detect a user or an object located near the door without requiring a plurality of sensors, and to prevent an accident when the door is opened or closed.
Drawings
Fig. 1 is a diagram showing a configuration of an elevator user detection system according to embodiment 1.
Fig. 2 is a diagram showing a configuration of a portion around an entrance in the car in the embodiment.
Fig. 3 is a diagram showing an example of an image captured by the camera in the present embodiment.
Fig. 4 is a flowchart showing the overall processing flow of the user detection system according to this embodiment.
Fig. 5 is a diagram for explaining a coordinate system in real space in this embodiment.
Fig. 6 is a diagram for explaining a region setting method in this embodiment.
Fig. 7 is a diagram for explaining another region setting method in this embodiment.
Fig. 8 is a diagram showing a relationship between a user in the car and the detection area in the embodiment.
Fig. 9 is a diagram showing a relationship between a user and a detection area on a captured image in the present embodiment.
Fig. 10 is a diagram for explaining a difference method used in the user detection processing in this embodiment.
Fig. 11 is a diagram for explaining motion detection used in the user detection processing in this embodiment.
Fig. 12 is a diagram for explaining boundary detection used in the user detection processing in this embodiment.
Fig. 13 is a diagram showing a configuration of a doorway peripheral portion in a car using a single-opening type car door in this embodiment.
Fig. 14 is a diagram for explaining the opening and closing operation of the single-opening type car door in the present embodiment.
Fig. 15 is a diagram showing an example of an image captured by the camera according to embodiment 2.
Fig. 16 is a diagram for explaining a region setting method in this embodiment.
Fig. 17 is a diagram for explaining another region setting method in this embodiment.
Fig. 18 is a diagram showing a relationship between a user in the car and the detection area in the embodiment.
Fig. 19 is a diagram showing a relationship between a user and a detection area on a captured image in the present embodiment.
Fig. 20 is a diagram showing a configuration of a doorway peripheral portion in a car using a single-opening type car door in this embodiment.
Fig. 21 is a diagram for explaining the opening and closing operation of the single-opening type car door in this embodiment.
Detailed Description
Hereinafter, embodiments will be described with reference to the drawings.
Note that the disclosure is merely an example, and the present invention is not limited to the contents described in the following embodiments. Variations that would be readily apparent to one skilled in the art are, of course, included within the scope of this disclosure. In the drawings, the dimensions, shapes, and the like of the respective portions are schematically shown in some cases by being modified from those of the actual embodiment in order to make the description clearer. In the drawings, corresponding elements are denoted by the same reference numerals, and detailed description thereof may be omitted.
(embodiment 1)
Fig. 1 is a diagram showing a configuration of an elevator user detection system according to embodiment 1. Note that, although 1 car is described as an example, the same configuration is applied to a plurality of cars.
A camera 12 is provided at the upper part of the doorway of the car 11. Specifically, the camera 12 is installed in the lintel plate 11a covering the upper part of the doorway of the car 11, with its lens directed straight down or inclined at a predetermined angle toward the hall 15 side or toward the inside of the car 11.
The camera 12 is a small monitoring camera such as an in-vehicle camera, has a wide-angle or fisheye lens, and can continuously capture several frames per second (for example, 30 frames/second). The camera 12 is activated when the car 11 arrives at the hall 15 on each floor, and captures a predetermined range L including the vicinity of the car door 13.
The camera 12 does not have to be installed above the doorway of the car 11 as long as it is near the car door 13. For example, it may be installed anywhere it can image the vicinity of the doorway of the car 11, such as on the upper portion of a side wall near the doorway. By installing the camera 12 in such a place, the detection areas described later can be set appropriately, and a user or an object can be accurately detected from the images in those areas.
In contrast, an ordinary surveillance camera is installed in the middle of the car or on the ceiling, so its imaging range covers the entire car. This makes it difficult to set the detection areas, and users far from the doorway of the car 11 are likely to be detected as well.
In the hall 15 on each floor, a hall door 14 is openably and closably provided at the arrival gate of the car 11. When the car 11 arrives, the hall door 14 engages with the car door 13 and opens and closes together with it. The power source (door motor) is on the car 11 side, and the hall door 14 merely follows the car door 13 when opening and closing. In the following description, the hall door 14 is assumed to be open when the car door 13 is open and closed when the car door 13 is closed.
Each image (video frame) continuously captured by the camera 12 is analyzed and processed in real time by the image processing device 20. Note that, although the image processing device 20 is drawn outside the car 11 in fig. 1 for convenience, it is actually housed in the lintel plate 11a together with the camera 12.
The image processing apparatus 20 includes a storage unit 21 and a detection unit 22. The storage unit 21 sequentially stores images captured by the camera 12, and has a buffer area for temporarily storing data necessary for processing by the detection unit 22. The storage unit 21 may store an image subjected to a process such as distortion correction, enlargement and reduction, or partial cropping as a pre-process for the captured image.
The detection unit 22 detects a user located near the car door 13 using the image captured by the camera 12. The detection unit 22 is functionally divided into a detection region setting unit 22a and a detection processing unit 22b.
On the image captured by the camera 12, the detection region setting unit 22a sets a detection area on a front pillar provided on at least one of both sides of the doorway of the car 11. Specifically, the detection region setting unit 22a sets a band-shaped detection area along the inner side surface of the front pillar. The front pillars are also called entrance pillars or entrance frames, and are provided on both sides or on one side of the doorway of the car 11 (see fig. 2). A door pocket for housing the car door 13 is generally provided on the back side of a front pillar.
The detection processing unit 22b detects the presence or absence of a user or an object based on the image in the detection area set by the detection region setting unit 22a. The term "object" as used here includes, for example, a user's clothing and baggage, as well as moving objects such as wheelchairs. The car control device 30 may also have a part or all of the functions of the image processing device 20.
The car control device 30 controls the operation of various devices (destination floor buttons, lighting, and the like) provided in the car 11. The car control device 30 includes a door opening/closing control unit 31 and a notification unit 32. The door opening/closing control unit 31 controls the opening and closing of the car doors 13 when the car 11 arrives at the hall 15. Specifically, the door opening/closing control unit 31 opens the car doors 13 when the car 11 reaches the hall 15, and closes them after a predetermined time has elapsed.
Here, when the detection processing unit 22b detects a user or an object during the door opening operation, the door opening/closing control unit 31 performs door opening/closing control for avoiding a door accident (being pulled into the door pocket). Specifically, the door opening/closing control unit 31 temporarily stops the door opening operation of the car doors 13, moves them slightly in the reverse direction (door closing direction), or slows down the door opening speed. The notification unit 32 calls the attention of the user in the car 11 based on the detection result of the detection processing unit 22b.
Fig. 2 is a diagram showing a configuration of a portion around an entrance in the car 11.
A car door 13 is openably and closably provided at the doorway of the car 11. In the example of fig. 2, the car door 13 is of the double-opening type: the two door panels 13a and 13b constituting the car door 13 open and close in opposite directions along the doorway direction (horizontal direction). The "doorway" here refers to the doorway of the car 11.
Front pillars 41a and 41b are provided on both sides of the doorway of the car 11 and, together with the lintel plate 11a, surround the doorway of the car 11. When the car door 13 opens, one door panel 13a is housed in a door pocket 42a provided on the back side of the front pillar 41a, and the other door panel 13b is housed in a door pocket 42b provided on the back side of the front pillar 41b.
One or both of the front pillars 41a and 41b are provided with a display 43, an operation panel 45 on which a destination floor button 44 and the like are arranged, and a speaker 46. In the example of fig. 2, a speaker 46 is provided on the front pillar 41a, and a display 43 and an operation panel 45 are provided on the front pillar 41 b.
Here, the camera 12 is provided at the central portion of the lintel plate 11a above the doorway of the car 11. The camera 12 faces downward from the lower part of the lintel plate 11a so that, when the car door 13 opens together with the hall door 14, it can capture a range including the vicinity of the doorway (see fig. 3).
Fig. 3 is a diagram showing an example of the image captured by the camera 12. The image is captured looking down from above the doorway of the car 11 with the car door 13 (door panels 13a and 13b) and the hall door 14 (door panels 14a and 14b) fully open. In fig. 3, the upper side shows the hall 15 and the lower side shows the inside of the car 11.
In the hall 15, door pockets 17a and 17b are provided on both sides of the arrival entrance of the car 11, and a belt-shaped hall sill 18 of a predetermined width is laid on the floor surface 16 between the door pockets 17a and 17b, along the opening and closing direction of the hall door 14. A belt-shaped car sill 47 of a predetermined width is laid on the doorway side of the floor surface 19 of the car 11, along the opening and closing direction of the car door 13.
Here, the detection areas Ea and Eb are set on the captured image on the inner side surfaces 41a-1 and 41b-1 of the front pillars 41a and 41b, respectively. The detection areas Ea and Eb are areas for detecting a user or an object on the captured image, and are used to prevent an accident of being pulled into the door pockets 42a and 42b during the door opening operation.
The detection areas Ea and Eb are each set in a band shape having a predetermined width D1 or D2 in the width direction of the inner side surfaces 41a-1 and 41b-1 of the front pillars 41a and 41b. The widths D1 and D2 are set, for example, equal to or smaller than the lateral widths of the inner side surfaces 41a-1 and 41b-1. The widths D1 and D2 may be the same as or different from each other.
The widths D1 and D2 may also be partially varied so that, for example, the widths D1a and D2a of the portions that a user's hand can easily reach are slightly wider than D1 and D2 (see fig. 9). This allows a potential pull-in to the door pocket to be detected as early as possible.
The front faces of the front pillars 41a and 41b are excluded from the detection areas. This is because the operation panel 45 is provided on the front face of the front pillars 41a and 41b, and a user is often standing nearby. By using the inner side surfaces 41a-1 and 41b-1 of the front pillars 41a and 41b, the detection areas Ea and Eb can be set so that they are not affected by the opening and closing of the car door 13 and do not erroneously detect a user operating the operation panel 45.
Next, the operation of the present system will be described in detail.
Fig. 4 is a flowchart showing the overall processing flow of the present system.
First, as initial setting, the detection region setting unit 22a of the detection unit 22 in the image processing device 20 executes the detection area setting process (step S11). This process is executed, for example, when the camera 12 is installed or when its installation position is adjusted, as follows.
That is, the detection area setting unit 22a sets the detection areas Ea and Eb on the image captured by the camera 12. Specifically, if the car door 13 is of the double-opening type, the detection area setting unit 22a sets the detection area Ea on the inner side surface 41a-1 of the front pillar 41a near the door pocket 42a into which one door panel 13a is drawn, and sets the detection area Eb on the inner side surface 41b-1 of the front pillar 41b near the door pocket 42b into which the other door panel 13b is drawn.
The regions in which the front pillars 41a and 41b appear in the captured image are calculated from the design values of the components of the car 11 and values specific to the camera 12:
Width of the doorway (lateral width of the doorway of the car)
Height of the door
Width of the front pillars
Type of door (double-opening / right or left single-opening)
Relative position of the camera with respect to the doorway (three dimensions)
Angle of the camera (3 axes)
Angle of view (focal length) of the camera
The detection region setting unit 22a calculates the regions in which the front pillars 41a and 41b appear on the captured image based on these values. That is, assuming that the front pillars 41a and 41b rise vertically from both ends of the doorway, the detection region setting unit 22a calculates the three-dimensional coordinates of the front pillars 41a and 41b from the relative position, angle, and angle of view of the camera 12 with respect to the doorway.
The three-dimensional coordinates are coordinates in which, as shown in fig. 5, the direction horizontal to the car doors 13 is the X axis, the direction from the center of the car doors 13 toward the hall 15 (the direction perpendicular to the car doors 13) is the Y axis, and the height direction of the car 11 is the Z axis.
For example, as shown in fig. 6, marks m1 and m2 may be placed on both ends of the car inner side of the car sill 47, and the three-dimensional coordinates of the front pillars 41a and 41b may be obtained based on the positions of the marks m1 and m 2. Alternatively, as shown in fig. 7, the positions of two points P1, P2 on the car inner side of the car sill 47 may be obtained by image processing, and the three-dimensional coordinates of the front pillars 41a, 41b may be obtained based on the positions of the two points P1, P2.
The detection region setting unit 22a projects the three-dimensional coordinates of the front pillars 41a and 41b onto the captured image as two-dimensional coordinates, obtains the regions in which the front pillars 41a and 41b appear on the captured image, and sets the detection areas Ea and Eb within those regions. Specifically, the detection region setting unit 22a sets the detection areas Ea and Eb with the predetermined widths D1 and D2 along the longitudinal direction of the inner side surfaces 41a-1 and 41b-1 of the front pillars 41a and 41b.
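For illustration, this projection can be sketched with an ideal pinhole model as follows. This is only a sketch under assumed values: the actual camera 12 uses a wide-angle or fisheye lens, so lens-distortion correction (mentioned above as preprocessing) would be applied first, and all function names, dimensions, and camera parameters below are illustrative, not taken from the embodiment.

```python
import numpy as np

def project_to_image(points_xyz, cam_pos, cam_rot, focal_px, img_center):
    """Project points given in the car coordinate system of fig. 5 (X along the
    door, Y toward the hall, Z vertical) into pixel coordinates with an ideal
    pinhole camera. cam_rot rows are the camera axes expressed in car coordinates;
    cam_pos is the camera position relative to the doorway."""
    pts = np.asarray(points_xyz, dtype=float) - np.asarray(cam_pos, dtype=float)
    pts_cam = pts @ np.asarray(cam_rot, dtype=float).T      # car -> camera axes
    u = focal_px * pts_cam[:, 0] / pts_cam[:, 2] + img_center[0]
    v = focal_px * pts_cam[:, 1] / pts_cam[:, 2] + img_center[1]
    return np.stack([u, v], axis=1)

# Hypothetical corners of the inner side surface of front pillar 41a (metres),
# assumed to rise vertically from one end of the doorway.
pillar_corners = [(-0.40, 0.00, 0.0), (-0.40, 0.00, 2.1),
                  (-0.40, -0.10, 0.0), (-0.40, -0.10, 2.1)]
# Camera on the lintel plate, lens pointing straight down, hall toward the image top.
cam_down = np.array([[1.0, 0.0, 0.0],
                     [0.0, -1.0, 0.0],
                     [0.0, 0.0, -1.0]])
uv = project_to_image(pillar_corners, cam_pos=(0.0, 0.0, 2.3),
                      cam_rot=cam_down, focal_px=400.0, img_center=(320, 240))
# The bounding region of uv, clipped to the frame, gives the band along the
# pillar's inner side surface in which detection area Ea of width D1 is set.
```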
The setting process for the detection areas Ea and Eb may be performed with the car doors 13 open or with them closed. With the car doors 13 closed, the hall 15 does not appear in the image captured by the camera 12, which makes the detection areas Ea and Eb easier to set. In addition, the lateral width (short-side width) of the car sill 47 is generally larger than the thickness of the car doors 13, so one end of the car sill 47 appears in the captured image even when the car doors 13 are fully closed. The positions of the front pillars 41a and 41b can therefore be determined with reference to that end, and the detection areas Ea and Eb can be set.
By setting the detection areas Ea and Eb on the inner side surfaces 41a-1 and 41b-1 of the front pillars 41a and 41b in advance in this manner, as shown in fig. 8 and fig. 9, when a user places a hand on the inner side surface 41a-1 of the front pillar 41a during door opening, the hand can be detected before it is pulled into the door pocket 42a.
Next, the operation of this system while the car 11 is in service will be described.
As shown in fig. 4, when the car 11 arrives at the hall 15 on any floor (yes in step S12), the car control device 30 opens the car doors 13 (step S13).
At this time (during the door opening operation of the car door 13), the camera 12 provided at the upper part of the doorway of the car 11 captures images of the area around the car door 13 (the front pillars 41a, 41b, and so on) at a predetermined frame rate (for example, 30 frames/second). The image processing device 20 acquires the images captured by the camera 12 in time series, sequentially stores them in the storage unit 21 (step S14), and executes the following user detection processing in real time (step S15). Distortion correction, enlargement or reduction, cropping of a part of the image, and the like may be performed as preprocessing of the captured images.
The user detection process is executed by the detection processing unit 22b of the detection unit 22 provided in the image processing apparatus 20.
That is, the detection processing unit 22b extracts the images in the detection areas Ea and Eb from a plurality of captured images obtained in time series by the camera 12, and detects the presence or absence of a user or an object based on these images. Specifically, detection is performed by one of the following methods.
(a) Difference method
As shown in fig. 10, the detection processing unit 22b compares the images in the detection areas Ea and Eb with a base image in time series order, and detects the presence or absence of a user or an object based on the difference between the two. Fig. 10 (a) shows the base image, which is the image in the detection areas Ea and Eb extracted in advance from an image captured by the camera 12 with no user or object present in the car 11. Fig. 10 (b) shows the detection target image, which is the image in the detection areas Ea and Eb extracted from an image captured during door opening.
The detection processing unit 22b compares the base image and the detection target image, and determines that a user or an object is present near the door pockets 42a and 42b if the difference between the pixel values of the two is equal to or greater than a predetermined amount.
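As a rough illustration of this comparison, the fraction of changed pixels inside a detection area can be thresholded as in the following sketch. The threshold values and function name are assumptions for the example; the embodiment only specifies that a difference of a predetermined amount or more indicates a user or an object.

```python
import numpy as np

def detect_by_difference(base_img, current_img, region_mask,
                         pixel_thresh=30, ratio_thresh=0.05):
    """base_img / current_img: grayscale frames as 2-D uint8 arrays.
    region_mask: boolean array of the same shape, True inside Ea or Eb.
    Returns True when enough pixels inside the area differ from the base
    image, i.e. a user or an object is presumed near the door pocket."""
    diff = np.abs(base_img.astype(np.int16) - current_img.astype(np.int16))
    changed = (diff >= pixel_thresh) & region_mask
    return changed.sum() / max(int(region_mask.sum()), 1) >= ratio_thresh
```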
(b) Motion detection
As shown in fig. 11, the detection processing unit 22b divides the captured image into a matrix in units of predetermined blocks, and detects the presence or absence of a user or an object by focusing on a moving block among the blocks.
More specifically, the detection processing unit 22b reads out the images held in the storage unit 21 one by one in time series order, and calculates the average luminance value of the images for each block. At this time, the average luminance value of each block calculated when the first image is input is held as an initial value in the 1 st buffer area, not shown, in the storage unit 21.
When the second and subsequent images are obtained, the detection processing section 22b compares the average luminance value of each block of the current image with the average luminance value of each block of the previous image held in the 1 st buffer area. As a result, when there is a block having a luminance difference equal to or greater than a preset value in the current image, the detection processing unit 22b determines that the block is a motion block. When determining the presence or absence of motion with respect to the current image, the detection processing section 22b holds the average luminance value of each block of the image in the 1 st buffer area for comparison with the next image. Similarly, the detection processing unit 22b repeatedly compares the luminance values of the respective images in units of blocks in time series order and determines the presence or absence of motion.
The detection processing unit 22b checks whether there is a motion block in the images in the detection areas Ea and Eb. If there is, the detection processing unit 22b determines that a person or an object is present near the door pockets 42a and 42b.
As shown in FIG. 3, the detection regions Ea, Eb are set on the inner side surfaces 41a-1, 41b-1 of the front pillars 41a, 41 b. Therefore, the movement of the car doors 13 ( door panels 13a, 13b) when opening and closing is not detected within the detection areas Ea, Eb.
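The block-wise comparison described above can be sketched as follows: each frame is divided into a matrix of blocks, the average luminance per block is compared with the values held for the previous frame (the 1st buffer area), and blocks whose difference is at or above a preset value are treated as motion blocks. The block size and threshold are illustrative assumptions.

```python
import numpy as np

def block_means(gray, block=16):
    """Average luminance of each block x block tile (frame cropped to a multiple of block)."""
    h = (gray.shape[0] // block) * block
    w = (gray.shape[1] // block) * block
    tiles = gray[:h, :w].astype(float).reshape(h // block, block, w // block, block)
    return tiles.mean(axis=(1, 3))

def motion_blocks(prev_means, gray, block=16, luma_thresh=10.0):
    """Compare the current frame's block averages with those held for the previous
    frame and return a boolean map of motion blocks together with the new averages
    to store for the next comparison."""
    cur_means = block_means(gray, block)
    moving = np.abs(cur_means - prev_means) >= luma_thresh
    return moving, cur_means

# A user or an object is judged to be near the door pocket when any motion block
# overlaps the blocks covered by detection area Ea or Eb, e.g. moving[ea_blocks].any().
```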
(c) Boundary detection
The detection processing unit 22b detects the boundary of the elevator structure from the images in the detection areas Ea and Eb. The "boundary of the elevator structure" here refers to the boundary between the inner side surfaces 41a-1, 41b-1 of the front pillars 41a, 41b and the door pockets 42a, 42b. The detection processing unit 22b determines that a user or an object is present when this boundary is interrupted in the image (when the boundary is partially hidden).
In this method, as shown in fig. 12, the detection areas Ea and Eb need to be expanded in advance so as to include the above-described boundary. A method of detecting a boundary in a detection area on an image is known, for example, from Japanese Patent Application Publication No. 2017-240799, and a detailed description is therefore omitted here.
In the image captured by the camera 12, the boundaries between the inner side surfaces 41a-1, 41b-1 of the front pillars 41a, 41b and the door pockets 42a, 42b exist regardless of the open/closed state of the car door 13. Therefore, by detecting whether the boundary is interrupted on the image, a user or an object approaching the door pockets 42a, 42b can be detected reliably, while a user or an object away from the door pockets 42a, 42b is not erroneously detected.
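One possible realization of this interruption check is sketched below: the image is sampled along the known boundary line between the pillar's inner side surface and the door pocket, and the boundary is treated as interrupted when the luminance step that normally marks it disappears over part of its length. The sampling scheme (a roughly vertical boundary, sampled left and right of each point) and the thresholds are assumptions for the example; the referenced publication describes the actual boundary-detection method.

```python
import numpy as np

def boundary_interrupted(gray, boundary_pts, half_width=2,
                         step_thresh=20.0, min_visible_ratio=0.8):
    """gray: grayscale frame. boundary_pts: (row, col) pixels on the expected
    boundary between the inner side surface of the front pillar and the door
    pocket, assumed roughly vertical in the image. If too few points still show
    a clear luminance step across the boundary, it is considered interrupted
    (partially hidden by a hand or an object)."""
    visible = 0
    for r, c in boundary_pts:
        left = float(gray[r, max(c - half_width, 0)])
        right = float(gray[r, min(c + half_width, gray.shape[1] - 1)])
        if abs(left - right) >= step_thresh:
            visible += 1
    return visible / max(len(boundary_pts), 1) < min_visible_ratio
```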
As another method, an object other than the elevator structure may be recognized from the images in the detection areas Ea and Eb, and the presence of a user or an object may be determined based on the recognition result. Any generally known object recognition method may be used, such as deep learning, an SVM (Support Vector Machine), or a random forest.
Returning to fig. 4, when the presence of a user or an object is detected in the detection areas Ea and Eb during the door opening operation of the car door 13 (yes in step S16), a user detection signal is output from the image processing apparatus 20 to the car control apparatus 30. Upon receiving the user detection signal, the door opening/closing control unit 31 of the car control device 30 temporarily stops the door opening operation of the car doors 13, and after several seconds, restarts the door opening operation from the stop position (step S17).
When the user detection signal is received, the door opening speed of the car doors 13 may be made slower than normal, or the car doors 13 may be moved slightly in the reverse direction (door closing direction) and then the door opening operation may be started.
The notification unit 32 of the car control device 30 makes an announcement through the speaker 46 in the car 11, prompting the user to move away from the door pockets 42a and 42b (step S18). The notification method is not limited to a voice announcement; for example, a message such as "It is dangerous near the door pocket, please step away" may be shown on the display, either alone or together with the voice announcement. A warning sound may also be used.
The above-described processing is repeated while the presence of a user or an object is detected in the detection areas Ea and Eb. Thus, for example, when a user places a hand near the door pocket 42a, the hand can be prevented from being pulled into the door pocket 42a.
When the presence of a user or an object is not detected in the detection areas Ea and Eb (no in step S16), the car control device 30 performs the door closing operation of the car doors 13 as usual, and starts the car 11 toward the destination floor after the doors have closed (step S19).
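The door-opening sequence of fig. 4 (steps S13 to S19) can be condensed into the following simulation-style sketch. The function and the action strings are placeholders introduced here for illustration only; they merely show how the per-frame detection result in Ea and Eb drives the door opening/closing control unit 31 and the notification unit 32.

```python
def door_opening_control(detected_per_frame):
    """detected_per_frame: one boolean per captured frame, True when a user or an
    object is found in detection area Ea or Eb (step S16). Returns the sequence of
    actions the car control device 30 would take; purely illustrative."""
    actions = []
    for detected in detected_per_frame:
        if detected:
            actions.append("pause door opening, resume after a few seconds (S17)")
            actions.append("announce: please keep clear of the door pockets (S18)")
        else:
            actions.append("continue door opening")
    actions.append("close doors and start toward the destination floor (S19)")
    return actions

print(door_opening_control([False, True, False]))
```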
In the above-described embodiment, the double-opening type car door is described as an example, but the same applies to the single-opening type shown in fig. 13.
Fig. 13 is a diagram showing the configuration around the doorway of a car using a two-panel single-opening car door. In this example, a two-panel single-opening car door 13 is openably and closably provided at the doorway of the car 11. As shown in fig. 14, the car door 13 includes two door panels 13a and 13b that open and close in the same direction along the doorway direction.
When the car door 13 is of the single-opening type, a door pocket 42a is provided only on one side of the doorway. In the example of fig. 13, the door pocket 42a is provided on the left side of the doorway, and both door panels 13a and 13b are stored in the door pocket 42a in an overlapping state when the door is open.
Here, the camera 12 on the lintel plate 11a is positioned closer to the door pocket 42a side, and the detection area Ea is set for the front pillar 41a on the door pocket 42a side. Specifically, as described with reference to fig. 3, a band-shaped detection area Ea having a predetermined width D1 is set along the inner side surface 41a-1 of the front pillar 41a. Thus, for example, when a user's hand is near the door pocket 42a, this state can be detected from the image in the detection area Ea and reflected in the door opening/closing control, preventing the hand from being pulled into the door pocket 42a.
In fig. 13, if the detection area Eb is also set in advance for the other front pillar 41b, an accident in which the side end portion of the car door 13 strikes a user when the door closes (a door collision accident) can be prevented as well.
As described above, according to embodiment 1, by setting the detection areas Ea and Eb in advance on both sides of the doorway of the car 11, a user or an object located near the door can be accurately detected. Accidents at door opening and closing, such as being pulled into the door pocket, can thus be prevented before they occur, and the elevator can be used safely. On the other hand, by limiting the locations where the detection areas Ea and Eb are set, erroneous detection of a user or an object far from the door can be avoided, and unnecessary door control and warnings can be prevented.
In embodiment 1 described above, the detection areas Ea and Eb are set on both sides of the doorway of the car 11, but a detection area may instead be set in advance for only one of the two sides.
(embodiment 2)
Next, embodiment 2 will be explained.
Fig. 15 is a diagram showing an example of the image captured by the camera 12 in embodiment 2. The image is captured looking down from above the doorway of the car 11 with the car door 13 (door panels 13a and 13b) and the hall door 14 (door panels 14a and 14b) fully open. In fig. 15, the upper side shows the hall 15 and the lower side shows the inside of the car 11.
As in embodiment 1, the detection areas Ea and Eb are set on the captured image on the inner side surfaces 41a-1 and 41b-1 of the front pillars 41a and 41b, respectively. The detection areas Ea and Eb are areas for detecting a user or an object, and here they are used to prevent an accident of being pulled into the door pockets 42a and 42b during the door opening operation.
In embodiment 2, a detection area Ec is set on the captured image in addition to the detection areas Ea and Eb. The detection area Ec is set adjacent to the car sill 47 provided on the floor surface 19 of the car 11. Like the detection areas Ea and Eb, the detection area Ec is an area for detecting a user or an object on the captured image, and it is used to prevent an accident of being caught between the door panels during the door closing operation when the car door 13 is of the double-opening type.
The detection region Ec has a predetermined width D3 in the direction perpendicular to the doorway, and is set in a band shape along the longitudinal direction of the car sill 47. The width D3 may be the same as or different from the width D1 of the detection region Ea or the width D2 of the detection region Eb.
The width D3 may also be partially varied. When the car door 13 is of the double-opening type, a hand or the like can be caught between the door panels at the center of the doorway during the door closing operation. Therefore, the width D3a near the central portion of the detection area Ec is made slightly wider than D3 (see fig. 19). This allows a potential catching between the door panels to be detected as early as possible.
Since the car doors 13 (door panels 13a, 13b) move on the car sill 47, the car sill 47 itself is excluded from the detection area. That is, the detection area Ec is set adjacent to one long side of the car sill 47, excluding the area above the car sill 47. This makes the detection area Ec unaffected by the opening and closing of the car doors 13.
The method of setting the detection area Ec on the captured image is the same as for the detection areas Ea and Eb. That is, the three-dimensional coordinates of the car sill 47 are obtained from the design values of the components of the car 11 and values specific to the camera 12, those coordinates are projected onto the captured image as two-dimensional coordinates, the region in which the car sill 47 appears on the captured image is obtained, and the detection area Ec is set with reference to that region.
In this case, for example, as shown in fig. 16, marks m1 and m2 are placed on both ends of the car inner side of the car sill 47, and the three-dimensional coordinates of the car sill 47 are obtained with the positions of the marks m1 and m2 as references. Alternatively, as shown in fig. 17, the positions of two points P1, P2 on the car inner side of the car sill 47 may be obtained by image processing, and the three-dimensional coordinates of the front pillars 41a, 41b and the car sill 47 may be obtained based on the positions of the two points P1, P2.
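As a simplified illustration (working directly in image coordinates rather than through the three-dimensional projection described above), the band-shaped area Ec can be approximated as a strip adjacent to the car-interior side of the line P1-P2, as in the following sketch. The pixel dimensions, the widened centre fraction, and the assumption that P1 and P2 are the left and right car-side corners of the sill (with the car interior toward the bottom of the image, as in fig. 15) are all illustrative.

```python
import numpy as np

def ec_band_mask(img_shape, p1, p2, d3_px=20, d3a_px=35, widen_frac=0.3):
    """Boolean mask of detection area Ec on the car-interior side of the sill
    line P1-P2. p1 is the left and p2 the right car-side corner of the sill,
    given as (row, col); rows increase toward the car interior. The middle
    portion of the band is widened from d3_px to d3a_px, mirroring the D3/D3a
    relation of fig. 19."""
    h, w = img_shape
    mask = np.zeros((h, w), dtype=bool)
    (r1, c1), (r2, c2) = p1, p2
    for c in range(c1, c2 + 1):
        t = (c - c1) / max(c2 - c1, 1)                # position along the sill
        r_edge = int(round(r1 + t * (r2 - r1)))       # sill edge row at this column
        depth = d3a_px if abs(t - 0.5) <= widen_frac / 2 else d3_px
        mask[r_edge + 1:min(r_edge + 1 + depth, h), c] = True
    return mask

ec = ec_band_mask((480, 640), p1=(300, 180), p2=(300, 460))
```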
The user detection process using the detection area Ec is also the same as the detection areas Ea and Eb. That is, the detection processing unit 22b extracts images in the detection area Ec from a plurality of captured images obtained in time series by the camera 12, and detects the presence or absence of a user or an object from these images.
By setting the detection area Ec in advance in addition to the detection areas Ea and Eb in this manner, as shown in fig. 18 and fig. 19, when a user's hand is extended out of the car during door closing, for example, the hand can be detected before it is caught between the door panels 13a and 13b.
When a person or an object is detected in the detection area Ec during the door closing operation, a user detection signal is output from the image processing device 20 to the car control device 30. Upon receiving the user detection signal, the door opening/closing control unit 31 of the car control device 30 stops the door closing operation of the car doors 13 and moves them in the reverse direction (door opening direction) to reopen them. At this time, the notification unit 32 of the car control device 30 makes a voice announcement through the speaker 46 in the car 11, prompting the user to move away from the car doors 13.
Here, when the car door 13 is of the double-opening type, the detection area Ec is preferably invalidated during the door opening operation, because an accident of being caught near the center of the doorway does not occur while the doors are opening. With a double-opening door, the door panels 13a and 13b open from the center of the doorway, so a user may board or exit through the vicinity of the center of the doorway while the doors are still opening. If the detection area Ec were valid at that time, such a boarding or exiting user would be detected unnecessarily.
It is therefore preferable to switch the detection area Ec between valid and invalid at door opening according to the type of the car door 13 (opening/closing type: double opening or single opening). That is, the detection processing unit 22b determines the type of the car door 13 at door opening and, if the door is of the double-opening type, invalidates the detection area Ec and does not execute detection processing using it.
On the other hand, when the car door 13 is of the single-opening type, the detection area Ec needs to be valid because an accident of being pulled into the gap can occur during door opening. A two-panel single-opening door is described below.
Fig. 20 is a diagram showing the configuration around the doorway of a car using a two-panel single-opening car door. In this example, a two-panel single-opening car door 13 is openably and closably provided at the doorway of the car 11. As shown in fig. 21, the car door 13 includes two door panels 13a and 13b that open and close in the same direction along the doorway direction.
With a two-panel single-opening door, the door panels 13a and 13b overlap each other starting near the center of the doorway as the door opens, and are stored in the door pocket 42a. At this time, a user's hand, clothing, baggage, or the like may be pulled into the gap 13c between the door panels 13a and 13b. The detection area Ec is used to prevent such a pulled-into-gap accident during the door opening operation of a single-opening door.
As described above, the detection area Ec is set in a band shape having the predetermined width D3 along the longitudinal direction of the car sill 47. When the car door 13 is of the two-panel single-opening type, the two door panels 13a and 13b move in an overlapping manner from the center of the doorway toward the door opening direction during the door opening operation, and a pulled-into-gap accident can occur during this movement. Therefore, the width D3a of the detection area Ec from near the central portion toward the door opening direction may be made slightly larger than D3 (see fig. 21). This allows a potential pull-in to the gap to be detected as early as possible.
When a person or an object is detected in the detection area Ec during the door opening operation, a user detection signal is output from the image processing device 20 to the car control device 30. Upon receiving the user detection signal, the door opening/closing control unit 31 of the car control device 30 temporarily stops the door opening operation of the car door 13 and restarts it from the stopped position after several seconds, as in the case of the detection areas Ea and Eb. Alternatively, the door opening speed of the car doors 13 may be made slower than normal, or the car doors 13 may be moved slightly in the reverse direction (door closing direction) before the door opening operation is restarted. The notification unit 32 of the car control device 30 makes a voice announcement through the speaker 46 in the car 11, prompting the user to move away from the car doors 13.
On the other hand, when the car door 13 is of the single-opening type, an accident in which the side end portion of the car door 13 (the leading edge of one of the door panels 13a and 13b) strikes a user (a door collision accident), or an accident in which a hand or the like is caught between the car door 13 and the doorway, may occur during door closing. The detection area Ec is therefore preferably also valid at door closing. That is, when the car door 13 is of the single-opening type, the detection processing unit 22b validates the detection area Ec both at door opening and at door closing, and executes detection processing using it.
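The switching of the detection area Ec described in this embodiment can be condensed into a small decision function; the type and phase names below are illustrative, not identifiers used in the embodiment.

```python
def detection_area_ec_active(door_type, phase):
    """Whether detection area Ec is used, per the rules above.
    door_type: "double_opening" or "single_opening"; phase: "opening" or "closing".
    A double-opening door disables Ec while opening (users may board through the
    centre of the doorway) and enables it while closing (catching between panels);
    a single-opening door keeps Ec enabled in both phases (pulled-into-gap at
    opening, door collision or catching at closing)."""
    if door_type == "double_opening":
        return phase == "closing"
    return True
```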
As described above, according to embodiment 2, a user or an object is detected using the detection area Ec set on the car sill 47 side of the floor surface 19 of the car 11, in addition to the detection areas Ea and Eb set on both sides of the doorway of the car 11. Accidents at door opening and closing, including a hand being caught between the door panels near the center of the doorway and a hand being pulled into the gap between the panels, can thus be prevented before they occur, and the elevator can be used safely.
In embodiment 2, the detection area Ec is set in addition to the detection areas Ea and Eb, but only the detection area Ec may be set to detect a user or an object.
According to at least one embodiment described above, it is possible to provide a user detection system for an elevator, which can accurately detect a user or an object located near a door without requiring a plurality of sensors, and prevent an accident in opening and closing the door.
In each of the above embodiments, the door provided in the car of an elevator has been assumed, but the present invention is also applicable to an automatic door provided, for example, at the entrance of a building. In that case, a camera is installed at the upper part of the entrance, and the detection areas Ea and Eb (only one of them if the door opens to one side) are set on the pillar portions on both sides of the entrance using the image captured by the camera. Further, a detection area Ec is set on the floor surface of the entrance along the door opening/closing direction. Then, as in the above embodiments, a user or an object is detected from the images in the detection areas Ea, Eb, and Ec, the result is reflected in the door opening/closing control, and the user is alerted.
Although several embodiments of the present invention have been described, these embodiments are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and in the invention described in the claims and their equivalents.

Claims (16)

1. A user detection system for an elevator is characterized by comprising:
an imaging unit that images a predetermined range including the vicinity of an entrance where a door opens and closes from inside the car;
a detection region setting unit that sets a 1 st detection region on a front pillar provided on at least one of both sides of the doorway on the captured image obtained by the imaging unit, and sets a 2 nd detection region on a car sill side of a floor surface of the car; and
a detection processing unit that detects the presence or absence of a user or an object based on the images in the 1 st detection region and the 2 nd detection region set by the detection region setting unit.
2. The user detection system of an elevator according to claim 1,
in the door opening operation of the door, the detection processing unit detects the presence or absence of a user or an object based on the image in the 1 st detection area and the image in the 2 nd detection area.
3. The user detection system of an elevator according to claim 1,
in the door closing operation of the door, the detection processing unit detects the presence or absence of a user or an object based on the image in the 1 st detection area and the image in the 2 nd detection area.
4. The user detection system of an elevator according to claim 1 or 2,
when the door is opened, the detection processing section switches the 2 nd detection area to be valid or invalid according to the type of the door.
5. The user detection system of an elevator according to claim 4,
in the case where the door is of a double-opening type, the detection processing unit invalidates the 2 nd detection area and does not execute the detection processing using the 2 nd detection area.
6. The user detection system of an elevator according to claim 1,
the 1 st detection region is set on an inner side surface of the front pillar on the captured image.
7. The user detection system of an elevator according to claim 6,
the 1 st detection region is set to have a predetermined width in the width direction of the inner side surface of the front pillar.
8. The user detection system of an elevator according to claim 1,
the 2 nd detection region is set along the opening/closing direction of the door on the car sill side of the floor surface of the car on the captured image.
9. The user detection system of an elevator according to claim 8,
the 2 nd detection region is set to have a predetermined width in a short side direction of the car sill.
10. The user detection system of an elevator according to claim 1,
the detection area setting unit calculates a position of the front pillar and a position of the car sill on the captured image based on design values of the components of the car, an installation angle of the imaging unit, and an angle of view, sets the 1 st detection area on the front pillar thus calculated, and sets the 2 nd detection area on the car sill side of the floor surface of the car.
11. The user detection system of an elevator according to claim 1,
the camera shooting part is arranged at the upper part of the access opening of the car.
12. The user detection system of an elevator according to claim 1,
the detection processing unit compares images in the 1 st detection region and the 2 nd detection region in time series order, and detects the presence or absence of a user or an object based on a difference between the images.
13. The user detection system of an elevator according to claim 1,
the detection processing unit compares the brightness of the images in the 1 st detection area and the 2 nd detection area in units of blocks in time series order, and detects the presence or absence of a user or an object based on a block having motion.
14. The user detection system of an elevator according to claim 1,
the detection processing unit detects a 1 st boundary between the front pillar and the doorway of the car based on the image in the 1 st detection area, detects a 2 nd boundary between the car sill and the doorway of the car based on the image in the 2 nd detection area, and detects the presence or absence of a user or an object based on whether or not at least one of the 1 st boundary and the 2 nd boundary is interrupted.
15. The user detection system of an elevator according to claim 1,
further comprising a door opening/closing control unit that controls the opening/closing operation of the door based on the detection result of the detection processing unit.
16. The user detection system of an elevator according to claim 1,
further comprising a notification unit that calls the attention of a user in the car based on the detection result of the detection processing unit.
CN201911164317.8A 2019-03-18 2019-11-25 User detection system of elevator Pending CN111704012A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-049876 2019-03-18
JP2019049876A JP6702578B1 (en) 2019-03-18 2019-03-18 Elevator user detection system

Publications (1)

Publication Number Publication Date
CN111704012A true CN111704012A (en) 2020-09-25

Family

ID=70858128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911164317.8A Pending CN111704012A (en) 2019-03-18 2019-11-25 User detection system of elevator

Country Status (2)

Country Link
JP (1) JP6702578B1 (en)
CN (1) CN111704012A (en)

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
JP7155201B2 (en) * 2020-07-09 2022-10-18 東芝エレベータ株式会社 Elevator user detection system
KR102302767B1 (en) * 2021-02-17 2021-09-14 (주)태성에스엔씨 System for Detecting Subway Entrance Door based on Image Deep Learning Reading

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US5410149A (en) * 1993-07-14 1995-04-25 Otis Elevator Company Optical obstruction detector with light barriers having planes of light for controlling automatic doors
JPH10152277A (en) * 1996-11-21 1998-06-09 Mitsubishi Electric Corp Elevator door opening/closing device
JP4869785B2 (en) * 2006-05-25 2012-02-08 三菱電機株式会社 Elevator door control device
JP2008100782A (en) * 2006-10-17 2008-05-01 Mitsubishi Electric Corp Elevator and its safety device
JP5369175B2 (en) * 2008-05-22 2013-12-18 オーチス エレベータ カンパニー Elevator door detection apparatus and detection method using video
JP6046286B1 (en) * 2016-01-13 2016-12-14 東芝エレベータ株式会社 Image processing device

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
CN1346327A (en) * 1999-02-11 2002-04-24 Tl琼斯有限公司 Obstruction detection system
JP2007153499A (en) * 2005-12-02 2007-06-21 Mitsubishi Electric Corp Elevator door control device
CN101891102A (en) * 2009-05-21 2010-11-24 株式会社日立制作所 The safety device of elevator and method of controlling security
JP2012214285A (en) * 2011-04-01 2012-11-08 Mitsubishi Electric Corp Door system and elevator apparatus
JP2018030676A (en) * 2016-08-24 2018-03-01 東芝エレベータ株式会社 Elevator control system
CN108622777A (en) * 2017-03-24 2018-10-09 东芝电梯株式会社 The boarding detection system of elevator

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN111483898A (en) * 2019-01-28 2020-08-04 奥的斯电梯公司 Elevator car and door motion monitoring
CN111483898B (en) * 2019-01-28 2022-04-08 奥的斯电梯公司 Elevator car and door motion monitoring

Also Published As

Publication number Publication date
JP2020152469A (en) 2020-09-24
JP6702578B1 (en) 2020-06-03

Similar Documents

Publication Publication Date Title
CN111704012A (en) User detection system of elevator
JP7230114B2 (en) Elevator user detection system
JP7043565B2 (en) Elevator user detection system
CN111847159B (en) User detection system of elevator
CN111704013A (en) User detection system of elevator
JP6878558B1 (en) Elevator user detection system
JP6849760B2 (en) Elevator user detection system
CN112441490B (en) User detection system for elevator
CN112340560B (en) User detection system for elevator
CN112456287B (en) User detection system for elevator
CN112441497B (en) User detection system for elevator
JP7155201B2 (en) Elevator user detection system
CN112551292B (en) User detection system for elevator
JP7077437B2 (en) Elevator user detection system
JP7135144B1 (en) Elevator user detection system
JP2024080810A (en) Elevator System

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200925