CN114671311B - Display control device for elevator - Google Patents

Display control device for elevator

Info

Publication number
CN114671311B
CN114671311B (application CN202110625983.8A)
Authority
CN
China
Prior art keywords
attention
region
detection unit
landing
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110625983.8A
Other languages
Chinese (zh)
Other versions
CN114671311A (en)
Inventor
加藤美穗
Current Assignee
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Publication of CN114671311A
Application granted
Publication of CN114671311B
Legal status: Active

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B3/00 Applications of devices for indicating or signalling operating conditions of elevators
    • B66B3/02 Position or depth indicators
    • B66B3/023 Position or depth indicators characterised by their mounting position
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B5/00 Applications of checking, fault-correcting, or safety devices in elevators
    • B66B5/0006 Monitoring devices or performance analysers
    • B66B5/0018 Devices monitoring the operating condition of the elevator system

Landscapes

  • Indicating And Signalling Devices For Elevators (AREA)

Abstract

The display control device of the elevator can display images that are easy for users to view even at a crowded landing. The display control device includes a position detection unit, an attention detection unit, and a region determination unit. The position detection unit detects the waiting position of a user waiting at a landing from an image of the landing captured by an imaging device. The attention detection unit detects, from the image captured by the imaging device, the degree of attention that users waiting at the landing direct toward the regions of the landing onto which a projection device can project an image. The region determination unit determines the projection region onto which the image is projected by the projection device, based on information including the waiting position detected by the position detection unit and the degree of attention detected by the attention detection unit. The region determination unit preferentially determines, as the projection region, a region with a higher degree of attention and a lower possibility of the projection being blocked by users.

Description

Display control device for elevator
Technical Field
The present invention relates to a display control device for an elevator.
Background
Patent document 1 discloses an example of an elevator. At the landing of this elevator, an image is projected by a projection device mounted on the car, via a reflecting device.
Prior art literature
Patent literature
Patent document 1: Japanese Patent Laid-Open No. 2016-199390
Disclosure of Invention
However, in the elevator of patent document 1, the image is projected onto a single area set in advance. Therefore, when the landing is crowded, the projected image may become difficult to see.
The present invention has been made to solve such problems. The invention provides a display control device for an elevator that can display images that are easy for users to view even at a crowded landing.
The elevator display control device of the invention comprises: a position detection unit that detects the waiting position of a user waiting at a landing from an image captured by an imaging device provided at the landing of an elevator; an attention degree detection unit that detects, from the image captured by the imaging device, the degree of attention of users waiting at the landing with respect to the regions of the landing onto which an image can be projected by a projection device provided at the landing; and a region determination unit that, based on information including the waiting position detected by the position detection unit and the degree of attention detected by the attention degree detection unit, preferentially determines, as the projection region onto which the image is projected by the projection device, a region with a higher degree of attention and a lower likelihood of the projection being blocked by users waiting at the landing.
Effects of the invention
With the display control device of the present invention, an image can be displayed so that it is easy for users to view even at a crowded landing.
Drawings
Fig. 1 is a structural diagram of an elevator according to embodiment 1.
Fig. 2 is a flowchart showing an example of the operation of the display system according to embodiment 1.
Fig. 3 is a hardware configuration diagram of a main part of the display system of embodiment 1.
Fig. 4 is a diagram showing an example of a landing of the elevator of embodiment 2.
Fig. 5 is a flowchart showing an example of the operation of the display system according to embodiment 2.
Fig. 6 is a diagram illustrating an example of the attention detection performed by the attention detection unit of embodiment 3.
Fig. 7 is a flowchart showing an example of the operation of the display system according to embodiment 3.
Fig. 8 is a flowchart showing an example of the operation of the display system according to embodiment 4.
Fig. 9 is a flowchart showing an example of the operation of the display system according to embodiment 5.
Description of the reference numerals
1: an elevator; 2: a hoistway; 3: a landing; 4: a landing door; 5: a car; 6: a control panel; 7: a car door; 8: a display system; 9: a photographing device; 10: a projection device; 11: a display control device; 12: an information acquisition unit; 13: an image processing section; 14: a position detection unit; 15: a characteristic detection unit; 16: an attention degree detection unit; 17: a region determination unit; 18: an image conversion section; 100a: a processor; 100b: a memory; 200: dedicated hardware.
Detailed Description
The mode for carrying out the invention will be described with reference to the accompanying drawings. In the drawings, the same or corresponding portions are denoted by the same reference numerals, and repetitive description thereof will be appropriately simplified or omitted.
Embodiment 1
Fig. 1 is a structural diagram of an elevator 1 according to embodiment 1.
The elevator 1 is applied to a building having a plurality of floors. A hoistway 2 of the elevator 1 is provided in the building. The hoistway 2 is a vertically long space spanning the plurality of floors. A landing 3 of the elevator 1 is provided at each floor of the building. Landing doors 4 are provided at the landing 3 of each floor. The landing door 4 is a door that separates the landing 3 from the hoistway 2. The elevator 1 includes a car 5 and a control panel 6. The car 5 is a device that travels vertically in the hoistway 2 to transport users riding in its interior space between floors. The car 5 is provided with a car door 7. When the car 5 stops at a floor, the car door 7 opens and closes in conjunction with the landing door 4 of that floor so that users can board and exit the car 5. The control panel 6 is a device that controls the operation of the elevator 1, such as the running of the car 5 and the opening and closing of the car door 7.
The elevator 1 is provided with a display system 8. The display system 8 is a system for displaying information to a user waiting for the arrival of the car 5 at the landing 3. The display system 8 includes an imaging device 9, a projection device 10, and a display control device 11.
The imaging device 9 is provided at the landing 3. In this example, the imaging device 9 is a camera that captures images of the landing 3. The image captured by the imaging device 9 is a moving image, a still image, or the like. The imaging device 9 is disposed, for example, on a ceiling of the landing 3. The photographing device 9 may be provided in plurality at one landing 3.
The projection device 10 is provided at the landing 3. In this example, the projection device 10 is a projector that projects an image onto the landing 3. The image projected by the projection device 10 contains information to be displayed to the user. The information to be displayed to the user includes, for example, traffic information such as the congestion status and the departure and arrival status of the elevator 1. The information to be displayed to the user may also include, for example, guidance information for boarding and exiting the elevator 1. The projection device 10 is disposed, for example, on the ceiling of the landing 3. In this example, the projection device 10 is equipped with a panning mechanism capable of adjusting the area onto which the image is projected. The areas where the projection device 10 can project an image include, for example, the wall surfaces, the floor, and the ceiling of the landing 3. They may also include part or all of the surface of the landing door 4.
The display control device 11 has a function of controlling display of image projection by the projection device 10. The display control device 11 includes an information acquisition unit 12, an image processing unit 13, a position detection unit 14, a characteristic detection unit 15, a attention detection unit 16, a region determination unit 17, and an image conversion unit 18.
The information acquisition unit 12 is a unit that acquires information to be displayed to a user. In this example, the information acquisition unit 12 acquires the operation information of the elevator 1 from the control panel 6.
The image processing unit 13 is a part that processes an image captured by the imaging device 9. The image processing unit 13 acquires a captured image from the imaging device 9. The image processing in the image processing unit 13 includes, for example, conversion from a moving image to a still image when the captured image is a moving image.
The position detection unit 14 is a unit that detects the waiting position of a user waiting at the landing 3. The waiting position is the position at which the user waits for the arrival of the car 5 at the landing 3. When a plurality of users are waiting at the landing 3, the position detection unit 14 detects a waiting position for each user. The position detection unit 14 detects the waiting position from the image of the landing 3 captured by the imaging device 9, after the image has been processed by the image processing unit 13. In this example, the processed image is a still image converted from a moving image by the image processing unit 13. The position detection unit 14 detects the position of the user based on, for example, the difference between a reference image acquired in advance when no user is present at the landing 3 and the image acquired from the image processing unit 13.
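A minimal sketch of this difference-based detection, assuming grayscale images and illustrative threshold values (the function name and parameters are not from the patent):

```python
import numpy as np

def detect_waiting_positions(reference, frame, diff_threshold=30, min_area=50):
    """Detect an occupied spot by differencing the current frame against a
    reference image of the empty landing (both grayscale arrays, same shape).
    Returns a list of (x, y) centroids; empty if too few pixels changed."""
    diff = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
    mask = diff > diff_threshold
    ys, xs = np.nonzero(mask)
    if len(xs) < min_area:
        return []  # no user detected
    # Crude single-blob proxy: centroid of all changed pixels
    return [(float(xs.mean()), float(ys.mean()))]
```

A production detector would segment the difference mask into per-user blobs; the single centroid here only illustrates the reference-image comparison the patent describes.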
The characteristic detection unit 15 detects characteristics of users waiting at the landing 3. Here, the characteristics of a user include information on the height of the user's line of sight. The characteristics may also include, for example, information on whether the user is a wheelchair user. The characteristics may include, for example, information on whether the user is a low-height person whose height is below a predetermined value; a low-height person may be, for example, a child. The characteristics may also include the height of a standing user. The characteristic detection unit 15 detects these characteristics from the image of the landing 3 captured by the imaging device 9.
The attention detection unit 16 is the part that detects the degree of attention that users waiting at the landing 3 direct toward the regions of the landing 3 onto which the projection device 10 can project an image. The attention detection unit 16 detects the degree of attention from the image of the landing 3 captured by the imaging device 9 and processed by the image processing unit 13. For each user whose waiting position has been detected by the position detection unit 14, the attention detection unit 16 estimates, for example from the orientation of the face or body, the position at the landing 3 toward which the user's line of sight is directed. The attention detection unit 16 treats, for example, a region including a portion at which a user is gazing as a region of high attention. It may, for example, treat a region of a predetermined range around the center of gravity of the portions of the wall surface at which multiple users are gazing as a region of high attention, and detect the number of users whose lines of sight are directed to that region as its degree of attention. When there are several regions where users' gazes concentrate at the landing 3, the attention detection unit 16 may detect all of them as regions of high attention, with the number of users gazing at each region as that region's degree of attention.
The region determination unit 17 is the part that determines the projection region of the image projected by the projection device 10. Based on the waiting positions detected by the position detection unit 14, the region determination unit 17 gives preference to regions where the projection is less likely to be blocked by users. For example, if no user is on the optical path of the image projected toward a region, the region determination unit 17 treats the possibility of the projection being blocked in that region as low. Conversely, when a user is present on the optical path, the projection toward that region may be blocked. In this case, the region determination unit 17 may treat the possibility of blocking as higher the closer the user on the optical path is to the optical axis of the projection device 10, or as higher the more users are on the optical path. The region determination unit 17 may evaluate the possibility of blocking using the height information of users whose characteristics were detected by the characteristic detection unit 15, or it may assume a predetermined height for every user, such as the average height. In addition, based on the degree of attention detected by the attention detection unit 16, the region determination unit 17 gives preference to regions with a higher degree of attention.
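One way to turn the optical-path reasoning above into a score is sketched below: each user near the projector-to-region axis raises a blocking likelihood, decaying with distance from the axis. The floor-plane coordinates, the effective user radius, and the 3-radius falloff are all illustrative assumptions, not values from the patent.

```python
import math

def blocking_likelihood(user_positions, projector_xy, region_xy, user_radius=0.3):
    """Score how likely the projection along the segment projector_xy ->
    region_xy is blocked: each user within 3*user_radius of the optical
    axis contributes up to 1.0, linearly decaying with distance."""
    (x1, y1), (x2, y2) = projector_xy, region_xy
    seg_sq = (x2 - x1) ** 2 + (y2 - y1) ** 2
    score = 0.0
    for ux, uy in user_positions:
        # Closest point to the user on the projector-to-region segment
        t = max(0.0, min(1.0, ((ux - x1) * (x2 - x1) + (uy - y1) * (y2 - y1)) / seg_sq))
        px, py = x1 + t * (x2 - x1), y1 + t * (y2 - y1)
        d = math.hypot(ux - px, uy - py)
        reach = 3 * user_radius
        if d < reach:
            score += 1.0 - d / reach
    return score
```

More users on the path, or users closer to the axis, yield a higher score, matching the two tendencies the description names.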
The region determination unit 17 calculates a priority using, for example, a monotonic function such as a weighted sum of the degree of attention and the possibility that the projection will be blocked by users, and determines the region with the highest priority as the projection region. Alternatively, it may determine as the projection region the region with the highest degree of attention among the regions where the possibility of blocking is below a predetermined threshold. The region determination unit 17 may also determine the projection region by another method based on these two quantities.
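The weighted-sum variant can be sketched as follows; the weights and region names are illustrative assumptions:

```python
def choose_projection_region(regions, w_attention=1.0, w_blocking=2.0):
    """Pick the projection region by a weighted-sum priority: a higher
    degree of attention raises the priority, a higher blocking likelihood
    lowers it. `regions` maps a name to (attention, blocking_likelihood)."""
    def priority(item):
        attention, blocking = item[1]
        return w_attention * attention - w_blocking * blocking
    return max(regions.items(), key=priority)[0]
```

The threshold variant in the text corresponds to first filtering `regions` by `blocking < threshold` and then taking the maximum attention among the survivors.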
The image conversion unit 18 converts the image to be projected by the projection device 10 according to the projection region determined by the region determination unit 17. The conversion is, for example, an affine or homography transformation of the projected image. It may also include, for example, adjusting the colors of the image to the color of the projection region. When the projected image includes directional information such as an arrow pointing toward the landing door 4, the conversion may include rotating the arrow accordingly.
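The homography mentioned above is a 3x3 projective transform; applying it to 2D points (for example, the corners of the image to pre-warp it for a slanted wall or floor region) works as in this sketch:

```python
import numpy as np

def apply_homography(H, points):
    """Map 2D points through a 3x3 homography matrix H: lift to homogeneous
    coordinates, multiply, then divide out the projective scale."""
    pts = np.asarray(points, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    mapped = np.hstack([pts, ones]) @ np.asarray(H, dtype=float).T
    return mapped[:, :2] / mapped[:, 2:3]
```

An affine transform is the special case where the bottom row of H is (0, 0, 1), so no perspective division is needed.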
Next, an operation example of the display system 8 according to embodiment 1 will be described with reference to fig. 2.
Fig. 2 is a flowchart showing an example of the operation of the display system 8 according to embodiment 1.
In step S101, the imaging device 9 captures an image of the landing 3. Then, in step S102, the image processing unit 13 performs conversion processing on the captured image. In step S103, the position detection unit 14 detects the waiting positions of users from the converted image. In step S104, the characteristic detection unit 15 detects the characteristics of the users from the converted image. In step S105, the attention detection unit 16 detects the users' degree of attention to the regions of the landing 3 from the converted image. In step S106, the information acquisition unit 12 acquires the information to be displayed to the user, for example the operation information of the elevator 1 from the control panel 6. In step S107, the region determination unit 17 determines the projection region based on the possibility of the projection being blocked by users and the degree of attention. In step S108, the image conversion unit 18 converts the image to be displayed according to the projection region. In step S109, the projection device 10 projects the converted image onto the projection region determined by the region determination unit 17. The operation of the display system 8 then returns to step S101. By repeating this processing, the displayed information is always projected onto a region where the users' attention is high.
As described above, the display control device 11 of the display system 8 according to embodiment 1 includes the position detection unit 14, the attention detection unit 16, and the region determination unit 17. The imaging device 9 and the projection device 10 are provided at the landing 3. The position detection unit 14 detects the waiting position of a user waiting at the landing 3 from the image captured by the imaging device 9. The attention detection unit 16 detects, from the image captured by the imaging device 9, the degree of attention of users waiting at the landing 3 with respect to the regions of the landing 3 onto which the projection device 10 can project an image. The region determination unit 17 determines the projection region onto which the image is projected by the projection device 10, based on information including the waiting position detected by the position detection unit 14 and the degree of attention detected by the attention detection unit 16. The region determination unit 17 preferentially determines, as the projection region, a region with a higher degree of attention and a lower possibility of the projection being blocked by users.
According to such a configuration, the projection device 10 avoids projecting the image into regions where the projection would be blocked by users, and the information is displayed in a region where the users' attention is high. Therefore, even at a crowded landing 3, the image can be displayed so that it is easy for users to view. Users readily notice the displayed information, which improves convenience.
The display control device 11 further includes an image conversion unit 18. The image conversion unit 18 converts an image to be projected by the projector 10 based on the projection area determined by the area determination unit 17.
According to such a configuration, an appropriate image is displayed according to the projection area. Thus, the user can more easily view the information.
The location where the imaging device 9 and the projection device 10 are disposed is not limited to the ceiling of the landing 3. The imaging device 9 is disposed at a position where the image of the landing 3 is captured without being blocked by an obstacle other than the user. The imaging device 9 may be disposed at any position of the landing 3 as long as it can capture an image for detecting the waiting position, characteristics, and the like of the user. The projection device 10 is disposed at a position where projection onto the area of the landing 3 is not blocked by an obstacle other than the user. The projector 10 may be disposed at any position of the landing 3 as long as it can project an image for display to a user.
Next, an example of a hardware configuration of the display system 8 will be described with reference to fig. 3.
Fig. 3 is a hardware configuration diagram of a main part of the display system 8 of embodiment 1.
The functions of the display system 8 including the display control device 11 can be realized by a processing circuit. The processing circuit is provided with at least one processor 100a and at least one memory 100b. The processing circuit may include the processor 100a and the memory 100b, or may include at least one dedicated hardware 200 instead of them.
In the case where the processing circuit includes the processor 100a and the memory 100b, each function of the display system 8 is realized by software, firmware, or a combination of software and firmware. At least one of the software and the firmware is described as a program. The program is stored in the memory 100b. The processor 100a realizes the functions of the display system 8 by reading out and executing programs stored in the memory 100b.
The processor 100a is also called a CPU (Central Processing Unit), a processing device, an arithmetic device, a microprocessor, a microcomputer, or a DSP. The memory 100b is constituted by a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory).
In the case where the processing circuit includes the dedicated hardware 200, the processing circuit is implemented, for example, by a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
The functions of the display system 8 may be realized collectively by the processing circuit, or each function may be realized individually. Some of the functions of the display system 8 may be implemented by the dedicated hardware 200 while the others are implemented by software or firmware. Thus, the processing circuit implements the functions of the display system 8 through the dedicated hardware 200, software, firmware, or a combination thereof.
In the respective embodiments described below, differences from examples disclosed in other embodiments are described in particular detail. As for the features not described in the respective embodiments below, any features of examples disclosed in other embodiments may be employed.
Embodiment 2
Fig. 4 is a diagram showing an example of landing 3 of elevator 1 according to embodiment 2.
The projection device 10 projects an image using any one of a plurality of regions set in advance as a projection region. In the display system 8 of this example, 9 areas are set in advance. The predetermined plurality of regions include an upper left region A1, an upper center region A2, an upper right region A3, a middle left region B1, a middle center region B2, a middle right region B3, a lower left region C1, a lower center region C2, and a lower right region C3. Here, the areas A2 and B2 are areas of the surface of the landing door 4. The areas A1, A3, B1, and B3 are areas of the wall surface of the landing 3. The areas C1, C2, and C3 are areas of the ground of the landing 3. The areas A2, B2, and C2 are areas in front of the landing door 4.
The attention degree detection unit 16 detects the degree of attention of users for each of the preset regions. In this example, the attention detection unit 16 counts the number of users whose lines of sight are directed to each region and sets that count as the degree of attention of the region.
Next, an operation example of the display system 8 according to embodiment 2 will be described with reference to fig. 5.
Fig. 5 is a flowchart showing an example of the operation of display system 8 according to embodiment 2.
Fig. 5 shows an example of operations related to the attention detection by the attention detection unit 16.
In step S201, the attention detection unit 16 acquires the image of the landing 3 subjected to the conversion processing in the image processing unit 13. After that, the operation of the attention detection unit 16 proceeds to step S202.
In step S202, the attention detection unit 16 selects a user whose line of sight has not yet been determined among the users waiting at the landing 3, and determines the direction of that user's line of sight from the acquired image. Then, in step S203, the attention detection unit 16 adds 1 to the line-of-sight count of the region containing the portion toward which the selected user's line of sight is directed. If that portion is not contained in any region, the attention detection unit 16 may instead add 1 to the count of the region closest to it. After that, the operation of the attention detection unit 16 proceeds to step S204.
In step S204, the attention detection unit 16 determines whether or not there is a user whose line of sight is not determined among users waiting at the landing 3. If the determination result is yes, the operation of the attention detection unit 16 proceeds to step S202. On the other hand, if the determination result is no, the operation of the attention detection unit 16 proceeds to step S205.
In step S205, the attention degree detection unit 16 sets the number of lines of sight directed to each region as the attention degree of the region. The attention degree detection unit 16 outputs the attention degree detected for each region to the region determination unit 17. After that, in step S206, the attention detection unit 16 performs a reset process. Here, the resetting processing includes, for example, processing for setting the number of lines of sight and the attention degree to each region to 0. Then, the operation of the attention detection unit 16 relating to the attention detection ends.
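The tally loop of steps S201 to S206 can be sketched as follows; the rectangular region boxes and the coordinates are illustrative assumptions (the patent's nine regions of fig. 4 could be encoded the same way):

```python
def tally_attention(gaze_targets, regions):
    """Count, per preset region, how many users' lines of sight fall in it
    (steps S202-S205). `regions` maps a name to an (x0, y0, x1, y1) box;
    a gaze outside every box is assigned to the nearest box."""
    counts = {name: 0 for name in regions}
    for gx, gy in gaze_targets:
        def distance(item):
            x0, y0, x1, y1 = item[1]
            dx = max(x0 - gx, 0.0, gx - x1)
            dy = max(y0 - gy, 0.0, gy - y1)
            return dx * dx + dy * dy  # 0 when the gaze is inside the box
        nearest = min(regions.items(), key=distance)[0]
        counts[nearest] += 1
    return counts
```

The reset of step S206 corresponds simply to building a fresh `counts` dictionary on the next call.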
The region determination unit 17, which receives the attention output from the attention detection unit 16, determines the projection region by preferentially selecting a region having a higher attention from among the preset regions. The projection device 10 projects the image onto the projection region determined by the region determining unit 17.
As described above, the attention degree detection unit 16 of the display control apparatus 11 according to embodiment 2 detects the attention degree of the user waiting at the landing 3 for each of the plurality of areas set in advance at the landing 3.
According to such a configuration, the projection region is limited to one of a plurality of regions set in advance. The projection region therefore does not shift with small fluctuations of the users' lines of sight, and the displayed image is easier to view.
The attention detection unit 16 detects the attention of each region by using the number of users whose line of sight is directed to each region.
According to such a configuration, the display control apparatus 11 can detect a region of high attention in a simpler manner.
The attention detection unit 16 may detect the degree of attention of each region using information other than the number of users whose lines of sight are directed to it, for example the duration for which users gaze at each region.
Embodiment 3
Fig. 6 is a diagram illustrating an example of the attention detection by the attention detection unit 16 according to embodiment 3.
In this example, the characteristic detection unit 15 of the display control device 11 classifies users as wheelchair users, low-height users, or other users. A low-height user is, for example, a user who is not a wheelchair user but whose height is below a predetermined value. Other users are users who are neither wheelchair users nor low-height users.
The attention degree detection unit 16 detects the degree of attention for each preset region using the characteristic information detected by the characteristic detection unit 15. The attention degree detection unit 16 stores the predetermined point table shown in fig. 6, which associates a point value with each user characteristic: 10 points for a wheelchair user, 3 points for a low-height person, and 1 point for other users. The point values in the table may be settable as parameters.
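The point table of fig. 6 amounts to a simple lookup; the key names below are hypothetical, only the point values come from the description:

```python
# Point values from fig. 6; the string keys are illustrative names.
POINT_TABLE = {"wheelchair": 10, "low_height": 3, "other": 1}

def points_for(characteristic):
    """Look up the attention weight for a detected user characteristic,
    falling back to the 'other' weight for anything unrecognized."""
    return POINT_TABLE.get(characteristic, POINT_TABLE["other"])
```

Making the values parameters, as the text suggests, would mean loading this dictionary from configuration instead of hard-coding it.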
Next, an operation example of the display system 8 according to embodiment 3 will be described with reference to fig. 7.
Fig. 7 is a flowchart showing an example of the operation of display system 8 according to embodiment 3.
Fig. 7 shows an example of operations of the attention detection by the attention detection unit 16.
In step S301, the attention detection unit 16 acquires an image of the landing 3 subjected to the conversion processing by the image processing unit 13. After that, the operation of the attention detection unit 16 proceeds to step S302.
In step S302, the attention detection unit 16 selects a user whose line of sight orientation is not determined, among users waiting at the landing 3. After that, the operation of the attention detection unit 16 proceeds to step S303.
In step S303, the attention detection unit 16 determines whether or not the characteristics of the selected user are wheelchair users. If the determination result is yes, the operation of the attention detection unit 16 proceeds to step S304. On the other hand, if the determination result is no, the operation of the attention detection unit 16 proceeds to step S305.
In step S304, the attention degree detection unit 16 reads out points corresponding to the wheelchair user from a pre-stored point table. After that, the operation of the attention detection unit 16 proceeds to step S308.
In step S305, the attention detection unit 16 determines whether or not the characteristics of the selected user are low-height users. If the determination result is yes, the operation of the attention detection unit 16 proceeds to step S306. On the other hand, if the determination result is no, the operation of the attention detection unit 16 proceeds to step S307.
In step S306, the attention detection unit 16 reads out the points corresponding to the low-height user from the pre-stored point table. After that, the operation of the attention detection unit 16 proceeds to step S308.
In step S307, the attention degree detection unit 16 reads out points corresponding to other users from a pre-stored point table. After that, the operation of the attention detection unit 16 proceeds to step S308.
In step S308, the attention detection unit 16 determines the line-of-sight orientation of the selected user from the acquired image. Then, in step S309, the attention detection unit 16 adds the points read out according to the characteristics of the user to the points of the region that includes the portion to which the selected user's line of sight is directed. After that, the operation of the attention detection unit 16 proceeds to step S310.
In step S310, the attention detection unit 16 determines whether or not there is a user whose line of sight is not determined among users waiting at the landing 3. If the determination result is yes, the operation of the attention detection unit 16 proceeds to step S302. On the other hand, if the determination result is no, the operation of the attention detection unit 16 proceeds to step S311.
In step S311, the attention degree detection unit 16 sets the number of points in each region as the attention degree of the region. The attention degree detection unit 16 outputs the attention degree detected for each region to the region determination unit 17. After that, in step S312, the attention detection unit 16 performs a reset process. Here, the reset processing includes, for example, processing for setting the number of points and the attention degree of each area to 0. Then, the operation of the attention detection unit 16 relating to the attention detection ends.
The attention detection by the attention detection unit 16 will be described with a more specific example. In this example, 5 users are waiting at the landing 3. Among the waiting users, one wheelchair user and one other user are directing their lines of sight to the area A1. One low-height user and two other users are directing their lines of sight to the area B2. At this time, the attention detection unit 16 detects the points of the area A1, that is, its attention, as 10+1=11. The attention detection unit 16 detects the points of the area B2, that is, its attention, as 3+1+1=5. The attention detection unit 16 detects the attention of the other areas as 0.
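The point-based tallying of steps S301 to S311 and the worked example above can be sketched as follows. The point values (wheelchair = 10, low-height = 3, other = 1) and the area names come from the text; the data structures and function name are illustrative assumptions, not the patent's actual implementation.

```python
# Point table from fig. 6 of the text: points per user characteristic.
POINT_TABLE = {"wheelchair": 10, "low_height": 3, "other": 1}

def detect_attention(users, regions):
    """users: list of (characteristic, gazed_region) pairs, one per waiting user.
    Returns a dict mapping each preset region to its attention (point total),
    mirroring the loop over users in steps S302-S310."""
    attention = {region: 0 for region in regions}
    for characteristic, gazed_region in users:
        points = POINT_TABLE[characteristic]        # steps S303-S307
        if gazed_region in attention:
            attention[gazed_region] += points       # steps S308-S309
    return attention

# The 5-user example from the text.
regions = ["A1", "A2", "A3", "B1", "B2", "B3", "C1", "C2", "C3"]
users = [
    ("wheelchair", "A1"), ("other", "A1"),                   # A1: 10 + 1 = 11
    ("low_height", "B2"), ("other", "B2"), ("other", "B2"),  # B2: 3 + 1 + 1 = 5
]
attention = detect_attention(users, regions)
```

After step S311 the per-region totals are handed to the area determination unit; the reset of step S312 corresponds to rebuilding the zeroed dictionary on the next call.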
The region determination unit 17, which receives the output of the attention from the attention detection unit 16, determines the projection region by preferentially selecting a region having a higher attention from the preset regions. The projection device 10 projects the image onto the projection region determined by the region determining unit 17.
As described above, the display control device 11 according to embodiment 3 includes the characteristic detection unit 15. The characteristic detection unit 15 detects characteristics of a user waiting at the landing 3 from an image captured by the imaging device 9. The characteristics of the user include information about the height of the user's line of sight. The attention detection unit 16 detects the attention of each region using values weighted according to the characteristics, detected by the characteristic detection unit 15, of the users whose lines of sight are directed to that region.
According to this configuration, even when a physically disabled user is present, the display control device 11 can present a display that is easy to notice. Further, it is possible to suppress the display from becoming hard to see due to a user's physique.
Embodiment 4
Fig. 8 is a flowchart showing an example of the operation of display system 8 according to embodiment 4.
Fig. 8 shows an example of the operation of the projection area determination by the area determination unit 17.
The area determination unit 17 of the display control device 11 determines the projection area according to the state of the landing door 4. In this example, the 9 areas shown in fig. 4 are preset. The area determination unit 17 determines the projection area by selecting one of these preset areas.
In step S401, the area determination unit 17 acquires the opening/closing information of the landing door 4 from the control panel 6 via the information acquisition unit 12. Then, in step S402, the area determination unit 17 obtains the waiting position of the user from the position detection unit 14, and obtains the attention degree for each preset area from the attention degree detection unit 16. After that, the operation of the area determination unit 17 proceeds to step S403.
In step S403, the area determination unit 17 selects the 1st candidate for the projection area from among the preset areas. Here, the 1st candidate is selected by the same method as the selection of the projection area itself. In this example, the 1st candidate for the projection area is the area of highest priority, where an area with higher attention and a lower possibility of being blocked by a user has higher priority. The area determination unit 17 then determines whether or not the 1st candidate includes the surface of the landing door 4. In this example, the areas A2 and B2 are the areas that include the surface of the landing door 4. When the 1st candidate includes the surface of the landing door 4, the operation of the area determination unit 17 proceeds to step S404. On the other hand, when the 1st candidate does not include the surface of the landing door 4, the operation proceeds to step S405.
In step S404, the area determination unit 17 determines whether the landing door 4 is closed, using the opening/closing information of the landing door 4 acquired from the control panel 6. In this example, the area determination unit 17 determines that the landing door 4 is not closed when it is fully open, and likewise when it is in the process of opening or closing. When it is determined that the landing door 4 is closed, the operation of the area determination unit 17 proceeds to step S405. On the other hand, when it is determined that the landing door 4 is not closed, the operation proceeds to step S406.
In step S405, the region determination unit 17 determines the selected 1 st candidate as the projection region. Then, the operation of the area determining unit 17 related to the determination of the projection area is completed.
In step S406, the area determination unit 17 selects the 2nd candidate for the projection area from among the preset areas. In this example, the 2nd candidate is the area whose priority is next after the 1st candidate. The area determination unit 17 determines whether or not the 2nd candidate includes the surface of the landing door 4. When the 2nd candidate does not include the surface of the landing door 4, the operation of the area determination unit 17 proceeds to step S407. On the other hand, when the 2nd candidate includes the surface of the landing door 4, the operation proceeds to step S408.
In step S407, the region determination unit 17 determines the selected 2 nd candidate as the projection region. Then, the operation of the area determining unit 17 related to the determination of the projection area is completed.
In step S408, the area determination unit 17 determines an area having the highest priority among areas other than the area including the surface of the landing door 4 as a projection area. Then, the operation of the area determining unit 17 related to the determination of the projection area is completed.
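The selection flow of steps S403 to S408 can be sketched as follows. Since the flow ultimately falls back to the highest-priority area not lying on the door surface (step S408), it is equivalent to scanning the priority-ordered candidates and skipping door-surface areas while the door is not closed. The area names A2 and B2 are the door-surface areas given in the text's example; the function name and list-based interface are illustrative assumptions.

```python
# Areas that include the landing-door surface in the text's 9-area example.
DOOR_SURFACE_REGIONS = {"A2", "B2"}

def determine_projection_region(candidates_by_priority, door_closed):
    """candidates_by_priority: preset areas sorted from highest to lowest
    priority (higher attention and lower chance of being blocked come first).
    Returns the area in which the projection device should project."""
    if door_closed:
        # Door closed: the 1st candidate can be used as-is (S403 -> S404 -> S405).
        return candidates_by_priority[0]
    # Door not closed: take the highest-priority area that does not include
    # the door surface (S404 -> S406 -> S407/S408).
    for region in candidates_by_priority:
        if region not in DOOR_SURFACE_REGIONS:
            return region
    return None  # every candidate lies on the door surface
```

For example, with priority order A2 > B2 > A1 and the door open, the sketch skips A2 and B2 and selects A1, matching the exclusion described in the text.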
The projection device 10 projects the image onto the projection region determined by the region determining unit 17.
As described above, the area determination unit 17 of the display control device 11 according to embodiment 4 determines the projection area so as to exclude the area including the surface of the landing door 4 from the plurality of areas when the landing door 4 of the elevator 1 is not closed.
According to this configuration, it is possible to prevent an image from being projected to a user riding in the car 5 through the opening of the landing door 4 that is opened together with the car door 7. Thus, the user at the landing 3 can easily view the display in the display system 8 of the landing 3. The user can easily notice the displayed information, and thus the convenience of the user is improved.
In addition, for an area that lies partly on the surface of the landing door 4 and partly off it, the area determination unit 17 may likewise exclude that area from the projection area candidates when the landing door 4 is not closed.
Embodiment 5
Fig. 9 is a flowchart showing an example of the operation of display system 8 according to embodiment 5.
Fig. 9 shows an example of the operation performed by the area determination unit 17 to determine the projection area.
The area determination unit 17 of the display control device 11 determines the projection area according to the state of the landing door 4. In this example, the 9 areas shown in fig. 4 are preset. The area determination unit 17 determines the projection area by selecting one of these preset areas. Here, when the area of highest attention cannot be uniquely determined, the area determination unit 17 adopts a preset reference area as the projection area. The reference area is an area selected in advance from among the plurality of areas, and may be settable as a parameter.
The region determining unit 17 according to embodiment 5 operates in the same manner as the region determining unit 17 according to embodiment 4 in steps S401 to S402 and steps S403 to S408. After the waiting position and the attention degree are acquired in step S402, the operation of the area determination unit 17 in embodiment 5 proceeds to step S451.
In step S451, the area determination unit 17 determines whether or not a user is present at the landing 3. For example, when the attention of every area is 0, the area determination unit 17 determines that no user is present at the landing 3. When no user is present at the landing 3, the attentions of all the areas are 0 with no difference between them, so the area of highest attention cannot be uniquely determined. When it is determined that a user is present at the landing 3, the operation of the area determination unit 17 proceeds to step S452. On the other hand, when it is determined that no user is present at the landing 3, the operation proceeds to step S453.
In step S452, the area determination unit 17 calculates the difference in attention between the two areas ranked highest in attention among the plurality of areas. These two areas are, for example, the area of highest attention and the area whose attention is next highest. Alternatively, when several areas share the highest attention, the top two may be any two of those areas. The area determination unit 17 determines whether the calculated difference in attention is smaller than a predetermined threshold. When the difference in attention between the top two areas is smaller than the threshold, the area of highest attention is not regarded as uniquely determined. When it is determined that the difference is smaller than the threshold, the operation of the area determination unit 17 proceeds to step S453. On the other hand, when it is determined that the difference is not smaller than the threshold, the operation proceeds to step S403.
In step S453, the region determination unit 17 determines a preset reference region as a projection region. Then, the operation of the area determining unit 17 related to the determination of the projection area is completed.
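The fallback logic of steps S451 to S453 can be sketched as follows. When the gap between the top two attentions is at or above the threshold, the actual flow continues with the embodiment-4 selection (step S403); the sketch simplifies that branch to returning the top area. The function name and dictionary interface are illustrative assumptions.

```python
def select_projection_region(attention_by_region, reference_region, threshold):
    """Return the preset reference region when the area of highest attention
    cannot be uniquely determined: either no user is present (all attentions
    are 0, step S451) or the top-two attention gap is below the threshold
    (step S452). Otherwise return the top area (simplified stand-in for the
    embodiment-4 flow entered at step S403)."""
    ranked = sorted(attention_by_region.items(), key=lambda kv: kv[1], reverse=True)
    top_region, top_value = ranked[0]
    if top_value == 0:                     # no user at the landing (S451 -> S453)
        return reference_region
    if len(ranked) > 1 and top_value - ranked[1][1] < threshold:
        return reference_region            # gap too small (S452 -> S453)
    return top_region
```

With the embodiment-3 example (A1 = 11, B2 = 5) and a threshold of 3, the gap of 6 is large enough and A1 is used; if the attentions were 6 and 5, the reference area would be used instead.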
As described above, the area determination unit 17 of the display control device 11 according to embodiment 5 determines whether the difference in attention between the two areas ranked highest in attention among the plurality of areas is smaller than a predetermined threshold. When the difference in attention is smaller than the threshold, the area determination unit 17 determines a preset reference area among the plurality of areas as the projection area.
According to this configuration, even when the area of highest attention cannot be uniquely determined, the projection area is still determined. Thereby, the user at the landing 3 can easily view the display of the display system 8 at the landing 3. The user can easily notice the displayed information, and thus the convenience of the user is improved.

Claims (7)

1. A display control device for an elevator, wherein the display control device for an elevator comprises:
a position detection unit that detects a waiting position of a user waiting at a landing, from an image captured by a photographing device provided at the landing of an elevator;
a degree of attention detection unit that detects a degree of attention of a user waiting at the hall from an image captured by the imaging device, with respect to a region of the hall where an image can be projected by a projection device provided at the hall; and
and a region determination unit that, based on information including the waiting position detected by the position detection unit and the attention degree detected by the attention degree detection unit, preferentially determines a region having a higher degree of attention and a lower possibility of the projection being blocked by a user waiting at the landing, as the projection region in which the image is projected by the projection device.
2. The display control apparatus of an elevator according to claim 1, wherein,
the elevator display control device includes an image conversion unit that converts an image to be projected by the projection device based on the projection area determined by the area determination unit.
3. The display control apparatus of an elevator according to claim 1 or 2, wherein,
the attention detection unit detects the attention of a user waiting at the landing for each of a plurality of areas preset at the landing.
4. The display control apparatus of an elevator according to claim 3, wherein,
the attention degree detection unit detects the attention degree of each of the plurality of areas using the number of users whose line of sight is directed to each of the plurality of areas.
5. The display control apparatus of an elevator according to claim 4, wherein,
the elevator display control device comprises a characteristic detection unit that detects, from an image captured by the imaging device, characteristics of a user waiting at the landing, the characteristics including information related to the height of the user's line of sight,
the attention degree detection unit detects the attention degree of each of the plurality of areas using a value weighted according to the characteristics detected by the characteristics detection unit of the user who has a line of sight directed to each of the plurality of areas.
6. The display control apparatus of an elevator according to any one of claims 3 to 5, wherein,
the area determination unit determines a projection area of an image to be projected by the projection device so as to exclude an area including a surface of the landing door from the plurality of areas when the landing door of the elevator is not closed.
7. The display control apparatus of an elevator according to any one of claims 3 to 6, wherein,
the region determination unit determines a predetermined reference region among the plurality of regions as the projection region of the image to be projected by the projection device when a difference in degree of attention between the two regions ranked highest in attention among the plurality of regions is smaller than a predetermined threshold value.
CN202110625983.8A 2020-12-24 2021-06-04 Display control device for elevator Active CN114671311B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-215212 2020-12-24
JP2020215212A JP6962439B1 (en) 2020-12-24 2020-12-24 Elevator display control device

Publications (2)

Publication Number Publication Date
CN114671311A CN114671311A (en) 2022-06-28
CN114671311B true CN114671311B (en) 2023-12-26

Family

ID=78409828

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110625983.8A Active CN114671311B (en) 2020-12-24 2021-06-04 Display control device for elevator

Country Status (2)

Country Link
JP (1) JP6962439B1 (en)
CN (1) CN114671311B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009116510A (en) * 2007-11-05 2009-05-28 Fujitsu Ltd Attention degree calculation device, attention degree calculation method, attention degree calculation program, information providing system and information providing device
JP2013105384A (en) * 2011-11-15 2013-05-30 Nippon Hoso Kyokai <Nhk> Attention degree estimating device and program thereof
CN103927670A (en) * 2013-01-10 2014-07-16 上海通用汽车有限公司 Method of quantizing region attention degree of object
CN104743419A (en) * 2013-12-25 2015-07-01 株式会社日立制作所 Image monitoring device and elevator monitoring device
CN105293235A (en) * 2015-10-09 2016-02-03 中国矿业大学 Non-contact detection device and method for pitch circle diameter of steel wire rope for mining
CN105293226A (en) * 2014-06-10 2016-02-03 东芝电梯株式会社 Calling registration system of lift hall
JP2016066369A (en) * 2015-12-08 2016-04-28 株式会社Pfu Information processing device, method, and program
CN105868782A (en) * 2016-03-30 2016-08-17 苏州大学 Attention rate statistical system, information delivery device, and electronic advertisement display screen
CN107113391A (en) * 2014-12-17 2017-08-29 索尼公司 Information processor and method
US10091482B1 (en) * 2017-08-04 2018-10-02 International Business Machines Corporation Context aware midair projection display
CN109384114A (en) * 2017-08-14 2019-02-26 奥的斯电梯公司 Elevator safety and control system
CN110316630A (en) * 2019-06-03 2019-10-11 浙江新再灵科技股份有限公司 The deviation method for early warning and system of elevator camera setting angle
CN110577121A (en) * 2018-06-08 2019-12-17 株式会社日立大厦系统 Elevator system and group management control method of elevator
CN111353461A (en) * 2020-03-11 2020-06-30 京东数字科技控股有限公司 Method, device and system for detecting attention of advertising screen and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5865729B2 (en) * 2012-02-24 2016-02-17 東芝エレベータ株式会社 Elevator system
JP2016199390A (en) * 2015-04-14 2016-12-01 三菱電機株式会社 Elevator guide device
JP6503254B2 (en) * 2015-07-30 2019-04-17 株式会社日立製作所 Group control elevator system
WO2019171530A1 (en) * 2018-03-08 2019-09-12 三菱電機株式会社 Elevator device and elevator control method


Also Published As

Publication number Publication date
JP2022100925A (en) 2022-07-06
JP6962439B1 (en) 2021-11-05
CN114671311A (en) 2022-06-28

Similar Documents

Publication Publication Date Title
JP6377797B1 (en) Elevator boarding detection system
JP6092433B1 (en) Elevator boarding detection system
JP6377796B1 (en) Elevator boarding detection system
JP5483702B2 (en) Elevator stagnant detector
JP6367411B1 (en) Elevator system
JP6242966B1 (en) Elevator control system
CN106966277A (en) The seating detecting system of elevator
JP6317004B1 (en) Elevator system
CN110294391B (en) User detection system
JP2018162115A (en) Elevator boarding detection system
JP6693627B1 (en) Image processing device
CN113428752B (en) User detection system for elevator
JP2018158842A (en) Image analyzer and elevator system
JP2002293484A (en) Elevator control device
CN114671311B (en) Display control device for elevator
CN112429609B (en) User detection system for elevator
CN111717768B (en) Image processing apparatus and method
CN112875446A (en) Elevator control device and elevator control method
JP6968943B1 (en) Elevator user detection system
CN111960206B (en) Image processing apparatus and marker
JPH10312448A (en) Number of person detector and elevator control system using the same
WO2021090447A1 (en) Control system and control method for elevator
JP7367174B1 (en) elevator system
JP6729980B1 (en) Elevator user detection system
CN112551292B (en) User detection system for elevator

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant