WO2023017588A1 - Control device, control method, and computer-readable storage medium - Google Patents


Info

Publication number
WO2023017588A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
target robot
controlling
robots
collected data
Prior art date
Application number
PCT/JP2021/029655
Other languages
French (fr)
Japanese (ja)
Inventor
英樹 渡辺
一浩 戸崎
大智 中西
博敏 坂井
Original Assignee
日本電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社 filed Critical 日本電気株式会社
Priority to PCT/JP2021/029655 priority Critical patent/WO2023017588A1/en
Publication of WO2023017588A1 publication Critical patent/WO2023017588A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions

Definitions

  • This disclosure relates to technology for controlling robots.
  • Robots that perform tasks and communicate with people may be installed in facilities such as government offices, shopping malls, hotels, warehouses, and hospitals. Such robots may be built into a wireless communication system and run automatically by control signals transmitted from a remote location.
  • Patent Document 1, relating to a technique for controlling a robot from a remote location, discloses a technique in which the current position of a robot is estimated based on an image captured by the robot and movement control information instructing the robot to move, and an image of the robot's surroundings corresponding to the estimated current position is generated.
  • Patent Document 2 discloses a technology related to a communication method in which data is directly exchanged between terminals without going through a base station.
  • the robot may be controlled to travel toward a predetermined location.
  • a plurality of persons come and go to the facility as described above. Due to the traffic of people, the situation of passages in the facility changes. For example, there may be a crowd of people in the aisle, or a person may appear from the corner of the aisle. In such cases, the robot may not be able to run or may collide with a person. Therefore, it is necessary to control the robot according to the situation.
  • Patent Document 1 describes avoiding obstacles around the target robot. However, such control is based on a limited range of information available only from the target robot. Therefore, there is room for improvement in terms of performing control according to the situation as described above.
  • Patent document 2 does not describe controlling the robot according to the situation.
  • The present disclosure has been made in view of the above problems, and one of its objects is to provide a control device or the like capable of more appropriately controlling an automatically traveling robot according to the situation.
  • A control device according to one aspect of the present disclosure includes acquisition means for acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots, and control means for controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
  • A control method according to one aspect of the present disclosure acquires collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots, and controls a target robot, which is at least one of the plurality of robots, based on the collected data.
  • A storage medium according to one aspect of the present disclosure stores a program for causing a computer to execute a process of acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots, and a process of controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
  • FIG. 1 is a diagram schematically showing an example of the configuration of a control system including a control device according to a first embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing an example of the functional configuration of the control device according to the first embodiment of the present disclosure.
  • FIG. 3 is a flowchart showing an example of the operation of the control device according to the first embodiment of the present disclosure.
  • FIG. 4 is a block diagram showing an example of the configuration of a control system according to a second embodiment of the present disclosure.
  • FIG. 5 is a diagram showing an example of robots placed in a facility according to the second embodiment of the present disclosure.
  • FIG. 6 is a diagram showing another example of robots placed in the facility according to the second embodiment of the present disclosure.
  • FIG. 7 is an output example of information about an event according to the second embodiment of the present disclosure.
  • FIG. 8 is a flowchart showing an example of the operation of the control device according to the second embodiment of the present disclosure.
  • FIG. 9 is a diagram showing an example of an imaging device and sensors installed in a facility according to Modification 1 of the present disclosure.
  • FIG. 10 is a diagram showing an example of robots arranged in a facility according to Modification 2 of the present disclosure.
  • FIG. 11 is a block diagram showing an example of the configuration of a control system according to a third embodiment of the present disclosure.
  • FIG. 12 is a diagram schematically showing an example of the configuration of the control system according to the third embodiment of the present disclosure.
  • FIG. 13 is a flowchart showing an example of the operation of the control device according to the third embodiment of the present disclosure.
  • FIG. 14 is a block diagram showing an example of the hardware configuration of a computer device that implements the control devices according to the first, second, and third embodiments of the present disclosure.
  • FIG. 1 is a diagram schematically showing an example of the configuration of a control system 1000 including a control device 100.
  • The control system 1000 includes the control device 100, n robots 200 (where n is a natural number of 2 or more), and a base station 300.
  • Each of the robots 200 may be individually denoted as robot 200-1, 200-2, ..., 200-n; when the robots need not be distinguished from one another, they are simply referred to as the robot 200.
  • the control device 100 is communicably connected to the robot 200 via the base station 300 .
  • the control device 100 and the robot 200 may communicate via a different communication device such as a router.
  • the control device 100 controls the robot 200.
  • the robot 200 operates under the control of the control device 100, but may also operate under the autonomous control of the robot itself.
  • the robot 200 may be, for example, a robot installed in facilities such as government offices, shopping malls, hotels, warehouses, and hospitals. In this case, the robot 200 may be, for example, a guide robot that guides the user to a predetermined location.
  • the robot 200 may be an unmanned robot that runs automatically.
  • FIG. 2 is a block diagram showing an example of the functional configuration of the control device 100 of the first embodiment.
  • the control device 100 includes an acquisition section 110 and a control section 120 .
  • the acquisition unit 110 acquires various data from the multiple robots 200 .
  • each of the robots 200 is equipped with various sensors such as a sound sensor, a human sensor, a distance sensor, and a photographing device.
  • Each robot 200 collects data obtained from, for example, various sensors and imaging devices.
  • data collected by the robot 200 may also be referred to as collected data.
  • Each of the robots 200 transmits collected data to the control device 100, for example.
  • the collected data also includes position information of the robot 200, for example.
  • the acquisition unit 110 acquires collected data including position information of each of the plurality of robots 200 that automatically travel and data collected by each of the plurality of robots.
  • Acquisition unit 110 is an example of acquisition means.
  • the control unit 120 controls the robot 200 based on the collected data. For example, the control unit 120 detects the occurrence of an event from data collected from the robots 200-1, 200-2, and 200-3.
  • An event indicates, for example, a situation different from normal times. For example, events include the occurrence of obstacles, the occurrence of incidents and accidents, and the occurrence of sudden illness such as the collapse of a person.
  • The control unit 120 controls the robot 200 according to the event. For example, when an obstacle appears on the route along which the robot 200-1 moves, the control unit 120 controls the robot 200-1 to stop, decelerate, or change direction so as to avoid the obstacle, or controls it to change its route so that it does not pass through places that are difficult to pass.
  • The control unit 120 may also control the robot 200-1 to go to the place where the event occurred. That is, the control unit 120 may control the robot 200 so as to dynamically collect information.
  • In this way, the control unit 120 may control one of the plurality of robots 200 based on the collected data acquired from the plurality of robots 200.
  • the robot 200 to be controlled in the present disclosure may be referred to as a target robot.
  • control unit 120 controls the target robot, which is at least one of the plurality of robots 200, based on the collected data.
  • Control unit 120 is an example of control means.
  • each step in a flowchart or a sequence diagram is expressed using a number attached to each step, such as “S1”.
  • FIG. 3 is a flowchart explaining an example of the operation of the control device 100.
  • the acquisition unit 110 acquires collected data including position information of each of the plurality of automatically traveling robots 200 and data collected by each of the plurality of robots 200 (S1). Then, the control unit 120 controls the target robot, which is at least one of the plurality of robots 200, based on the collected data (S2).
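  • As a concrete illustration of S1 and S2, the following is a minimal Python sketch of the acquire-then-control loop. The robot client objects, their report() and command() methods, and the obstacle-based policy are assumptions for illustration; the disclosure does not specify an implementation.

```python
from dataclasses import dataclass, field

@dataclass
class CollectedData:
    robot_id: str
    position: tuple                      # (x, y) position information
    sensor_data: dict = field(default_factory=dict)

class ControlDevice:
    """Sketch of the acquisition means (S1) and control means (S2)."""

    def __init__(self, robots):
        # robots: dict mapping robot_id -> a client object for that robot
        # (assumed to offer report() and command(); not part of the disclosure).
        self.robots = robots

    def acquire(self):
        # S1: gather position information and collected data from every robot.
        return [robot.report() for robot in self.robots.values()]

    def control(self, collected):
        # S2: control at least one target robot based on the pooled data.
        # One possible policy: stop any robot that reports an obstacle.
        for data in collected:
            if data.sensor_data.get("obstacle"):
                self.robots[data.robot_id].command("stop")

    def step(self):
        self.control(self.acquire())
```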
  • control device 100 uses data obtained from multiple robots to control the target robot. Therefore, the control device 100 can perform control according to a wider variety of situations than, for example, when the target robot is controlled by judging the situation from data obtained only from the target robot. That is, the control device 100 can more appropriately control the automatically traveling robot according to the situation.
  • FIG. 4 is a block diagram showing an example of the configuration of the control system 1000 of the second embodiment.
  • a control system 1000 includes a control device 100, a robot 200, and a base station 300, as in the first embodiment.
  • the control device 100 and the robot 200 communicate via the base station 300, but the network constructed in the control system 1000 need not be limited to a specific method.
  • the network constructed in the control system 1000 may be a wireless LAN (Local Area Network), a public line network, a mobile data communication network, or a combination of these networks.
  • Examples of mobile data communication networks include 3rd generation mobile communication systems (3G: 3rd Generation), LTE (Long Term Evolution), 4th generation mobile communication systems (4G: 4th Generation), and 5th generation mobile communication systems (5G: 5th Generation).
  • A 5G network built as a dedicated network within a specific area is called local 5G.
  • the control system 1000 of the present disclosure may be built by local 5G.
  • the base station 300 is a base station supporting local 5G.
  • Base station 300 may be virtually constructed by software on a server.
  • By using local 5G, the control device 100 can acquire a large amount of data from the plurality of robots 200 with lower delay than with other mobile data communication networks or the like. That is, the control device 100 can control the plurality of robots 200 more smoothly. Security can also be ensured by using a dedicated network such as local 5G.
  • the acquisition unit 110 acquires the collected data from the plurality of robots through the local 5G network.
  • the control unit 120 controls the target robot through a local 5G network.
  • the robot 200 is equipped with an imaging device, a sensor, an input device, an output device, and the like. Each device mounted on the robot 200 is merely an example, and a different device may be mounted. Also, there may be two or more of each of the photographing device, the sensor, the input device, the output device, and the like. In this embodiment, robots 200-1, 200-2, and 200-3 each have the same configuration, but each robot 200 may have a different configuration.
  • the robot 200 is a robot installed in facilities such as government offices, shopping malls, hotels, warehouses, and hospitals.
  • the robot 200 may make a response corresponding to the input.
  • the robot 200 is a guide robot
  • the user uses an input device to input information indicating a predetermined location.
  • the robot 200 may output information indicating the predetermined location using an output device, or may move toward the predetermined location.
  • Such control of responses to inputs may be performed by a device different from the robot 200 , such as the control device 100 , or may be performed autonomously in the robot 200 .
  • the input device may be a touch panel, keyboard, microphone, or the like. That is, the user may input information indicating a predetermined location by pressing a touch panel or keyboard, or may input information indicating a predetermined location by uttering voice.
  • Output devices may be, for example, displays, lamps, speakers, and the like.
  • the robot 200 generates a captured image using an imaging device mounted on the robot 200 . That is, the robot 200 can photograph the surroundings of the robot 200 using the photographing device. Henceforth, the photographing by the photographing device mounted on the robot 200 may also be expressed as the photographing by the robot 200 . Further, the robot 200 can collect various sensor data from sensors mounted on the robot 200 . For example, if the sensor is a ranging sensor, the robot 200 can collect distance information to surrounding objects. Also, if the sensor is a human sensor, the robot 200 can detect a person within a predetermined range near the robot 200 . If the sensors are sound sensors, the robot 200 can detect sounds generated within a predetermined range near the robot 200 .
  • the robot 200 has a function of acquiring position information.
  • robot 200 may include a system that computes location information from radio waves from base station 300 or other wireless devices.
  • the robot 200 may be equipped with a GNSS (Global Navigation Satellite System).
  • the robot 200 associates data including captured images and sensor data with position information and transmits the data to the control device 100 .
  • the robot 200 may transmit route information indicating a moving route to the control device 100 .
  • the robot 200 may transmit, as route information, information indicating a place to pass through to the predetermined point.
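  • For illustration, one possible shape of such a collected-data message is sketched below. The JSON framing and field names are assumptions; the disclosure does not fix a message format.

```python
import json
import time

def build_collected_data(robot_id, position, image_bytes, sensor_readings, route):
    """Assemble one collected-data message of the kind described above.

    The JSON framing and field names are assumptions for illustration;
    the disclosure does not fix a wire format.
    """
    return json.dumps({
        "robot_id": robot_id,
        "timestamp": time.time(),
        "position": list(position),      # (x, y) in facility coordinates
        "image": image_bytes.hex(),      # captured image, hex-encoded
        "sensors": sensor_readings,      # e.g. {"distance_m": 1.8, "sound_db": 62}
        "route": route,                  # places the robot plans to pass through
    })

# Example: a robot reporting its position, an image, sensor data, and its route.
message = build_collected_data(
    "robot-200-1", (12.5, 3.0), b"\x89PNG...", {"distance_m": 1.8}, ["A", "B"]
)
```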
  • control device 100 includes an acquisition section 110 and a control section 120 .
  • Acquisition unit 110 acquires collected data from robot 200 .
  • Collected data includes captured images, sensor data, and position information collected by robot 200 .
  • the control unit 120 includes an event detection unit 121 and a robot control unit 122.
  • the event detection unit 121 detects events based on collected data. For example, the event detection unit 121 detects the occurrence of an obstacle as an event from the captured image of the collected data.
  • One example of the occurrence of an obstacle is a crowd. For example, when a plurality of people appear in the captured image and the movement of those people is small, the event detection unit 121 detects the occurrence of a crowd.
  • A situation in which the robot 200 cannot pass easily, such as when the passage is crowded or when things are scattered on the passage, is also included as an example of the occurrence of an obstacle.
  • the event detection unit 121 may detect the occurrence of an incident or an accident as an event.
  • The event detection unit 121 detects the occurrence of an incident or an accident by, for example, detecting a person holding a knife or a firearm from a captured image, or detecting a shout from sensor data. Further, the event detection unit 121 may detect the occurrence of an emergency patient by detecting a crouching person from the captured image. In this way, the event detection unit 121 detects events based on the collected data.
  • the event detection unit 121 is an example of event detection means.
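  • One way to read the crowd-detection rule above ("multiple people appear and their movement is small") is as a threshold test over per-frame person detections, as in the following sketch. The detect_people function is hypothetical; a real implementation would substitute an actual person detector and tracker.

```python
def detect_crowd(frames, detect_people, min_people=5, max_motion=10.0):
    """Flag a crowd when many people are visible and barely moving.

    frames: consecutive captured images from one robot.
    detect_people: hypothetical detector, image -> list of (x, y) centroids.
    """
    if len(frames) < 2:
        return False
    tracks = [detect_people(frame) for frame in frames]
    if min(len(t) for t in tracks) < min_people:
        return False                     # not enough people in every frame
    # Crude motion proxy: average displacement between sorted centroids of
    # consecutive frames (a real system would track individual identities).
    motion = 0.0
    for prev, curr in zip(tracks, tracks[1:]):
        pairs = list(zip(sorted(prev), sorted(curr)))
        motion += sum(abs(ax - bx) + abs(ay - by)
                      for (ax, ay), (bx, by) in pairs) / len(pairs)
    motion /= len(tracks) - 1
    return motion < max_motion           # many people, little movement: crowd
```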
  • the event detection unit 121 calculates the place where the event occurs.
  • the event detection unit 121 can calculate the place where the event occurs from the position information of the robot 200, the map information of the facility, and the like.
  • the facility map information may be stored in advance in a storage device (not shown) mounted on the control device 100 or a storage device (not shown) communicable with the control device 100 .
  • the event detection unit 121 may predict event occurrence.
  • the event detection unit 121 may predict the appearance of an obstacle from collected data.
  • the event detection unit 121 may predict the place where a person appears based on the direction in which the person moves calculated from the collected data and the map information of the facility.
  • the robot control unit 122 controls the target robot based on the event detected by the event detection unit 121.
  • The robot control unit 122 is an example of robot control means. Here, an example of control performed by the robot control unit 122 will be described with reference to FIGS. 5 and 6.
  • FIG. 5 is a diagram showing an example of robots 200 placed in a facility.
  • robots 200-1, 200-2, and 200-3 are arranged within the facility. Hatched areas indicate impassable areas such as pillars and walls.
  • people are gathered in the vicinity of the point A.
  • the robot 200-2 is photographing a range including the point A.
  • the event detection unit 121 detects that a crowd has occurred at the point A based on the collected data including the photographed data photographed by the robot 200-2.
  • the robot control unit 122 controls the robot 200-3 to move to the vicinity of the point A, for example.
  • the control device 100 can acquire the collected data related to the event that occurred near the point A also from the robot 200-3.
  • the robot controller 122 may control the target robot to collect information about the event that has occurred.
  • The target robot may be selected in various ways. For example, assume that the event detection unit 121 detects that an event has occurred at the point A. In this case, the robot control unit 122 may select the robot 200 closest to the point A as the target robot. At this time, the robot control unit 122 may select a plurality of robots 200 in order of proximity to the point A. Further, the robot control unit 122 may select the robots 200 within a predetermined range from the point A. Further, the robot control unit 122 may select, from among the plurality of robots 200, a robot 200 that has not received an instruction from a user.
  • A robot 200 that has not received an instruction from a user is, for example, a robot that is not accepting an input and is not making a response corresponding to an input.
  • Alternatively, the robot control unit 122 may select a robot 200 other than a robot that is guiding a user to a predetermined point or transporting an object according to an instruction from a user.
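  • The selection strategies just described might be combined as in the following sketch; the position and busy attributes (the latter marking a robot that has received an instruction from a user) are assumed names, and the policy shown is only one possible reading.

```python
import math

def select_targets(robots, event_pos, k=1, max_range=None):
    """Pick target robots for an event at event_pos.

    robots: objects with .position (x, y) and .busy (True while the robot
    is serving a user); both attribute names are assumptions.
    Returns up to k idle robots, nearest to the event first, optionally
    restricted to a predetermined radius max_range around the event.
    """
    def distance(robot):
        return math.hypot(robot.position[0] - event_pos[0],
                          robot.position[1] - event_pos[1])

    candidates = [r for r in robots if not r.busy]   # skip robots serving users
    if max_range is not None:
        candidates = [r for r in candidates if distance(r) <= max_range]
    return sorted(candidates, key=distance)[:k]      # nearest first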
  • FIG. 6 is a diagram showing another example of the robot 200 arranged in the facility.
  • robots 200-1, 200-2, and 200-3 are arranged in the facility as in the example of FIG.
  • hatched areas indicate impassable areas such as pillars and walls.
  • robot 200-1 is moving toward point B in order to guide the user there.
  • the robot 200-2 is photographing a range including the point C.
  • event detection unit 121 detects that a crowd has occurred at point C, based on collected data including photographed data photographed by robot 200-2.
  • the robot control unit 122 controls the robot 200-1 so that the robot 200-1 moves along a route that does not pass through the point C.
  • For example, the robot control unit 122 controls the robot 200-1 to move in the X direction in FIG. 6.
  • Further, when it is predicted that a person will appear at the point D, the robot control unit 122 controls the robot 200-1 to avoid the person. For example, before the robot 200-1 passes the point D, the robot control unit 122 controls the robot 200-1 to decelerate or stop. In this way, the robot control unit 122 may control the moving target robot so as to avoid an event that has occurred or an event that is predicted to occur.
  • Here too, the target robot may be selected in various ways. For example, when the event detection unit 121 detects the occurrence of an event at the point C, the robot control unit 122 may select, based on the route information, a robot 200 scheduled to pass near the point C from among the plurality of robots 200. In this case, if no robot passes near the point C, no robot 200 need be controlled according to the event.
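  • The route-information check described here can be sketched as a point-to-segment distance test over each robot's planned route. Representing a route as a polyline of waypoints is an assumption for illustration.

```python
import math

def route_passes_near(route, point, radius):
    """True if any segment of a waypoint polyline comes within radius of point."""
    px, py = point
    for (x1, y1), (x2, y2) in zip(route, route[1:]):
        dx, dy = x2 - x1, y2 - y1
        if dx == 0 and dy == 0:
            d = math.hypot(px - x1, py - y1)
        else:
            # Project the point onto the segment and clamp to its endpoints.
            t = ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)
            t = max(0.0, min(1.0, t))
            d = math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))
        if d <= radius:
            return True
    return False

def robots_to_reroute(robots, event_pos, radius=5.0):
    # Only robots whose planned route passes near the event need a new route;
    # if none do, no robot needs to be controlled for this event.
    return [r for r in robots if route_passes_near(r.route, event_pos, radius)]
```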
  • the robot control unit 122 may control the robot 200 to output information regarding the detected event. For example, in the example of FIG. 6, when it is predicted that a person will appear at point D, the output device of robot 200-1 may output that the person will appear.
  • FIG. 7 is an output example of information about an event. As shown in FIG. 7, the robot control unit 122 may control the display, which is an example of the output device, to display "A person is jumping out. Please be careful." The content and method of output are not limited to this example. For example, when the occurrence of an incident or accident is detected, the robot control unit 122 may display information indicating that an incident or accident has occurred on the display of the robot 200, output it audibly using a speaker, or indicate it with the light of a lamp.
  • FIG. 8 is a flowchart showing an example of the operation of the control device 100.
  • the acquisition unit 110 acquires collected data from each of the robots 200 (S101). If the event detection unit 121 does not detect the occurrence of an event from the collected data (“No” in S102), the flow ends. At this time, the control device 100 repeats the flow, for example.
  • If the event detection unit 121 detects the occurrence of an event ("Yes" in S102), the robot control unit 122 selects a target robot (S103). For example, when collecting information about an event that has occurred, the robot control unit 122 may select robots 200 that are within a predetermined range from the place where the event occurred as target robots. Further, for example, when controlling the robot 200 to avoid an event that has occurred, the robot control unit 122 may select a robot 200 that passes through the place where the event occurred as the target robot. Then, the robot control unit 122 controls the selected robot 200 (that is, the target robot) according to the event (S104). For example, the robot control unit 122 may control the target robot to move near the location where the event occurred. Further, for example, the robot control unit 122 may control the target robot to avoid the event.
  • In S103, the robot control unit 122 may select a plurality of target robots. The robot control unit 122 may then control some of the plurality of target robots to move to the vicinity of the location where the event occurred, and control the other target robots to avoid the event.
  • As described above, the control device 100 of the second embodiment acquires collected data including the position information of each of the plurality of automatically traveling robots and the data collected by each of the plurality of robots.
  • A target robot, which is at least one of the plurality of robots, is controlled based on the collected data.
  • the control device 100 can perform control according to a wider variety of situations than, for example, when controlling the target robot by determining the situation from data obtained only from the target robot. That is, the control device 100 can more appropriately control the automatically traveling robot according to the situation.
  • The collected data may include captured images generated by imaging devices mounted on the plurality of robots.
  • The control device 100 of the second embodiment may detect an event based on the collected data and control the target robot based on the detected event.
  • the control device 100 can control the target robot according to the event. That is, the control device 100 can more appropriately perform control according to the situation.
  • When the target robot is moving toward a predetermined location, the control device 100 of the second embodiment may control the target robot to move along a route that does not pass through the location where the detected event occurred. For example, assume that there is a place that is difficult for the robot to pass due to the occurrence of an event. In this case, the control device 100 can cause the target robot to avoid the difficult-to-pass place. In addition, since the control device 100 detects the occurrence of an event based on the collected data obtained from a plurality of robots, it can, for example, cause the target robot to avoid a place that is difficult to pass before the target robot even approaches it.
  • control device 100 of the second embodiment may control the target robot to move toward the location where the detected event occurred.
  • control device 100 can increase the amount of collected data collected in the vicinity of the place where the event occurred, so that more detailed information about the event can be obtained.
  • In the second embodiment, the collected data acquired from the plurality of robots 200 is used to control the target robot; however, other data may also be used to control the target robot.
  • the control device 100 may acquire data obtained from the stationary imaging device and sensor.
  • a stationary imaging device may be, for example, a surveillance camera installed in a facility.
  • the stationary sensor may be a microphone installed integrally with the surveillance camera, or may be an independent microphone.
  • FIG. 9 is a diagram showing an example of an imaging device and sensors installed in a facility.
  • a monitoring camera is installed as a stationary photographing device in the situation shown in FIG. 5, and an independent microphone is installed as a stationary sensor.
  • the acquisition unit 110 acquires the captured image generated by the stationary imaging device and the sensor data generated by the sensor.
  • the event detection unit 121 detects occurrence of an event based on the acquired data.
  • For example, the event detection unit 121 detects that people are gathering near the point A based on the collected data acquired from the robot 200-2 and the photographed image acquired from the stationary photographing device.
  • the event detection unit 121 also detects that a specific noise has occurred around the sensor based on sensor data acquired from the stationary sensor.
  • the event detection unit 121 may detect that an event has occurred at the point A from such information.
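  • As a sketch, corroborating an event from a robot's report together with the stationary camera and microphone might look like the following; the report format, the two-source rule, and the tolerance are illustrative assumptions.

```python
def corroborated_event(robot_reports, camera_reports, mic_reports,
                       location, tolerance=3.0):
    """Declare an event at `location` only when at least two independent
    source types (robot, fixed camera, fixed microphone) report activity
    near it. Report format and threshold are assumptions."""
    def near(report):
        x, y = report["position"]
        return abs(x - location[0]) + abs(y - location[1]) <= tolerance

    source_types = 0
    source_types += any(near(r) for r in robot_reports)
    source_types += any(near(c) for c in camera_reports)
    source_types += any(near(m) for m in mic_reports)
    return source_types >= 2
```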
  • control device 100 may further acquire another captured image from an imaging device different from that of the plurality of robots 200 and acquire other sensor data from a sensor different from that of the plurality of robots 200 .
  • the control device 100 may then control the robot 200 based on collected data, other captured images, and other sensor data.
  • the control device 100 can grasp the situation in more detail, so that it is possible to control the robot 200 according to the situation.
  • FIG. 10 is a diagram showing an example of robots 200 placed in a facility.
  • a guide robot that guides users is placed in a two-story facility.
  • robot 200-1 guides the user from the first floor to the second floor.
  • Since the robot 200-1 cannot get on the elevator, there are cases where another robot 200 takes over guiding the user on the second floor.
  • In this case, the event detection unit 121 of the control device 100 detects, as an event, from the route information of the robot 200-1 and the like, that a handover of guidance will occur at the elevator on the second floor. Then, the robot control unit 122 selects, from among the other robots 200, a target robot to take over the guidance.
  • the robot 200-2 is guiding another user. Therefore, the robot 200-3 is selected as the target robot and controlled to take over the guidance.
  • the robot control unit 122 may select the robot 200 on the second floor and near the elevator as the target robot.
  • FIG. 11 is a block diagram showing an example of the configuration of the control system 1001 of the third embodiment. As shown in FIG. 11 , control system 1001 includes control device 101 , robot 200 and base station 300 .
  • FIG. 12 is a diagram schematically showing an example of the configuration of the control system 1001.
  • the robot 200 guides the user.
  • the control system 1001 is applied in a hospital, but the application of the control system 1001 is not limited to hospitals.
  • Control system 1001 may be applied to facilities such as government offices, shopping malls, hotels, warehouses, etc., as described above.
  • the robot 200 is, for example, a robot that guides a visiting person through the hospital or a robot that guides a patient to walk.
  • the robot 200 that guides the user is the target robot.
  • the control device 101 can also communicate with an information processing terminal.
  • the information processing terminal may be, for example, a mobile terminal such as a smart phone and a tablet terminal, or may be a personal computer.
  • the information processing terminal may be provided, for example, in a room of a doctor or a nurse, or may be provided in a security guard room where a security guard is stationed. Also, the information processing terminal may be possessed by an employee of the facility.
  • The robot 200 captures an image of the user it is guiding. For example, the robot 200 may photograph the user when it starts guiding the user. At this time, the robot 200 may transmit the image to the control device 101 as a captured image of the user. Also, the robot 200 may be capable of communicating with a wearable terminal worn by the user. The wearable terminal measures, for example, the user's biological information. Examples of biological information include the user's body temperature, blood pressure, heart rate, and blood oxygen concentration. The robot 200 may transmit the user's biological information measured by the wearable terminal to the control device 101 as collected data.
  • the control device 101 includes an acquisition section 111 , a control section 125 , an authentication section 130 and a notification section 140 .
  • the authentication unit 130 authenticates users. Specifically, the authentication unit 130 identifies the user by performing face authentication based on the captured image including the user. For example, the authentication unit 130 extracts a feature amount related to the user's face from the captured image. Then, the authentication unit 130 collates the extracted feature amount with the feature amount included in the feature amount database.
  • the feature amount database includes information in which feature amount information relating to the faces of a plurality of persons and information for identifying the persons are associated with each other.
  • the feature amount database may be stored in a storage device (not shown) that the control device 101 has, or may be stored in a storage device (not shown) that can communicate with the control device 101 .
  • a known method may be used to perform face authentication.
  • the authentication unit 130 may perform authentication using not only the feature amount related to the face, but also the feature amount related to other living bodies such as iris, fingerprint, and palm print. In this way, the authentication unit 130 performs biometric authentication based on a captured image including the user, which is captured by the target robot, and identifies the user.
  • Authentication unit 130 is an example of authentication means.
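  • A minimal sketch of the matching step (comparing an extracted feature amount against the feature amount database) is nearest-neighbor search with a similarity threshold, as below. The disclosure only says a known face-authentication method may be used; the cosine-similarity matcher and the threshold are assumptions.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def identify_user(query_features, feature_db, threshold=0.8):
    """Match an extracted feature vector against the feature amount database.

    feature_db: iterable of (person_id, feature_vector) pairs.
    Returns the best-matching person_id, or None when no entry clears the
    threshold (i.e., the user cannot be identified).
    """
    best_id, best_score = None, threshold
    for person_id, features in feature_db:
        score = cosine_similarity(query_features, features)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```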
  • the acquisition unit 111 acquires collected data from each of the robots 200 in the same manner as the acquisition unit 110 .
  • the collected data includes the user's biological information measured by the wearable terminal.
  • the acquisition unit 111 acquires a captured image including the user from the target robot.
  • the acquisition unit 111 may acquire personal data including information about the specified user.
  • Personal data is data that contains information about a specific person.
  • An example of personal data is patient chart data.
  • personal data may include information such as a person's name, sex, age, address, disease name, symptoms, allergies, various test results, location of disability, and grade of disability, for example.
  • the personal data may be stored in a storage device (not shown) that the control device 101 has, or may be stored in a storage device (not shown) that can communicate with the control device 101 .
  • the control unit 125 includes an event detection unit 123 and a robot control unit 124.
  • the event detection unit 123 and the robot control unit 124 may perform the same operations as the event detection unit 121 and the robot control unit 122, respectively.
  • the event detection unit 123 may detect that an abnormality has occurred in the user.
  • For example, the event detection unit 123 may detect an abnormality of the user from the user's biological information. For example, if the user's body temperature, blood pressure, heart rate, blood oxygen concentration, or the like is out of a predetermined range, the event detection unit 123 detects an abnormality in that biological information. The event detection unit 123 detects such an abnormality in the biological information as, for example, an abnormality of the user.
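  • The out-of-range test on biological information could be as simple as the following sketch; the normal ranges are placeholder values, not values taken from the disclosure.

```python
# Placeholder normal ranges; actual values are not given in the disclosure.
NORMAL_RANGES = {
    "body_temp_c": (35.5, 37.5),
    "systolic_bp_mmhg": (90, 140),
    "heart_rate_bpm": (50, 110),
    "spo2_percent": (94, 100),
}

def detect_vital_anomalies(vitals):
    """vitals: dict of measurement name -> value from the wearable terminal.
    Returns the names of measurements outside their predetermined range."""
    anomalies = []
    for name, value in vitals.items():
        low, high = NORMAL_RANGES.get(name, (float("-inf"), float("inf")))
        if not low <= value <= high:
            anomalies.append(name)
    return anomalies
```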
  • The robot control unit 124 may control the target robot based on the personal data acquired by the acquisition unit 111. For example, if the age information contained in the personal data indicates that the user is an elderly person, the robot control unit 124 may control the target robot to slow down its movement, or may perform control to enlarge the characters displayed on the display mounted on the robot. Also, for example, assume that the personal data includes information indicating that the user normally uses crutches or a wheelchair. In this case, the robot control unit 124 may control the target robot to slow down its movement speed or to follow a route with few steps.
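  • Mapping personal data to control parameters, as described above, might look like the following sketch; the field names and parameter values are assumptions.

```python
def control_params_from_personal_data(personal):
    """Derive guidance parameters from chart-style personal data.

    personal: dict with assumed keys such as "age" and "mobility_aid";
    the parameter values below are illustrative, not from the disclosure.
    """
    params = {"speed_mps": 1.0, "font_scale": 1.0, "avoid_steps": False}
    if personal.get("age", 0) >= 70:
        params["speed_mps"] = 0.6           # slow down for elderly users
        params["font_scale"] = 1.5          # enlarge displayed characters
    if personal.get("mobility_aid") in ("crutches", "wheelchair"):
        params["speed_mps"] = min(params["speed_mps"], 0.5)
        params["avoid_steps"] = True        # prefer routes with few steps
    return params
```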
  • Further, the robot control unit 124 may control the target robot based on the user's biological information. For example, when the user's heart rate rises by a predetermined value or more, or when the user's body temperature rises by a predetermined value or more, the robot control unit 124 may control the target robot to slow down its movement, or may perform control to move along a route that uses elevators and escalators.
  • the notification unit 140 notifies various information. For example, when the user is identified by the authentication unit 130, the notification unit 140 notifies the information processing terminal of the user's location information. At this time, the notification unit 140 may notify the position information of the target robot as the position information of the user. In this way, the notification unit 140 notifies the predetermined terminal of the positional information of the target robot as the user's positional information.
  • the notification unit 140 is an example of notification means.
  • the notification unit 140 may notify based on the user's biometric information. For example, when the event detection unit 123 detects an abnormality in the user's biological information, the notification unit 140 notifies the information processing terminal that the user has an abnormality. This allows, for example, medical personnel in a hospital to be notified of the user's condition.
  • FIG. 13 is a flowchart showing an example of the operation of the control device 101.
  • the acquisition unit 111 acquires a captured image including the user (S201). Then, the authentication unit 130 performs biometric authentication based on the captured image including the user. When the user is specified by the biometric authentication (“Yes” in S202), the acquiring unit 111 acquires personal data including information about the specified user (S203). The robot control unit 124 controls the target robot based on the personal data (S204). If the user is not specified by the authentication unit 130 (“No” in S202), the control device 101 does not perform the processes of S203 and S204.
  • the acquisition unit 111 acquires collected data (S205).
  • If the event detection unit 123 detects the occurrence of an event ("Yes" in S206), the robot control unit 124 controls the target robot according to the event (S207).
  • For example, when there is an abnormality in the user's biological information included in the collected data, the robot control unit 124 controls the target robot as described above. If the event detection unit 123 does not detect the occurrence of an event ("No" in S206), the control device 101 does not perform the process of S207.
  • If the user has been specified ("Yes" in S208), the notification unit 140 performs notification regarding the user (S209). For example, the notification unit 140 notifies the information processing terminal of the position information of the target robot guiding the user as the position information of the user. Moreover, when there is an abnormality in the user's biological information, the notification unit 140 notifies the information processing terminal that an abnormality has occurred in the user. If the user was not specified in the process of S202 ("No" in S208), the control device 101 does not perform the process of S209.
  • the processing from S205 to S209 may be repeated until the target robot to guide the user finishes guiding. Also, in S206 and S207, the processing of S102 to S104 described in the second embodiment may be performed.
  • When the target robot guides a user, the control device 101 of the third embodiment may perform biometric authentication based on a captured image including the user, which is an image captured by the target robot, to specify the user, and may notify a predetermined terminal of the position information of the target robot as the position information of the user. Thereby, the control device 101 can inform a specific person of the position information of the user. For example, in a hospital where a target robot is guiding a patient, the position information of the patient can be notified to the medical staff in the hospital.
  • Further, the control device 101 of the third embodiment may acquire personal data including information about the specified user, and may control the target robot based on the personal data. As a result, the control device 101 can control the target robot according to the user.
  • The collected data may include information obtained by the target robot from a wearable terminal worn by the user, including the user's biological information measured by the wearable terminal.
  • the control device 101 of the third embodiment may control the target robot based on the biological information. Thereby, the control device 101 can control the target robot using the user's biological information. For example, if the user's body temperature, blood pressure, heart rate, blood oxygen concentration, etc. are abnormal, the target robot can be decelerated or stopped.
  • FIG. 14 is a block diagram showing an example of a hardware configuration of a computer that implements the control device in each embodiment.
  • The computer device 10 implements the control device and the control method described in each embodiment and each modification.
  • the computer device 10 includes a processor 11, a RAM (Random Access Memory) 12, a ROM (Read Only Memory) 13, a storage device 14, an input/output interface 15, a bus 16, and a drive device 17.
  • the control device may be realized by a plurality of electric circuits.
  • the storage device 14 stores a program (computer program) 18.
  • the processor 11 uses the RAM 12 to execute the program 18 of the control device.
  • The program 18 includes a program that causes a computer to execute the processes shown in the flowcharts of FIGS. 3, 8, and 13. As the processor 11 executes the program 18, the functions of each component of the control device are realized.
  • Program 18 may be stored in ROM 13 .
  • the program 18 may be recorded on the storage medium 20 and read using the drive device 17, or may be transmitted from an external device (not shown) to the computer device 10 via a network (not shown).
  • the input/output interface 15 exchanges data with peripheral devices (keyboard, mouse, display device, etc.) 19 .
  • the input/output interface 15 functions as means for acquiring or outputting data.
  • The bus 16 connects each component.
  • the controller can be implemented as a dedicated device.
  • the control device can be realized based on a combination of multiple devices.
  • A processing method in which a program for realizing the functions of each component of each embodiment is recorded in a storage medium, and in which the program recorded in the storage medium is read as code and executed by a computer, is also included in the scope of each embodiment. That is, a computer-readable storage medium is also included in the scope of each embodiment. Further, each embodiment includes the storage medium on which the above-described program is recorded, as well as the program itself.
  • the storage medium is, for example, a floppy (registered trademark) disk, hard disk, optical disk, magneto-optical disk, CD (Compact Disc)-ROM, magnetic tape, non-volatile memory card, or ROM, but is not limited to this example.
  • The programs recorded on the storage medium are not limited to programs that execute processing by themselves; programs that run on an OS (Operating System) and execute processing in cooperation with other software or the functions of an expansion board are also included in the scope of each embodiment.
  • Appendix 2 The control device according to Appendix 1, wherein the collected data includes captured images generated by imaging devices mounted on the plurality of robots, and the control means includes event detection means for detecting an event based on the collected data and robot control means for controlling the target robot based on the detected event.
  • The control device according to Appendix 2, wherein the robot control means controls the target robot to move toward the location where the detected event occurred.
  • The control device according to Appendix 6, wherein the acquisition means acquires personal data including information about the specified user, and the control means controls the target robot based on the personal data.
  • Appendix 8 The control device according to Appendix 6 or 7, wherein the collected data includes information obtained by the target robot from a wearable terminal worn by the user, including biometric information of the user measured by the wearable terminal, and the control means controls the target robot based on the biometric information.
  • Appendix 9 The control device according to any one of Appendices 1 to 8, wherein the acquisition means acquires another photographed image from a photographing device different from those of the plurality of robots and acquires other sensor data from a sensor different from those of the plurality of robots, and the control means controls the target robot based on the collected data, the other captured image, and the other sensor data.
  • Appendix 10 The control device according to any one of Appendices 1 to 9, wherein the acquisition means acquires the collected data from the plurality of robots through a local 5G (5th Generation) network, and the control means controls the target robot through the local 5G network.
  • Appendix 11 A control method comprising: acquiring collected data including position information of each of a plurality of robots that travel automatically and data collected by each of the plurality of robots; and controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
  • Appendix 12 The control method according to Appendix 11, wherein the collected data includes captured images generated by imaging devices mounted on the plurality of robots, and the controlling based on the collected data includes detecting an event based on the collected data and controlling the target robot based on the detected event.
  • Appendix 14 The control method according to Appendix 12 or 13, wherein the detecting of the event includes predicting the appearance of an obstacle, and the controlling of the target robot includes controlling the target robot to avoid the obstacle.
  • Appendix 16 The control method according to any one of Appendices 11 to 15, further comprising, when the target robot guides a user: performing biometric authentication based on a photographed image including the user, which is an image photographed by the target robot, to identify the user; and notifying a predetermined terminal of position information of the target robot as position information of the user.
  • Appendix 17 The control method according to Appendix 16, wherein the acquiring includes acquiring personal data including information about the identified user, and the controlling based on the collected data includes controlling the target robot based on the personal data.
  • Appendix 18 The control method according to Appendix 16 or 17, wherein the collected data includes information obtained by the target robot from a wearable terminal worn by the user, including biological information of the user measured by the wearable terminal, and the controlling based on the collected data includes controlling the target robot based on the biological information.
  • Appendix 19 The control method according to any one of Appendices 11 to 18, wherein the acquiring includes acquiring another photographed image from an imaging device different from those of the plurality of robots and acquiring other sensor data from a sensor different from those of the plurality of robots, and the controlling based on the collected data includes controlling the target robot based on the collected data, the other captured image, and the other sensor data.
  • Appendix 20 The control method according to any one of Appendices 11 to 19, wherein the acquiring includes acquiring the collected data from the plurality of robots through a local 5G (5th Generation) network, and the controlling based on the collected data includes controlling the target robot through the local 5G network.
  • Appendix 21 A storage medium storing a program for causing a computer to execute: a process of acquiring collected data including position information of each of a plurality of robots that travel automatically and data collected by each of the plurality of robots; and a process of controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
  • Appendix 22 The storage medium according to Appendix 21, wherein the collected data includes captured images generated by imaging devices mounted on the plurality of robots, and the process of controlling includes a process of detecting an event based on the collected data and a process of controlling the target robot based on the detected event.
  • Appendix 23 The storage medium according to Appendix 22, wherein, when the target robot is moving toward a predetermined location, the process of controlling the target robot controls the target robot so as to move along a route that does not pass through the location where the detected event occurred.
  • Appendix 24 The storage medium according to Appendix 22 or 23, wherein the process of detecting the event includes predicting the appearance of an obstacle, and the process of controlling the target robot includes controlling the target robot to avoid the obstacle.
  • Appendix 26 The storage medium according to any one of Appendices 21 to 25, wherein, when the target robot guides a user, the program further causes the computer to execute a process of performing biometric authentication based on a photographed image including the user, which is an image photographed by the target robot, to identify the user, and a process of notifying a predetermined terminal of the position information of the target robot as the position information of the user.
  • Appendix 27 The storage medium according to Appendix 26, wherein the process of acquiring acquires personal data including information about the identified user, and the process of controlling based on the collected data controls the target robot based on the personal data.
  • Appendix 28 The storage medium according to Appendix 26 or 27, wherein the collected data includes information obtained by the target robot from a wearable terminal worn by the user, including biological information of the user measured by the wearable terminal, and the process of controlling based on the collected data controls the target robot based on the biological information.
  • Appendix 29 The storage medium according to any one of Appendices 21 to 28, wherein the process of acquiring acquires another captured image from an imaging device different from those of the plurality of robots and acquires other sensor data from a sensor different from those of the plurality of robots, and the process of controlling based on the collected data controls the target robot based on the collected data, the other captured image, and the other sensor data.
  • Appendix 30 The storage medium according to any one of Appendices 21 to 29, wherein the process of acquiring acquires the collected data from the plurality of robots through a local 5G (5th Generation) network, and the process of controlling based on the collected data controls the target robot through the local 5G network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

One of the purposes of the present invention is to provide a control device, etc. that can control an autonomous driving robot more appropriately depending on the situation. A control device according to an aspect of this disclosure includes: an acquiring means that acquires collection data, which includes position information of each of a plurality of autonomous driving robots and data collected by each of the plurality of robots; and a control means that controls a target robot, which is at least one of the plurality of robots, on the basis of the collection data.

Description

CONTROL DEVICE, CONTROL METHOD, AND COMPUTER-READABLE STORAGE MEDIUM
 This disclosure relates to technology for controlling robots.
 Robots that perform tasks and communicate with people may be installed in facilities such as government offices, shopping malls, hotels, warehouses, and hospitals. Such robots may be built into a wireless communication system and run automatically by control signals transmitted from a remote location.
 Patent Document 1, relating to a technique for controlling a robot from a remote location, discloses a technique in which the current position of a robot is estimated based on an image captured by the robot and movement control information instructing the robot to move, and an image of the robot's surroundings corresponding to the estimated current position is generated.
 In relation to wireless communication systems, Patent Document 2 discloses a technology related to a communication method in which data is exchanged directly between terminals without going through a base station.
Patent Document 1: WO 2021/002116. Patent Document 2: JP 2020-167681 A.
 ロボットは所定の場所に向かって走行するよう制御されることがある。一方で、上記のような施設には、複数の人物が往来する。人物の往来に起因して、施設内の通路の状況は変化する。例えば、通路に人だかりができていたり、通路の角から人物が現れたりすることがある。このような場合、ロボットが走行できなかったり、人物と衝突したりする可能性もある。そのため、状況に応じた制御をロボットに行う必要がある。  The robot may be controlled to travel toward a predetermined location. On the other hand, a plurality of persons come and go to the facility as described above. Due to the traffic of people, the situation of passages in the facility changes. For example, there may be a crowd of people in the aisle, or a person may appear from the corner of the aisle. In such cases, the robot may not be able to run or may collide with a person. Therefore, it is necessary to control the robot according to the situation.
 特許文献1には、対象のロボット周辺の障害物を回避することは記載されている。しかしながら、このような制御は、対象のロボットのみから得られる、限られた範囲の情報に基づいて行われる。そのため、上記のような、状況に応じた制御を行う点については、改善の余地がある。 Patent Document 1 describes avoiding obstacles around the target robot. However, such control is based on a limited range of information available only from the target robot. Therefore, there is room for improvement in terms of performing control according to the situation as described above.
 特許文献2には、状況に応じた制御をロボットに行うことは記載されていない。 Patent document 2 does not describe controlling the robot according to the situation.
 本開示は、上記課題を鑑みてなされたものであり、自動走行するロボットに対して、より適切に、状況に応じた制御を行うことが可能な制御装置等を提供することを目的の一つとする。 The present disclosure has been made in view of the above problems, and one of the objects thereof is to provide a control device or the like capable of more appropriately controlling a robot that runs automatically according to the situation. do.
 本開示の一態様にかかる制御装置は、自動走行する複数のロボットのそれぞれの位置情報と、前記複数のロボットのそれぞれによって収集されたデータと、を含む収集データを取得する取得手段と、前記複数のロボットのうちの少なくとも一である対象ロボットを、前記収集データに基づいて制御する制御手段と、を備える。 A control device according to an aspect of the present disclosure includes acquisition means for acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots; and control means for controlling a target robot, which is at least one of the robots, based on the collected data.
 本開示の一態様にかかる制御方法は、自動走行する複数のロボットのそれぞれの位置情報と、前記複数のロボットのそれぞれによって収集されたデータと、を含む収集データを取得し、前記複数のロボットのうちの少なくとも一である対象ロボットを、前記収集データに基づいて制御する。 A control method according to an aspect of the present disclosure acquires collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots, and obtains collected data of the plurality of robots. At least one of the target robots is controlled based on the collected data.
 本開示の一態様にかかる記憶媒体は、自動走行する複数のロボットのそれぞれの位置情報と、前記複数のロボットのそれぞれによって収集されたデータと、を含む収集データを取得する処理と、前記複数のロボットのうちの少なくとも一である対象ロボットを、前記収集データに基づいて制御する処理と、をコンピュータに実行させるプログラムを格納する。 A storage medium according to an aspect of the present disclosure includes a process of acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots; A program for causing a computer to execute a process of controlling a target robot, which is at least one of the robots, based on the collected data is stored.
According to the present disclosure, an automatically traveling robot can be controlled more appropriately in accordance with the situation.
FIG. 1 is a diagram schematically showing an example of the configuration of a control system including a control device according to a first embodiment of the present disclosure.
FIG. 2 is a block diagram showing an example of the functional configuration of the control device according to the first embodiment of the present disclosure.
FIG. 3 is a flowchart showing an example of the operation of the control device according to the first embodiment of the present disclosure.
FIG. 4 is a block diagram showing an example of the configuration of a control system according to a second embodiment of the present disclosure.
FIG. 5 is a diagram showing an example of robots placed in a facility according to the second embodiment of the present disclosure.
FIG. 6 is a diagram showing another example of robots placed in the facility according to the second embodiment of the present disclosure.
FIG. 7 is an example of output of information about an event according to the second embodiment of the present disclosure.
FIG. 8 is a flowchart showing an example of the operation of the control device according to the second embodiment of the present disclosure.
FIG. 9 is a diagram showing an example of an imaging device and a sensor installed in a facility according to Modification 1 of the present disclosure.
FIG. 10 is a diagram showing an example of robots placed in a facility according to Modification 2 of the present disclosure.
FIG. 11 is a block diagram showing an example of the configuration of a control system according to a third embodiment of the present disclosure.
FIG. 12 is a diagram schematically showing an example of the configuration of the control system according to the third embodiment of the present disclosure.
FIG. 13 is a flowchart showing an example of the operation of the control device according to the third embodiment of the present disclosure.
FIG. 14 is a block diagram showing an example of the hardware configuration of a computer device that implements the control devices according to the first, second, and third embodiments of the present disclosure.
Embodiments of the present disclosure will be described below with reference to the drawings.
<First Embodiment>
An overview of the control device according to the present disclosure will be described.
FIG. 1 is a diagram schematically showing an example of the configuration of a control system 1000 that includes a control device 100. As shown in FIG. 1, the control system 1000 includes the control device 100, robots 200, and a base station 300. There are n robots 200 (n is a natural number equal to or greater than 2). In the present disclosure, the robots 200 may be individually denoted as robots 200-1, 200-2, ..., 200-n. When the robots 200-1, 200-2, ..., 200-n are not distinguished from one another, they are simply referred to as robots 200. The control device 100 is communicably connected to the robots 200 via the base station 300. The configuration is not limited to this example; the control device 100 and the robots 200 may also communicate via additional communication devices such as routers.
The control device 100 controls the robots 200. Each robot 200 operates under the control of the control device 100, but may also operate under its own autonomous control. The robot 200 may be, for example, a robot installed in a facility such as a government office, shopping mall, hotel, warehouse, or hospital. In this case, the robot 200 may be, for example, a guide robot that guides a user to a predetermined location. The robot 200 may be an unmanned robot that travels automatically.
FIG. 2 is a block diagram showing an example of the functional configuration of the control device 100 according to the first embodiment. As shown in FIG. 2, the control device 100 includes an acquisition unit 110 and a control unit 120.
The acquisition unit 110 acquires various kinds of data from the plurality of robots 200. For example, each robot 200 is equipped with various sensors, such as a sound sensor, a human-presence sensor, and a distance sensor, as well as an imaging device. Each robot 200 collects data obtained from these sensors and the imaging device. In the present disclosure, the data collected by the robots 200 may be referred to as collected data. Each robot 200 transmits, for example, the collected data to the control device 100. At this time, the collected data also includes, for example, position information of the robot 200. In this way, the acquisition unit 110 acquires collected data that includes position information of each of the plurality of automatically traveling robots 200 and data collected by each of the plurality of robots. The acquisition unit 110 is an example of acquisition means.
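As an illustration of the kind of record the acquisition unit 110 might handle, the following is a minimal sketch in Python. The names (CollectedData, robot_id, acquire_collected_data) are assumptions introduced here for illustration; the disclosure does not prescribe a concrete data schema or API.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CollectedData:
    """One record sent by a robot 200: its position plus the data it collected."""
    robot_id: str                               # e.g. "200-1"
    position: tuple                             # (x, y) position information of the robot
    image: Optional[bytes] = None               # captured image from the onboard camera, if any
    sensor_data: dict = field(default_factory=dict)  # e.g. {"sound_db": 62.0, "person_detected": True}

def acquire_collected_data(reports: list) -> dict:
    """Acquisition unit 110 (sketch): keep the latest record per robot, keyed by robot id."""
    latest = {}
    for record in reports:                      # reports are assumed to arrive in chronological order
        latest[record.robot_id] = record
    return latest
```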
The control unit 120 controls the robots 200 based on the collected data. For example, the control unit 120 detects the occurrence of an event from the data collected from the robots 200-1, 200-2, and 200-3. An event indicates, for example, a situation that differs from normal conditions. Examples of events include the appearance of an obstacle, the occurrence of an incident or accident, and the occurrence of a sudden illness, such as a person collapsing.
For example, when an event is detected, the control unit 120 controls a robot 200 in accordance with the event. For example, when an obstacle appears on the route along which the robot 200-1 is moving, the control unit 120 controls the robot 200-1 to stop, decelerate, or change direction so as to avoid the obstacle, or changes its route so that it does not pass through a location that is difficult to traverse. The control is not limited to this example; for instance, when an event is detected from data collected by the robot 200-2, the control unit 120 may control the robot 200-1 to head toward the location where the event occurred. That is, the control unit 120 may control the robots 200 in order to collect information dynamically. In the above example, the control unit 120 controls one of the plurality of robots 200 based on the collected data acquired from the plurality of robots 200, but it may control two or more robots 200. In the present disclosure, a robot 200 to be controlled may be referred to as a target robot.
In this way, the control unit 120 controls the target robot, which is at least one of the plurality of robots 200, based on the collected data. The control unit 120 is an example of control means.
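As a minimal sketch of how the control unit 120 might map a detected event to a command for a target robot, consider the following. The Event type and the command strings are assumptions for illustration; the disclosure leaves the concrete control interface open.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str                  # e.g. "crowd", "incident", "sudden_illness"
    location: tuple            # (x, y) where the event was detected

def control_target_robot(event: Event, target_route: list) -> str:
    """Control unit 120 (sketch): choose a command based on the detected event.

    target_route is the list of (x, y) waypoints the target robot plans to pass.
    """
    if event.location in target_route:
        # The event lies on the robot's planned route: avoid it.
        return "reroute_around:%s,%s" % event.location
    # Otherwise the robot can be sent to gather information about the event.
    return "move_to:%s,%s" % event.location

# Example: a crowd at (3, 4) lies on the planned route, so the robot is rerouted.
cmd = control_target_robot(Event("crowd", (3, 4)), [(1, 4), (2, 4), (3, 4)])
```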
Next, an example of the operation of the control device 100 will be described with reference to FIG. 3. In the present disclosure, each step of a flowchart or sequence diagram is expressed using the number attached to the step, such as "S1".
FIG. 3 is a flowchart illustrating an example of the operation of the control device 100. The acquisition unit 110 acquires collected data that includes position information of each of the plurality of automatically traveling robots 200 and data collected by each of the plurality of robots 200 (S1). The control unit 120 then controls the target robot, which is at least one of the plurality of robots 200, based on the collected data (S2).
In this way, the control device 100 controls the target robot using data obtained from a plurality of robots. Therefore, compared with, for example, a case in which the situation is judged only from data obtained from the target robot itself, the control device 100 can perform control adapted to a wider variety of situations. That is, the control device 100 can control an automatically traveling robot more appropriately in accordance with the situation.
<Second Embodiment>
Next, a control device according to the second embodiment will be described. In the second embodiment, the control device 100 described in the first embodiment is described in more detail.
FIG. 4 is a block diagram showing an example of the configuration of the control system 1000 according to the second embodiment. As in the first embodiment, the control system 1000 includes the control device 100, the robots 200, and the base station 300. In this embodiment, it is assumed that there are three robots 200, but, as described above, the number of robots is not limited to this example. The three robots are referred to as robot 200-1, robot 200-2, and robot 200-3, respectively.
In the example of the control system 1000 shown in FIG. 4, the control device 100 and the robots 200 communicate via the base station 300, but the network constructed in the control system 1000 is not limited to any particular scheme. For example, the network constructed in the control system 1000 may be a wireless LAN (Local Area Network), a public line network, a mobile data communication network, or a combination of these networks. Mobile data communication networks include the third-generation mobile communication system (3G: 3rd Generation), LTE (Long Term Evolution), the fourth-generation mobile communication system (4G: 4th Generation), and the fifth-generation mobile communication system (5G: 5th Generation). Here, a 5G network that is a dedicated network built over a specific area is called local 5G. The control system 1000 of the present disclosure may be built on local 5G. In this case, the base station 300 is a base station that supports local 5G. The base station 300 may be built virtually in software on a server. By using a 5G network, the control device 100 can acquire a large amount of data from the plurality of robots 200 with lower latency than with other mobile data communication networks. That is, the control device 100 can control the plurality of robots 200 more smoothly than with other mobile data communication networks. Using a dedicated network such as local 5G can also ensure security. When the control system 1000 is built on local 5G, the acquisition unit 110 acquires the collected data from the plurality of robots through the local 5G network, and the control unit 120 controls the target robot through the local 5G network.
[Details of the Robots 200]
Each robot 200 is equipped with an imaging device, sensors, an input device, an output device, and the like. The devices mounted on the robot 200 are merely examples, and other devices may be mounted. There may also be two or more of each of the imaging device, sensors, input device, output device, and so on. In this embodiment, an example in which the robots 200-1, 200-2, and 200-3 have the same configuration is described, but each robot 200 may have a different configuration.
Assume that the robots 200 are installed in a facility such as a government office, shopping mall, hotel, warehouse, or hospital. In this case, when a user makes a predetermined input to a robot 200, the robot 200 may make a response corresponding to the input. For example, if the robot 200 is a guide robot, the user uses the input device to enter information indicating a desired location. In response, the robot 200 may output information indicating the location using the output device, or may move toward the location. Such control of responses to input may be performed by a device other than the robot 200, such as the control device 100, or may be performed autonomously by the robot 200. The input device may be a touch panel, a keyboard, a microphone, or the like. That is, the user may enter information indicating a location by operating the touch panel or keyboard, or by speaking. The output device may be, for example, a display, a lamp, a speaker, or the like.
A robot 200 generates captured images with its onboard imaging device. That is, the imaging device allows the robot 200 to photograph its surroundings. Hereinafter, photographing by the imaging device mounted on the robot 200 may also be expressed as the robot 200 photographing. The robot 200 can also collect various kinds of sensor data with its onboard sensors. For example, if a sensor is a ranging sensor, the robot 200 can collect information on the distance to surrounding objects. If a sensor is a human-presence sensor, the robot 200 can detect a person within a predetermined range around the robot 200. If a sensor is a sound sensor, the robot 200 can detect sounds generated within a predetermined range around the robot 200.
The robot 200 also has a function of acquiring position information. For example, the robot 200 may include a system that calculates position information from radio waves from the base station 300 or other wireless devices. Alternatively, the robot 200 may include a GNSS (Global Navigation Satellite System) receiver. The robot 200 transmits data, including captured images and sensor data, to the control device 100 in association with its position information. In addition, when the robot 200 moves autonomously, the robot 200 may transmit route information indicating its travel route to the control device 100. For example, when the robot 200 moves to a predetermined point, the robot 200 may transmit information indicating the locations it will pass through on the way to that point as route information.
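As a sketch of the kind of message a robot might transmit, here is an example that bundles sensor readings, position information, and route information into JSON. The field names and the use of JSON are assumptions; the disclosure does not define a wire format.

```python
import json
import time

def build_report(robot_id, position, route, sensor_data):
    """Associate collected data with position (and, when moving autonomously, a route)."""
    return json.dumps({
        "robot_id": robot_id,
        "timestamp": time.time(),
        "position": position,        # estimated from base-station radio waves or GNSS
        "route": route,              # waypoints the robot will pass through, if any
        "sensor_data": sensor_data,
    })

# Example: robot 200-1 heading to a destination via two waypoints.
report = build_report("200-1", [2.0, 5.0], [[3.0, 5.0], [4.0, 5.0]],
                      {"distance_ahead_m": 1.8, "person_detected": False})
```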
[Details of the Control Device 100]
As shown in FIG. 4, the control device 100 includes the acquisition unit 110 and the control unit 120. The acquisition unit 110 acquires the collected data from the robots 200. The collected data includes the captured images, sensor data, and position information collected by the robots 200.
The control unit 120 includes an event detection unit 121 and a robot control unit 122. The event detection unit 121 detects events based on the collected data. For example, the event detection unit 121 detects the appearance of an obstacle as an event from the captured images in the collected data. One example of an obstacle is a crowd. For example, when multiple people appear in a captured image and those people are barely moving, the event detection unit 121 detects the formation of a crowd. Situations in which it is difficult for a robot 200 to pass, such as congestion or objects scattered in a passage, are also examples of obstacles. The event detection unit 121 may also detect the occurrence of an incident or accident as an event. In this case, the event detection unit 121 detects the occurrence of the incident or accident by, for example, detecting a person carrying a knife, firearm, or the like in a captured image, or detecting a scream from the sensor data. The event detection unit 121 may also detect the occurrence of a sudden illness by detecting a crouching person in a captured image. In this way, the event detection unit 121 detects events based on the collected data. The event detection unit 121 is an example of event detection means.
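A minimal sketch of the crowd heuristic described above: many people in view, little movement between frames. The thresholds (min_people, max_motion) and the assumption that detections are matched by index across frames are choices made here for illustration only.

```python
def detect_crowd(prev_positions, curr_positions, min_people=5, max_motion=0.3):
    """Event detection unit 121 (sketch): flag a crowd when many detected people
    barely move between two consecutive frames.

    prev_positions / curr_positions: lists of (x, y) centroids of detected people,
    matched by index across the two frames.
    """
    if len(curr_positions) < min_people or len(prev_positions) < min_people:
        return False
    motions = [
        ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5
        for (px, py), (cx, cy) in zip(prev_positions, curr_positions)
    ]
    avg_motion = sum(motions) / len(motions)
    return avg_motion < max_motion    # many people, hardly moving -> crowd
```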
The event detection unit 121 also calculates the location where an event occurred. For example, the event detection unit 121 can calculate the event location from the position information of the robots 200 and map information of the facility. The facility map information may be stored in advance in a storage device (not shown) mounted on the control device 100 or in a storage device (not shown) that can communicate with the control device 100.
In addition, the event detection unit 121 may predict the occurrence of an event. For example, the event detection unit 121 may predict the appearance of an obstacle from the collected data. As one example, the event detection unit 121 may predict the location where a person will appear based on the person's direction of movement, calculated from the collected data, and the facility map information.
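A sketch of this prediction step, under the assumption that the facility map is represented as a grid of passable cells: the person's motion is extrapolated and the cells they are predicted to cross are returned.

```python
def predict_appearance_cells(person_pos, person_vel, passable, steps=10):
    """Event prediction (sketch): extrapolate a person's motion over a grid map
    and return the passable cells they are predicted to cross.

    passable: set of (x, y) grid cells a person can occupy, taken from the
    facility map information.
    """
    x, y = person_pos
    cells = []
    for _ in range(steps):
        x += person_vel[0]
        y += person_vel[1]
        cell = (int(round(x)), int(round(y)))
        if cell not in passable:
            break                     # the person cannot continue this way
        cells.append(cell)
    return cells

# If any predicted cell lies on the target robot's route, the robot can be
# decelerated or stopped before reaching it (as at point D in FIG. 6).
```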
The robot control unit 122 controls the target robot based on the event detected by the event detection unit 121. The robot control unit 122 is an example of robot control means. An example of the control performed by the robot control unit 122 will now be described with reference to FIGS. 5 and 6.
FIG. 5 is a diagram showing an example of robots 200 placed in a facility. In the example of FIG. 5, robots 200-1, 200-2, and 200-3 are placed in the facility. Hatched areas indicate impassable locations such as pillars and walls. People have gathered near point A. Assume here that the robot 200-2 is photographing an area that includes point A, and that the event detection unit 121 detects, based on the collected data including the images captured by the robot 200-2, that a crowd has formed at point A. In this case, the robot control unit 122 controls, for example, the robot 200-3 to move to the vicinity of point A. This allows the control device 100 to also acquire, from the robot 200-3, collected data about the event that occurred near point A. In this way, the robot control unit 122 may control the target robot in order to gather information about an event that has occurred.
Although FIG. 5 illustrates an example in which the robot 200-3 is selected as the target robot, the target robot may be selected in various ways. For example, suppose the event detection unit 121 detects that an event has occurred at point A. In this case, the robot control unit 122 may select the robot 200 closest to point A as the target robot. The robot control unit 122 may also select multiple robots 200 in order of proximity to point A, or select the robots 200 within a predetermined range of point A. The robot control unit 122 may also select, from among the plurality of robots 200, a robot 200 that is not currently serving a user's instruction. Here, a robot 200 under a user's instruction is one that is accepting an input or making a response corresponding to an input. For example, the robot control unit 122 may select a robot 200 other than those that are guiding a user to a destination or transporting goods in response to a user's instruction.
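The selection policies above (closest first, within a range, not serving a user) can be sketched as follows. The dictionary fields ("id", "pos", "busy") are illustrative assumptions.

```python
import math

def select_target_robots(robots, event_pos, max_range=None, count=1):
    """Robot control unit 122 (sketch): pick target robots for an event.

    robots: list of dicts like {"id": "200-3", "pos": (x, y), "busy": False},
    where "busy" means the robot is currently serving a user's instruction.
    """
    def dist(robot):
        return math.dist(robot["pos"], event_pos)

    candidates = [r for r in robots if not r["busy"]]
    if max_range is not None:
        candidates = [r for r in candidates if dist(r) <= max_range]
    return sorted(candidates, key=dist)[:count]    # closest robots first
```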
FIG. 6 is a diagram showing another example of robots 200 placed in a facility. In FIG. 6, as in the example of FIG. 5, robots 200-1, 200-2, and 200-3 are placed in the facility, and hatched areas indicate impassable locations such as pillars and walls. Here, the robot 200-1 is moving toward point B in order to guide a user there. Assume that the robot 200-2 is photographing an area that includes point C, and that the event detection unit 121 detects, based on the collected data including the images captured by the robot 200-2, that a crowd has formed at point C. In this case, the robot control unit 122 controls the robot 200-1 so that it moves along a route that does not pass through point C; for example, the robot control unit 122 controls the robot 200-1 to move in the X direction in FIG. 6.
Also in the example of FIG. 6, assume that the event detection unit 121 detects, based on the collected data acquired from the robot 200-3, that a person has moved in the direction of point D; that is, the event detection unit 121 predicts that a person will appear at point D. In this case, the robot control unit 122 controls the robot 200-1 to avoid the person. For example, the robot control unit 122 decelerates or stops the robot 200-1 before it passes point D. In this way, the robot control unit 122 may control a moving target robot so as to avoid an event that has occurred or an event that is predicted to occur.
Although FIG. 6 illustrates an example in which the robot 200-1 is selected as the target robot, the target robot may be selected in various ways. For example, when the event detection unit 121 detects the occurrence of an event at point C, the robot control unit 122 may select, from among the plurality of robots 200 and based on the route information, a robot 200 that is scheduled to pass near point C. In this case, if no robot is scheduled to pass near point C, no event-based control of the robots 200 need be performed.
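A minimal sketch of rerouting around an event location (such as point C) on a grid map, using breadth-first search. The grid model is an assumption; the disclosure does not specify a path-planning algorithm.

```python
from collections import deque

def reroute(start, goal, passable, blocked):
    """Find a shortest grid path from start to goal that avoids blocked cells
    (e.g. the cell where a crowd was detected).

    passable: set of (x, y) cells the robot may enter; blocked: set of cells to avoid.
    Returns the list of cells from start to goal, or None if no route exists.
    """
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # walk the parent links back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in passable and nxt not in blocked and nxt not in came_from:
                came_from[nxt] = cell
                queue.append(nxt)
    return None                           # no route avoiding the event location
```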
In addition, the robot control unit 122 may control a robot 200 to output information about a detected event. For example, in the example of FIG. 6, when a person is predicted to appear at point D, the output device of the robot 200-1 may announce that a person is about to appear. FIG. 7 shows an example of outputting information about an event. As shown in FIG. 7, the robot control unit 122 may control a display, which is an example of the output device, to show a message such as "A person is about to step out. Please be careful." The content and method of the output are not limited to this example. For example, when the occurrence of an incident or accident is detected, the robot control unit 122 may display information indicating that an incident or accident has occurred on the display of the robot 200, output it as audio through a speaker, or indicate it with a lamp.
[Operation Example of the Control Device 100]
Next, an example of the operation of the control device 100 will be described with reference to FIG. 8. FIG. 8 is a flowchart showing an example of the operation of the control device 100.
The acquisition unit 110 acquires the collected data from each of the robots 200 (S101). If the event detection unit 121 does not detect the occurrence of an event from the collected data ("No" in S102), the flow ends. In this case, the control device 100 repeats the flow, for example.
If the event detection unit 121 detects the occurrence of an event from the collected data ("Yes" in S102), the robot control unit 122 selects a target robot (S103). For example, when gathering information about the event that has occurred, the robot control unit 122 may select a robot 200 within a predetermined range of the event location as the target robot. When controlling a robot 200 to avoid the event, the robot control unit 122 may select a robot 200 whose route passes through the event location as the target robot. The robot control unit 122 then controls the selected robot 200 (that is, the target robot) in accordance with the event (S104). For example, the robot control unit 122 may control the target robot to move to the vicinity of the event location, or may control the target robot to avoid the event.
The robot control unit 122 may select multiple target robots. In that case, the robot control unit 122 may control some of the target robots to move to the vicinity of the event location and control the others to avoid the event.
As described above, the control device 100 of the second embodiment acquires collected data that includes position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots, and controls a target robot, which is at least one of the plurality of robots, based on the collected data. As a result, the control device 100 can perform control adapted to a wider variety of situations than, for example, when the situation is judged only from data obtained from the target robot itself. That is, the control device 100 can control an automatically traveling robot more appropriately in accordance with the situation.
The collected data also includes captured images generated by the imaging devices mounted on the plurality of robots, and the control device 100 of the second embodiment may detect an event based on the collected data and control the target robot based on the detected event. As a result, even when the occurrence of an event is detected from data from a robot other than the target robot, the control device 100 can control the target robot in accordance with the event. That is, the control device 100 can more appropriately perform control in accordance with the situation.
Further, when the target robot is moving toward a predetermined location, the control device 100 of the second embodiment may control the target robot to move along a route that does not pass through the location of the detected event. For example, suppose there is a location that is difficult for a robot to pass through due to an event. The control device 100 can cause the target robot to avoid that location. Moreover, because the control device 100 detects the occurrence of events based on collected data obtained from multiple robots, it can make the target robot avoid such a location even before the target robot approaches it.
The control device 100 of the second embodiment may also control the target robot to head toward the location of a detected event. This increases the amount of collected data gathered near the event location, allowing the control device 100 to obtain more detailed information about the event.
[Modification 1]
The above embodiments describe examples in which the target robot is controlled using collected data acquired from the plurality of robots 200, but the target robot may also be controlled using other data as well.
For example, if a facility has stationary imaging devices and sensors, the control device 100 may acquire the data obtained from them. A stationary imaging device may be, for example, a surveillance camera installed in the facility. A stationary sensor may be a microphone installed integrally with the surveillance camera, or an independent microphone. FIG. 9 is a diagram showing an example of an imaging device and sensor installed in a facility. In FIG. 9, a surveillance camera is installed as a stationary imaging device and an independent microphone is installed as a stationary sensor in the situation shown in FIG. 5. In this case, in addition to the data collected by the robots 200, the acquisition unit 110 acquires captured images generated by the stationary imaging device and sensor data generated by the stationary sensor. The event detection unit 121 then detects the occurrence of events based on the acquired data. In the example of FIG. 9, the event detection unit 121 detects, for example, that people are gathering near point A from the collected data acquired from the robot 200-2 and the captured images acquired from the stationary imaging device. The event detection unit 121 also detects, based on the sensor data acquired from the stationary sensor, that a particular noise has occurred near the sensor. From such information, the event detection unit 121 may detect that an event has occurred at point A.
In this way, the control device 100 may further acquire additional captured images from imaging devices other than those of the plurality of robots 200, and additional sensor data from sensors other than those of the plurality of robots 200. The control device 100 may then control the robots 200 based on the collected data, the additional captured images, and the additional sensor data. This allows the control device 100 to grasp the situation in more detail and therefore control the robots 200 in a manner better suited to the situation.
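A sketch of how records from stationary devices could be pooled with the robots' collected data before event detection. The record format (dicts with "position" and sensor fields) follows the earlier sketches and is likewise an assumption.

```python
def merge_sources(robot_records, fixed_camera_records, fixed_mic_records):
    """Pool records from robots 200 and from stationary devices into one list
    so the event detection unit 121 can evaluate them together.

    Each record is assumed to be a dict with at least "position" and either
    "image" or "sensor_data"; stationary devices have fixed, known positions.
    """
    merged = []
    for source, records in (("robot", robot_records),
                            ("fixed_camera", fixed_camera_records),
                            ("fixed_mic", fixed_mic_records)):
        for rec in records:
            rec = dict(rec)             # copy so the original record is untouched
            rec["source"] = source      # keep provenance for later weighting
            merged.append(rec)
    return merged
```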
[Modification 2]
A further example of selecting the target robot from among the plurality of robots 200 will be described.
FIG. 10 is a diagram showing an example of robots 200 placed in a facility. In the example of FIG. 10, guide robots that guide users are placed in a two-story facility. In this example, the robot 200-1 guides a user from the first floor to a predetermined location on the second floor. If the robot 200-1 cannot ride the elevator, another robot 200 may take over guiding the user on the second floor.
In such a situation, the event detection unit 121 of the control device 100 detects, from the route information of the robot 200-1 and the like, that a guidance handover will occur at the second-floor elevator as an event. The robot control unit 122 then selects, from the other robots 200, a target robot to take over the guidance. In the example of FIG. 10, the robot 200-2 is guiding another user, so the robot 200-3 is selected as the target robot and controlled to take over the guidance. The selection is not limited to this example; for instance, the robot control unit 122 may select a robot 200 that is on the second floor and close to the elevator as the target robot.
<Third Embodiment>
Next, a control device according to the third embodiment will be described. The third embodiment describes another example in which a robot guides a user while moving. Descriptions overlapping those of the first and second embodiments are partly omitted.
FIG. 11 is a block diagram showing an example of the configuration of a control system 1001 according to the third embodiment. As shown in FIG. 11, the control system 1001 includes a control device 101, the robots 200, and the base station 300.
An example of a situation to which the control system 1001 of the third embodiment is applied will now be described. FIG. 12 is a diagram schematically showing an example of the configuration of the control system 1001. The robots 200 guide users. This embodiment describes an example in which the control system 1001 is applied in a hospital, but the application of the control system 1001 is not limited to hospitals. As described above, the control system 1001 may be applied to facilities such as government offices, shopping malls, hotels, and warehouses. When the control system 1001 is applied in a hospital, a robot 200 is, for example, a robot that guides visitors through the hospital or a robot that accompanies patients on walks. In this embodiment, the robot 200 that guides the user is the target robot, and the persons guided by the robots 200 are collectively referred to as users. The control device 101 can also communicate with an information processing terminal. The information processing terminal may be, for example, a portable terminal such as a smartphone or tablet, or a personal computer. The information processing terminal may be installed, for example, in a doctor's or nurse's room, or in a security office where guards are stationed. The information processing terminal may also be carried by an employee of the facility.
The robot 200 captures an image showing the user it is guiding. For example, the robot 200 may photograph the user when it begins guiding the user. The robot 200 may then transmit that image to the control device 101 as a captured image showing the user. The robot 200 may also be able to communicate with a wearable terminal worn by the user. The wearable terminal measures, for example, the user's biological information, such as body temperature, blood pressure, heart rate, and blood oxygen level. The robot 200 may transmit the user's biological information measured by the wearable terminal to the control device 101 as part of the collected data.
[Details of the Control Device 101]
As shown in FIG. 11, the control device 101 includes an acquisition unit 111, a control unit 125, an authentication unit 130, and a notification unit 140.
The authentication unit 130 authenticates the user. Specifically, the authentication unit 130 identifies the user by performing face authentication based on a captured image that includes the user. For example, the authentication unit 130 extracts a feature amount related to the user's face from the captured image, and then matches the extracted feature amount against the feature amounts in a feature database. The feature database contains information that associates facial feature amounts of multiple persons with information identifying those persons. The feature database may be stored in a storage device (not shown) of the control device 101 or in a storage device (not shown) that can communicate with the control device 101. A known method may be used for face authentication. The authentication unit 130 may also perform authentication using feature amounts of other biometrics, such as the iris, fingerprints, or palm prints, rather than only facial features. In this way, the authentication unit 130 performs biometric authentication based on a captured image that is taken by the target robot and includes the user, and identifies the user. The authentication unit 130 is an example of authentication means.
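A minimal sketch of the matching step described above, comparing an extracted face feature vector against a feature database by cosine similarity. Feature extraction itself is outside the sketch, and the threshold value is an assumption; the disclosure only states that a known method may be used.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify_user(query_feature, feature_db, threshold=0.8):
    """Authentication unit 130 (sketch): return the id of the best-matching
    person in the feature database, or None if no match clears the threshold.

    feature_db: dict mapping a person id to a stored face feature vector.
    """
    best_id, best_score = None, threshold
    for person_id, stored in feature_db.items():
        score = cosine_similarity(query_feature, stored)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id
```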
Like the acquisition unit 110, the acquisition unit 111 acquires the collected data from each of the robots 200. When a robot 200 can communicate with a wearable terminal worn by the user, the collected data includes the user's biological information measured by the wearable terminal. The acquisition unit 111 also acquires the captured image that includes the user from the target robot.
When the user has been identified by the authentication unit 130, the acquisition unit 111 may acquire personal data containing information about the identified user. Personal data is data that contains information about a specific person; one example is a patient's medical chart. Personal data may include information such as the person's name, sex, age, address, disease name, symptoms, allergies, various test results, location of a disability, and disability grade. The personal data may be stored in a storage device (not shown) of the control device 101 or in a storage device (not shown) that can communicate with the control device 101.
The control unit 125 includes an event detection unit 123 and a robot control unit 124, which may perform the same operations as the event detection unit 121 and the robot control unit 122, respectively.
The event detection unit 123 may also detect that an abnormality has occurred in the user. In this case, the event detection unit 123 may detect the abnormality from the user's biological information. For example, when the user's body temperature, blood pressure, heart rate, blood oxygen level, or the like falls outside a predetermined range, the event detection unit 123 detects an abnormality in that measurement. The event detection unit 123 detects such an abnormality in the biological information as an abnormality of the user, for example.
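A sketch of the range check described above. The "normal" ranges below are illustrative assumptions, not clinical guidance and not values given in the disclosure.

```python
# Illustrative ranges only; a real deployment would use medically reviewed values.
NORMAL_RANGES = {
    "body_temp_c": (35.0, 37.5),
    "systolic_bp": (90, 140),
    "heart_rate_bpm": (50, 110),
    "spo2_percent": (95, 100),
}

def detect_vital_anomalies(vitals):
    """Event detection unit 123 (sketch): return the measurements that fall
    outside their predetermined range.

    vitals: dict like {"heart_rate_bpm": 128, "spo2_percent": 97}.
    """
    anomalies = {}
    for name, value in vitals.items():
        low, high = NORMAL_RANGES.get(name, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            anomalies[name] = value
    return anomalies   # non-empty dict -> an abnormality of the user was detected
```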
The robot control unit 124 may also control the target robot based on the personal data acquired by the acquisition unit 111. For example, if the age information in the personal data indicates that the user is elderly, the robot control unit 124 may control the target robot to reduce its travel speed or to enlarge the characters shown on the robot's display. As another example, suppose the personal data includes information indicating that the user normally uses crutches or a wheelchair. In this case, the robot control unit 124 may control the target robot to reduce its travel speed or to take a route with few steps.
The robot control unit 124 may also control the target robot based on the user's biological information. For example, when the user's heart rate has risen by a predetermined amount or more, or when the user's body temperature is at or above a predetermined value, the robot control unit 124 may control the target robot to slow down or to take a route that uses elevators and escalators.
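A sketch combining the two adaptations above: personal data and measured biological information adjust the target robot's speed, display, and route preference. All field names and numeric values are illustrative assumptions; absolute thresholds stand in for the "predetermined" values the disclosure leaves open.

```python
def adapt_guidance(personal_data, vitals):
    """Robot control unit 124 (sketch): derive control parameters for the
    target robot from personal data and measured biological information.
    """
    params = {"speed_mps": 1.0, "font_scale": 1.0,
              "avoid_steps": False, "prefer_elevator": False}
    if personal_data.get("age", 0) >= 75:           # elderly user: slow down, larger text
        params["speed_mps"] = 0.6
        params["font_scale"] = 1.5
    if personal_data.get("mobility_aid") in ("crutches", "wheelchair"):
        params["speed_mps"] = min(params["speed_mps"], 0.5)
        params["avoid_steps"] = True                # prefer routes with few steps
    if vitals.get("heart_rate_bpm", 0) >= 110 or vitals.get("body_temp_c", 0) >= 37.5:
        params["speed_mps"] = min(params["speed_mps"], 0.5)
        params["prefer_elevator"] = True            # use elevators and escalators
    return params
```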
The notification unit 140 issues various notifications. For example, when the user has been identified by the authentication unit 130, the notification unit 140 notifies the information processing terminal of the user's position information. At this time, the notification unit 140 may report the position information of the target robot as the user's position information. In this way, the notification unit 140 notifies a predetermined terminal of the position information of the target robot as the user's position information. The notification unit 140 is an example of notification means.
The notification unit 140 may also issue notifications based on the user's biological information. For example, when the event detection unit 123 detects an abnormality in the user's biological information, the notification unit 140 notifies the information processing terminal that an abnormality has occurred in the user. This makes it possible, for example, to inform medical staff in the hospital of the user's condition.
[Operation Example of the Control Device 101]
Next, an example of the operation of the control device 101 will be described with reference to FIG. 13. FIG. 13 is a flowchart showing an example of the operation of the control device 101.
The acquisition unit 111 acquires a captured image that includes the user (S201). The authentication unit 130 then performs biometric authentication based on the captured image. When the user is identified by the biometric authentication ("Yes" in S202), the acquisition unit 111 acquires personal data containing information about the identified user (S203), and the robot control unit 124 controls the target robot based on the personal data (S204). When the user is not identified by the authentication unit 130 ("No" in S202), the control device 101 skips the processing of S203 and S204.
The acquisition unit 111 acquires the collected data (S205). When the event detection unit 123 detects the occurrence of an event ("Yes" in S206), the robot control unit 124 controls the target robot in accordance with the event (S207). At this time, for example, when there is an abnormality in the user's biological information included in the collected data, the robot control unit 124 controls the target robot in accordance with the abnormality. When the event detection unit 123 does not detect the occurrence of an event ("No" in S206), the control device 101 skips the processing of S207.
Then, when the user has been identified in the processing of S202 ("Yes" in S208), the notification unit 140 issues a notification about the user (S209). For example, the notification unit 140 notifies the information processing terminal of the position information of the target robot guiding the user as the user's position information. When there is an abnormality in the user's biological information, the notification unit 140 notifies the information processing terminal that an abnormality has occurred in the user. When the user has not been identified in the processing of S202 ("No" in S208), the control device 101 skips the processing of S209.
In the flowchart of FIG. 13, the processing from S205 to S209 may be repeated until the target robot guiding the user finishes the guidance. In S206 and S207, the processing of S102 to S104 described in the second embodiment may also be performed.
As described above, when the target robot guides a user, the control device 101 of the third embodiment may perform biometric authentication based on a captured image that is taken by the target robot and includes the user, identify the user, and notify a predetermined terminal of the position information of the target robot as the user's position information. This allows the control device 101 to inform specific persons of the user's position. For example, in a hospital where the target robot is guiding a patient, the patient's position information can be reported to the hospital's medical staff.
Further, when the user guided by the target robot has been identified, the control device 101 of the third embodiment may acquire personal data containing information about the identified user and control the target robot based on the personal data. This allows the control device 101 to control the target robot in a manner suited to the user.
In the third embodiment, the collected data may also include the user's biological information, acquired by the target robot from a wearable terminal worn by the user and measured by that wearable terminal. The control device 101 of the third embodiment may then control the target robot based on that biological information. This allows the control device 101 to control the target robot using the user's biological information; for example, the target robot can be decelerated or stopped when the user's body temperature, blood pressure, heart rate, blood oxygen level, or the like is abnormal.
 <Hardware Configuration Example of Control Device>
 The hardware constituting the control devices of the first, second, and third embodiments described above will now be described. FIG. 14 is a block diagram showing an example of the hardware configuration of a computer device that implements the control device in each embodiment. The computer device 10 implements the control device and the control method described in each embodiment and each modification.
 As shown in FIG. 14, the computer device 10 includes a processor 11, a RAM (Random Access Memory) 12, a ROM (Read Only Memory) 13, a storage device 14, an input/output interface 15, a bus 16, and a drive device 17. Note that the control device may be realized by a plurality of electric circuits.
 The storage device 14 stores a program (computer program) 18. The processor 11 executes the program 18 of the control device using the RAM 12. Specifically, the program 18 includes, for example, a program that causes a computer to execute the processes shown in FIGS. 3, 6, 13, and 16. The functions of the components of the control device are realized as the processor 11 executes the program 18. The program 18 may be stored in the ROM 13. The program 18 may also be recorded on a storage medium 20 and read using the drive device 17, or may be transmitted to the computer device 10 from an external device (not shown) via a network (not shown).
 The input/output interface 15 exchanges data with peripheral devices 19 (a keyboard, a mouse, a display device, and the like). The input/output interface 15 functions as a means for acquiring or outputting data. The bus 16 connects the components to one another.
 Note that there are various modifications to the method of implementing the control device. For example, the control device can be realized as a dedicated device. The control device can also be realized based on a combination of a plurality of devices.
 A processing method in which a program for realizing the components of the functions of each embodiment is recorded on a storage medium, and the program recorded on the storage medium is read out as code and executed by a computer, is also included in the scope of each embodiment. That is, a computer-readable storage medium is also included in the scope of each embodiment. Further, the storage medium on which the above-described program is recorded, and the program itself, are included in each embodiment.
 The storage medium is, for example, a floppy (registered trademark) disk, a hard disk, an optical disc, a magneto-optical disk, a CD (Compact Disc)-ROM, a magnetic tape, a non-volatile memory card, or a ROM, but is not limited to these examples. The program recorded on the storage medium is not limited to one that executes processing by itself; a program that operates on an OS (Operating System) and executes processing in cooperation with other software or the functions of an expansion board is also included in the scope of each embodiment.
 Although the present invention has been described above with reference to the embodiments, the present invention is not limited to the above embodiments. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.
 Some or all of the above embodiments can also be described as in the following notes, but are not limited to the following.
 <Notes>
 [Appendix 1]
 A control device comprising:
 acquisition means for acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots; and
 control means for controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
 [Appendix 2]
 The control device according to Appendix 1, wherein
 the collected data includes captured images generated by imaging devices mounted on the plurality of robots, and
 the control means comprises:
 event detection means for detecting an event based on the collected data; and
 robot control means for controlling the target robot based on the detected event.
 [Appendix 3]
 The control device according to Appendix 2, wherein, when the target robot is moving toward a predetermined location, the robot control means controls the target robot to move along a route that does not pass through the location where the detected event occurred.
 [Appendix 4]
 The control device according to Appendix 2 or 3, wherein the event detection means predicts the appearance of an obstacle, and the robot control means controls the target robot to avoid the obstacle.
 [Appendix 5]
 The control device according to Appendix 2, wherein the robot control means controls the target robot to head toward the location where the detected event occurred.
 [Appendix 6]
 The control device according to any one of Appendices 1 to 5, further comprising, for a case where the target robot guides a user:
 authentication means for performing biometric authentication based on a captured image that is captured by the target robot and includes the user, and identifying the user; and
 notification means for notifying a predetermined terminal of the position information of the target robot as the position information of the user.
 [Appendix 7]
 The control device according to Appendix 6, wherein the acquisition means acquires personal data including information about the identified user, and the control means controls the target robot based on the personal data.
 [Appendix 8]
 The control device according to Appendix 6 or 7, wherein the collected data includes information that the target robot has acquired from a wearable terminal worn by the user, namely, biological information of the user measured by the wearable terminal, and the control means controls the target robot based on the biological information.
 [Appendix 9]
 The control device according to any one of Appendices 1 to 8, wherein the acquisition means acquires another captured image from an imaging device different from those of the plurality of robots and acquires other sensor data from a sensor different from those of the plurality of robots, and the control means controls the target robot based on the collected data, the other captured image, and the other sensor data.
 [Appendix 10]
 The control device according to any one of Appendices 1 to 9, wherein the acquisition means acquires the collected data from the plurality of robots through a local 5G (5th Generation) network, and the control means controls the target robot through the local 5G network.
 [Appendix 11]
 A control method comprising:
 acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots; and
 controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
 [Appendix 12]
 The control method according to Appendix 11, wherein the collected data includes captured images generated by imaging devices mounted on the plurality of robots, and the step of controlling based on the collected data includes detecting an event based on the collected data and controlling the target robot based on the detected event.
 [Appendix 13]
 The control method according to Appendix 12, wherein, when the target robot is moving toward a predetermined location, the step of controlling the target robot controls the target robot to move along a route that does not pass through the location where the detected event occurred.
 [Appendix 14]
 The control method according to Appendix 12 or 13, wherein the appearance of an obstacle is predicted in the processing of detecting the event, and the target robot is controlled to avoid the obstacle in the step of controlling the target robot.
 [Appendix 15]
 The control method according to Appendix 12, wherein, in the step of controlling the target robot, the target robot is controlled to head toward the location where the detected event occurred.
 [Appendix 16]
 The control method according to any one of Appendices 11 to 15, further comprising, when the target robot guides a user:
 performing biometric authentication based on a captured image that is captured by the target robot and includes the user, and identifying the user; and
 notifying a predetermined terminal of the position information of the target robot as the position information of the user.
 [Appendix 17]
 The control method according to Appendix 16, wherein personal data including information about the identified user is acquired in the acquiring step, and the target robot is controlled based on the personal data in the step of controlling based on the collected data.
 [Appendix 18]
 The control method according to Appendix 16 or 17, wherein the collected data includes information that the target robot has acquired from a wearable terminal worn by the user, namely, biological information of the user measured by the wearable terminal, and the target robot is controlled based on the biological information in the step of controlling based on the collected data.
 [Appendix 19]
 The control method according to any one of Appendices 11 to 18, wherein, in the acquiring step, another captured image is acquired from an imaging device different from those of the plurality of robots and other sensor data is acquired from a sensor different from those of the plurality of robots, and, in the step of controlling based on the collected data, the target robot is controlled based on the collected data, the other captured image, and the other sensor data.
 [Appendix 20]
 The control method according to any one of Appendices 11 to 19, wherein the collected data is acquired from the plurality of robots through a local 5G (5th Generation) network in the acquiring step, and the target robot is controlled through the local 5G network in the step of controlling based on the collected data.
 [Appendix 21]
 A storage medium storing a program that causes a computer to execute:
 a process of acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots; and
 a process of controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
 [Appendix 22]
 The storage medium according to Appendix 21, wherein the collected data includes captured images generated by imaging devices mounted on the plurality of robots, and the controlling process causes the computer to execute a process of detecting an event based on the collected data and a process of controlling the target robot based on the detected event.
 [Appendix 23]
 The storage medium according to Appendix 22, wherein, when the target robot is moving toward a predetermined location, the process of controlling the target robot controls the target robot to move along a route that does not pass through the location where the detected event occurred.
 [Appendix 24]
 The storage medium according to Appendix 22 or 23, wherein the appearance of an obstacle is predicted in the process of detecting the event, and the target robot is controlled to avoid the obstacle in the process of controlling the target robot.
 [Appendix 25]
 The storage medium according to Appendix 22, wherein, in the process of controlling the target robot, the target robot is controlled to head toward the location where the detected event occurred.
 [Appendix 26]
 The storage medium according to any one of Appendices 21 to 25, wherein the program further causes the computer to execute, when the target robot guides a user:
 a process of performing biometric authentication based on a captured image that is captured by the target robot and includes the user, and identifying the user; and
 a process of notifying a predetermined terminal of the position information of the target robot as the position information of the user.
 [Appendix 27]
 The storage medium according to Appendix 26, wherein personal data including information about the identified user is acquired in the acquiring process, and the target robot is controlled based on the personal data in the process of controlling based on the collected data.
 [Appendix 28]
 The storage medium according to Appendix 26 or 27, wherein the collected data includes information that the target robot has acquired from a wearable terminal worn by the user, namely, biological information of the user measured by the wearable terminal, and the target robot is controlled based on the biological information in the process of controlling based on the collected data.
 [Appendix 29]
 The storage medium according to any one of Appendices 21 to 28, wherein, in the acquiring process, another captured image is acquired from an imaging device different from those of the plurality of robots and other sensor data is acquired from a sensor different from those of the plurality of robots, and, in the process of controlling based on the collected data, the target robot is controlled based on the collected data, the other captured image, and the other sensor data.
 [Appendix 30]
 The storage medium according to any one of Appendices 21 to 29, wherein the collected data is acquired from the plurality of robots through a local 5G (5th Generation) network in the acquiring process, and the target robot is controlled through the local 5G network in the process of controlling based on the collected data.
 100, 101  control device
 110, 111  acquisition unit
 120, 125  control unit
 121, 123  event detection unit
 122, 124  robot control unit
 130  authentication unit
 140  notification unit
 200  robot
 300  base station

Claims (30)

  1.  A control device comprising:
     acquisition means for acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots; and
     control means for controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
  2.  The control device according to claim 1, wherein
     the collected data includes captured images generated by imaging devices mounted on the plurality of robots, and
     the control means comprises:
     event detection means for detecting an event based on the collected data; and
     robot control means for controlling the target robot based on the detected event.
  3.  The control device according to claim 2, wherein, when the target robot is moving toward a predetermined location, the robot control means controls the target robot to move along a route that does not pass through the location where the detected event occurred.
  4.  The control device according to claim 2 or 3, wherein the event detection means predicts the appearance of an obstacle, and the robot control means controls the target robot to avoid the obstacle.
  5.  The control device according to claim 2, wherein the robot control means controls the target robot to head toward the location where the detected event occurred.
  6.  The control device according to any one of claims 1 to 5, further comprising, for a case where the target robot guides a user:
     authentication means for performing biometric authentication based on a captured image that is captured by the target robot and includes the user, and identifying the user; and
     notification means for notifying a predetermined terminal of the position information of the target robot as the position information of the user.
  7.  The control device according to claim 6, wherein the acquisition means acquires personal data including information about the identified user, and the control means controls the target robot based on the personal data.
  8.  The control device according to claim 6 or 7, wherein the collected data includes information that the target robot has acquired from a wearable terminal worn by the user, namely, biological information of the user measured by the wearable terminal, and the control means controls the target robot based on the biological information.
  9.  The control device according to any one of claims 1 to 8, wherein the acquisition means acquires another captured image from an imaging device different from those of the plurality of robots and acquires other sensor data from a sensor different from those of the plurality of robots, and the control means controls the target robot based on the collected data, the other captured image, and the other sensor data.
  10.  The control device according to any one of claims 1 to 9, wherein the acquisition means acquires the collected data from the plurality of robots through a local 5G (5th Generation) network, and the control means controls the target robot through the local 5G network.
  11.  A control method comprising:
     acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots; and
     controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
  12.  The control method according to claim 11, wherein the collected data includes captured images generated by imaging devices mounted on the plurality of robots, and the step of controlling based on the collected data includes detecting an event based on the collected data and controlling the target robot based on the detected event.
  13.  The control method according to claim 12, wherein, when the target robot is moving toward a predetermined location, the step of controlling the target robot controls the target robot to move along a route that does not pass through the location where the detected event occurred.
  14.  The control method according to claim 12 or 13, wherein the appearance of an obstacle is predicted in the processing of detecting the event, and the target robot is controlled to avoid the obstacle in the step of controlling the target robot.
  15.  The control method according to claim 12, wherein, in the step of controlling the target robot, the target robot is controlled to head toward the location where the detected event occurred.
  16.  The control method according to any one of claims 11 to 15, further comprising, when the target robot guides a user:
     performing biometric authentication based on a captured image that is captured by the target robot and includes the user, and identifying the user; and
     notifying a predetermined terminal of the position information of the target robot as the position information of the user.
  17.  The control method according to claim 16, wherein personal data including information about the identified user is acquired in the acquiring step, and the target robot is controlled based on the personal data in the step of controlling based on the collected data.
  18.  The control method according to claim 16 or 17, wherein the collected data includes information that the target robot has acquired from a wearable terminal worn by the user, namely, biological information of the user measured by the wearable terminal, and the target robot is controlled based on the biological information in the step of controlling based on the collected data.
  19.  The control method according to any one of claims 11 to 18, wherein, in the acquiring step, another captured image is acquired from an imaging device different from those of the plurality of robots and other sensor data is acquired from a sensor different from those of the plurality of robots, and, in the step of controlling based on the collected data, the target robot is controlled based on the collected data, the other captured image, and the other sensor data.
  20.  The control method according to any one of claims 11 to 19, wherein the collected data is acquired from the plurality of robots through a local 5G (5th Generation) network in the acquiring step, and the target robot is controlled through the local 5G network in the step of controlling based on the collected data.
  21.  A storage medium storing a program that causes a computer to execute:
     a process of acquiring collected data including position information of each of a plurality of automatically traveling robots and data collected by each of the plurality of robots; and
     a process of controlling a target robot, which is at least one of the plurality of robots, based on the collected data.
  22.  The storage medium according to claim 21, wherein the collected data includes captured images generated by imaging devices mounted on the plurality of robots, and the controlling process causes the computer to execute a process of detecting an event based on the collected data and a process of controlling the target robot based on the detected event.
  23.  The storage medium according to claim 22, wherein, when the target robot is moving toward a predetermined location, the process of controlling the target robot controls the target robot to move along a route that does not pass through the location where the detected event occurred.
  24.  The storage medium according to claim 22 or 23, wherein the appearance of an obstacle is predicted in the process of detecting the event, and the target robot is controlled to avoid the obstacle in the process of controlling the target robot.
  25.  The storage medium according to claim 22, wherein, in the process of controlling the target robot, the target robot is controlled to head toward the location where the detected event occurred.
  26.  The storage medium according to any one of claims 21 to 25, wherein the program further causes the computer to execute, when the target robot guides a user:
     a process of performing biometric authentication based on a captured image that is captured by the target robot and includes the user, and identifying the user; and
     a process of notifying a predetermined terminal of the position information of the target robot as the position information of the user.
  27.  The storage medium according to claim 26, wherein personal data including information about the identified user is acquired in the acquiring process, and the target robot is controlled based on the personal data in the process of controlling based on the collected data.
  28.  The storage medium according to claim 26 or 27, wherein the collected data includes information that the target robot has acquired from a wearable terminal worn by the user, namely, biological information of the user measured by the wearable terminal, and the target robot is controlled based on the biological information in the process of controlling based on the collected data.
  29.  The storage medium according to any one of claims 21 to 28, wherein, in the acquiring process, another captured image is acquired from an imaging device different from those of the plurality of robots and other sensor data is acquired from a sensor different from those of the plurality of robots, and, in the process of controlling based on the collected data, the target robot is controlled based on the collected data, the other captured image, and the other sensor data.
  30.  The storage medium according to any one of claims 21 to 29, wherein the collected data is acquired from the plurality of robots through a local 5G (5th Generation) network in the acquiring process, and the target robot is controlled through the local 5G network in the process of controlling based on the collected data.
PCT/JP2021/029655 2021-08-11 2021-08-11 Control device, control method, and computer-readable storage medium WO2023017588A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/029655 WO2023017588A1 (en) 2021-08-11 2021-08-11 Control device, control method, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/029655 WO2023017588A1 (en) 2021-08-11 2021-08-11 Control device, control method, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2023017588A1 true WO2023017588A1 (en) 2023-02-16

Family

ID=85200123

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/029655 WO2023017588A1 (en) 2021-08-11 2021-08-11 Control device, control method, and computer-readable storage medium

Country Status (1)

Country Link
WO (1) WO2023017588A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008152504A (en) * 2006-12-18 2008-07-03 Hitachi Ltd Guidance robot device and guidance system
JP2012078950A (en) * 2010-09-30 2012-04-19 Sogo Keibi Hosho Co Ltd Monitoring system with autonomous mobile body, monitoring device, autonomous mobile body, monitoring method and monitoring program
WO2019171917A1 (en) * 2018-03-05 2019-09-12 日本電気株式会社 Information processing device, information processing method and information processing program
JP2019168942A (en) * 2018-03-23 2019-10-03 日本電産シンポ株式会社 Moving body, management device, and moving body system
WO2020129312A1 (en) * 2018-12-19 2020-06-25 本田技研工業株式会社 Guidance robot control device, guidance system in which same is used, and guidance robot control method


Similar Documents

Publication Publication Date Title
EP3568842B1 (en) Emergency drone guidance device
Doukas et al. Digital cities of the future: Extending @home assistive technologies for the elderly and the disabled
US11688265B1 (en) System and methods for safety, security, and well-being of individuals
US20210346557A1 (en) Robotic social interaction
EP3285160A1 (en) Intention recognition for triggering voice recognition system
KR101866974B1 (en) An Action Pattern Collecting Apparatus, System and Method using the same
EP3611124B1 (en) Automatic method of detecting visually impaired, pregnant, or disabled elevator passenger(s)
US9726503B2 (en) User-worn devices, systems, and methods for directing a user in an emergency
EP2369436A2 (en) Robot apparatus, information providing method carried out by the robot apparatus and computer storage media
WO2021125510A1 (en) Method and device for navigating in dynamic environment
JP2020052856A (en) Rescue support server, rescue support system and program
JP2009123181A (en) Information presentation system
Torres et al. How feasible is WiFi fingerprint-based indoor positioning for in-home monitoring?
JP5143780B2 (en) Monitoring device and monitoring method
WO2023017588A1 (en) Control device, control method, and computer-readable storage medium
KR20150144205A (en) Smart care system for disabled person using activity information, and method thereof
KR20180040908A (en) Airport robot
JP4178846B2 (en) Autonomous driving support device and program
JP2002149824A (en) Action detecting system
JP7095220B2 (en) Robot control system
JP2019079419A (en) Robot management system
JP5403355B2 (en) Position detection and behavior recognition system in the building
JP2004029908A (en) Support system and program
JP2020187389A (en) Mobile body locus analysis apparatus, mobile body locus analysis program, and mobile body locus analysis method
AU2021106898A4 (en) Network-based smart alert system for hospitals and aged care facilities

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21953483

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE