WO2021256463A1 - Imaging system and robot system - Google Patents

Imaging system and robot system

Info

Publication number
WO2021256463A1
WO2021256463A1 (application PCT/JP2021/022669, JP2021022669W)
Authority
WO
WIPO (PCT)
Prior art keywords
image pickup
head
display
image
posture
Prior art date
Application number
PCT/JP2021/022669
Other languages
English (en)
Japanese (ja)
Inventor
雅幸 掃部
裕和 杉山
Original Assignee
川崎重工業株式会社 (Kawasaki Heavy Industries, Ltd.)
Application filed by 川崎重工業株式会社
Priority to JP2022531836A (JP7478236B2)
Publication of WO2021256463A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/04 Viewing devices

Definitions

  • This disclosure relates to an imaging system and a robot system.
  • Patent Document 1 discloses a master-slave type manipulator.
  • the manipulator includes an operation robot, a work robot, and an image pickup device that captures an image of a work object.
  • the image pickup device is attached to the first articulated telescopic mechanism, and the operator's helmet is attached to the second articulated telescopic mechanism.
  • the first articulated telescopic mechanism operates following the operation of the second articulated telescopic mechanism, so that the image pickup device follows the movement of the operator's head.
  • the image captured by the image pickup device is projected on a screen, and the operator operates the operating robot while visually recognizing the image on the screen. For example, when the operator turns the head to the right or left with respect to the screen in front, the screen displays the image captured by the image pickup device as it moves following the head. The operator, however, is now facing away from the screen, which lies to the right or left, and may not be able to see its image sufficiently or to perform an accurate operation.
  • the image pickup system includes an image pickup device; a detection device that detects the movement of the user's head; a variable device that changes the position and orientation of the image pickup device following the movement of the head detected by the detection device; and a display device that displays the image captured by the image pickup device to the user and changes, according to the movement of the head, at least one of the position in which the image is displayed and the direction in which the image is displayed.
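Taken together, the components recited above form a simple control loop: the detection device reports the head movement, the variable device moves the image pickup device to follow it, and the display device shifts where the captured image appears. The Python sketch below illustrates that loop; all class, function, and parameter names are hypothetical, since the disclosure specifies no software interface.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """Position (x, y, z) and posture angles (roll, pitch, yaw)."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0


def control_cycle(detect_head, move_camera, capture, show):
    """One cycle of the imaging system sketched above.

    detect_head()  -> current Pose of the user's head (detection device)
    move_camera(p) -> drives the variable device so the camera follows p
    capture()      -> image from the image pickup device
    show(img, p)   -> display device places the image according to p
    All four callables are hypothetical stand-ins, not names from the patent.
    """
    head = detect_head()      # detection device reports the head movement
    move_camera(head)         # variable device follows the head
    show(capture(), head)     # display position/direction follows as well
    return head
```

In use, `detect_head` would wrap the infrared-marker processing and `move_camera` the robot-arm drive described later in the disclosure.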
  • FIG. 1 is a perspective view showing an example of a configuration of a robot system according to an exemplary embodiment.
  • FIG. 2 is a block diagram showing an example of a functional configuration of an imaging system according to an exemplary embodiment.
  • FIG. 3 is a flowchart showing an example of the operation of the imaging system according to the exemplary embodiment.
  • FIG. 4 is a side view showing an example of the configuration of the display device according to the first modification of the exemplary embodiment.
  • FIG. 5 is a flowchart showing an example of the operation of the imaging system according to the first modification.
  • FIG. 6 is a side view showing an example of the configuration of the display device according to the second modification of the exemplary embodiment.
  • FIG. 7 is a flowchart showing an example of the operation of the imaging system according to the second modification.
  • FIG. 8 is a side view showing an example of the configuration of the display device according to the modified example 3 of the exemplary embodiment.
  • FIG. 9 is a flowchart showing an example of the operation of the imaging system according to the modified example 3.
  • FIG. 10 is a side view showing an example of the configuration of the image pickup apparatus according to the modified example 4 of the exemplary embodiment.
  • FIG. 1 is a perspective view showing an example of the configuration of the robot system 1 according to the exemplary embodiment.
  • the robot system 1 includes an image pickup system 100, a robot 200, a robot operation device 300, and a robot control device 400.
  • the image pickup system 100 includes an image pickup device 110, a mobile device 120, a motion detection device 130, a display device 140, an image pickup control device 150, and an image pickup input device 160.
  • the mobile device 120 is an example of a variable device.
  • the robot 200 is an industrial robot and includes a robot arm 210, a base 220, and an end effector 230.
  • the robot 200 may be another type of robot such as a service robot, a medical robot, a drug discovery robot, and a humanoid.
  • Service robots are robots used in various service industries such as nursing care, medical care, cleaning, security, guidance, rescue, cooking, and product provision.
  • the base 220 is fixed on the support surface and supports the robot arm 210.
  • the support surface of the base 220 may be an immovable surface such as a floor surface, or may be a movable surface on a movable device such as a traveling device.
  • the robot arm 210 has at least one joint and has at least one degree of freedom.
  • the robot arm 210 is configured so that the end effector 230 is attached to the tip of the robot arm 210.
  • the robot arm 210 can move the end effector 230 so as to freely change the position and posture of the end effector 230.
  • the end effector 230 is configured to apply various actions to the object W (also referred to as a “workpiece”) according to the application of the end effector 230, such as gripping, suction, spraying a liquid such as paint, welding, and injecting a sealant.
  • the robot arm 210 is a vertical articulated robot arm with 6 degrees of freedom having 6 rotating joints, but is not limited thereto.
  • the type of the robot arm 210 may be any type, for example, a horizontal articulated type, a polar coordinate type, a cylindrical coordinate type, a rectangular coordinate type, or the like.
  • the joint of the robot arm 210 may be any joint such as a linear motion joint.
  • the number of joints of the robot arm 210 may be any number such as 5 or less or 7 or more.
  • the robot operating device 300 is arranged at a position away from the robot 200 and is used for remotely controlling the robot 200.
  • the robot operating device 300 may be arranged at a position where the user P who handles the robot operating device 300 can directly see the robot 200, or may be arranged at a position where the user P cannot directly see the robot 200.
  • the robot operating device 300 may be arranged in a space isolated from the space in which the robot 200 is arranged, or in a space at a position away from the space.
  • the robot operating device 300 receives inputs such as various commands, information, and data, and outputs them to the robot control device 400.
  • the robot operating device 300 can accept an input by the user P.
  • the robot operating device 300 is connected to another device and can receive input from the device.
  • the robot operating device 300 may include known input means such as a lever, a button, a touch panel, a joystick, a motion capture, a camera, and a microphone.
  • the robot operating device 300 may include a teaching pendant, which is one of the teaching devices, a smart device such as a smartphone and a tablet, a personal computer, and a terminal device such as a dedicated terminal device.
  • the robot operating device 300 may include a master machine.
  • the master machine may be configured to perform movements that are the same as or similar to those of the robot arm 210.
  • the robot control device 400 controls the operation of the robot 200.
  • the robot control device 400 is connected to the robot 200 and the robot operation device 300 via wired communication or wireless communication. Any wired communication or wireless communication may be used.
  • the robot control device 400 processes commands, information, data, and the like input via the robot operation device 300.
  • the robot control device 400 may be connected to an external device and may be configured to receive and process inputs such as commands, information, and data from the device.
  • the robot control device 400 controls the operation of the robot 200 according to the above commands, information, data, and the like.
  • the robot control device 400 controls the supply of power and the like to the robot 200.
  • the robot control device 400 manages information and the like for managing the robot 200.
  • the robot control device 400 outputs various commands, information, data, and the like to the robot operation device 300 and / or the display device 140 of the image pickup system 100.
  • the robot control device 400 causes the display device 140 to visually and / or audibly present various commands, information, data, and the like.
  • the robot control device 400 may output an image for operating the robot 200, an image showing the state of the robot 200, an image for managing the robot 200, and the like.
  • the robot control device 400 includes a computer. It may further include an electric circuit for controlling the electric power supplied to the robot 200, equipment for controlling power other than electric power supplied to the robot 200, such as air pressure and hydraulic pressure, and equipment for controlling substances supplied to the robot 200, such as cooling water and paint. Devices other than the computer may be provided separately from the robot control device 400.
  • the image pickup device 110 of the image pickup system 100 includes a camera that captures a still image and / or a moving image of a digital image.
  • the camera may be a three-dimensional camera capable of capturing a three-dimensional image including the position information of the subject in the image.
  • the mobile device 120 is equipped with an image pickup device 110 and is configured so that the position and orientation of the image pickup device 110 can be freely changed.
  • the position of the image pickup device 110 may be its three-dimensional position in three-dimensional space.
  • the orientation of the image pickup device 110 may be the direction of the optical-axis center of the camera of the image pickup device 110, specifically the three-dimensional direction of that optical-axis center in three-dimensional space.
  • the orientation of the image pickup device 110 may correspond to the posture of the image pickup device 110.
  • the moving device 120 is not particularly limited, but in the present exemplary embodiment, it is a robot arm similar to the robot arm 210 and is fixed on the support surface.
  • in the present exemplary embodiment, the support surface is a ceiling surface and this robot arm is suspended from it, but the position and orientation of the support surface are not limited.
  • the image pickup device 110 is attached to the tip of the robot arm.
  • the moving device 120 may also be a device other than a robot arm, for example a traveling device capable of traveling on a support surface such as a floor surface, a track traveling device capable of traveling on a track, a crane movable on a track, a crane provided with an arm, another articulated arm, or another device such as an unmanned aerial vehicle, e.g., a drone.
  • the track of the track traveling device may be arranged so as to extend in the vertical direction, the horizontal direction, and the direction intersecting them.
  • the track traveling device can travel in various directions and positions.
  • the image pickup device 110 may be attached to a crane hook.
  • the mobile device 120 includes a pan head (also referred to as a “gimbal”) to which the image pickup device 110 is attached, and the orientation of the image pickup device 110 may be freely changed by operating the pan head.
  • the motion detection device 130 is an example of the detection device, and detects the motion of the head H of the user P who operates the robot operation device 300.
  • the motion detecting device 130 is not particularly limited, but includes at least one infrared sensor 131 and at least one infrared marker 132 attached to the head H of the user P in this exemplary embodiment.
  • a plurality of infrared sensors 131, specifically three infrared sensors 131, are arranged around the user P, facing the user P.
  • the three infrared sensors 131 are arranged at positions away from the head H of the user P.
  • a plurality of infrared markers 132, specifically four infrared markers 132, are arranged at different positions on the head H.
  • the head includes a portion of the human body above the neck, and may include, for example, the face, the crown, the temporal region, the occipital region, and the like.
  • the infrared marker 132 emits infrared light.
  • the infrared marker 132 may be a light emitter that emits infrared light by itself, such as an infrared LED (Light Emitting Diode), a reflector that reflects irradiated infrared light, or a combination of both.
  • the infrared sensor 131 receives infrared light and can detect the direction, intensity, intensity distribution, etc. of the received infrared light.
  • the infrared sensor 131 may be configured only to receive infrared light, or may itself emit infrared light and receive infrared light such as its reflected light; in the latter case, the infrared sensor 131 may be an infrared camera. By detecting the infrared light from the four infrared markers 132 with the three infrared sensors 131, the position and posture of the head H can be detected with high accuracy. Although not limited to the following, the position of the head H may be the three-dimensional position of a predetermined reference point of the head H in three-dimensional space.
  • the posture of the head H may be the posture of a predetermined part, plane, or axis of the head H, such as its front portion, a plane crossing the head H, or an axis passing from the jaw of the head H to the crown; specifically, it may be the three-dimensional orientation of that part, plane, or axis in three-dimensional space.
  • conversely to the above, the infrared sensor 131 may be attached to the head H of the user P, and the infrared markers 132 may be arranged at positions away from the head H of the user P.
  • the positions and quantities of the infrared sensor 131 and the infrared marker 132 are not particularly limited as long as they can detect the position, posture, or both the position and the posture of the head H.
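As a rough illustration of how a position and posture can be recovered from marker coordinates, the sketch below computes a head position as the centroid of the markers and a yaw angle from a front-to-back marker pair. The marker layout, the "front"/"back" identifiers, and the method itself are illustrative assumptions, not details from the disclosure.

```python
import math


def head_pose_from_markers(markers):
    """Estimate a head position and yaw angle from marker coordinates.

    markers: dict mapping marker id -> (x, y, z) in the room coordinate
    system, as triangulated from the infrared sensors. Assumes two of
    the ids, "front" and "back", straddle the head front-to-back.
    """
    # Position: centroid of all marker positions (a simple reference point).
    n = len(markers)
    cx = sum(p[0] for p in markers.values()) / n
    cy = sum(p[1] for p in markers.values()) / n
    cz = sum(p[2] for p in markers.values()) / n
    # Posture (yaw only, for brevity): direction of the back-to-front
    # axis projected onto the horizontal plane.
    fx, fy, _ = markers["front"]
    bx, by, _ = markers["back"]
    yaw = math.atan2(fy - by, fx - bx)
    return (cx, cy, cz), yaw
```

A full implementation would fit all three posture angles (rolling, pitching, yawing) to the four marker positions; this sketch only shows the principle.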
  • the display device 140 perceptibly presents the image captured by the image pickup device 110 to the user P.
  • the display device 140 is arranged in the vicinity of the robot operating device 300, and is arranged at a position away from the image pickup device 110.
  • the display device 140 may present the command, information, data, and the like received from the robot control device 400 to the user P in a perceptible manner.
  • the display device 140 includes a display, such as a liquid crystal display (LCD) or an organic or inorganic electroluminescence (EL) display, and presents images visually.
  • the display device 140 may include an audio output device such as a speaker, and may make an auditory presentation.
  • the display device 140 may be configured to make a tactile presentation.
  • the display device 140 is a head-mounted display attached to the head H of the user P.
  • the head-mounted display has a goggle-like shape, and the lens portion of the head-mounted display forms a display surface on which an image is displayed.
  • the display device 140 can change the position and direction in which the display device 140 displays an image in accordance with the movement of the head H of the user P.
  • the display device 140 may be configured so as not to be attached to the head H of the user P.
  • in that case, the display device 140 may include a display drive device that can change the position of the display, the posture of the display, or both.
  • the configuration for moving the display and the configuration for changing the position and orientation of the display may be configured by a device as exemplified for the moving device 120.
  • the configuration for changing the posture of the display may be configured by a device such as a gimbal.
  • the image pickup input device 160 receives input and operations for operating the image pickup system 100 from the user P.
  • the image pickup input device 160 receives inputs such as various commands, information, and data, and outputs them to the image pickup control device 150.
  • the image pickup input device 160 may be arranged in the vicinity of the robot operation device 300 and may have a configuration similar to the configuration exemplified by the robot operation device 300.
  • the robot operation device 300 may include an image pickup input device 160 and also have a function of the image pickup input device 160.
  • FIG. 2 is a block diagram showing an example of the functional configuration of the imaging system 100 according to the exemplary embodiment.
  • the image pickup control device 150 is connected to the image pickup device 110, the mobile device 120, the motion detection device 130, the display device 140, and the image pickup input device 160 via wired communication or wireless communication. Any wired communication or wireless communication may be used.
  • the image pickup control device 150 includes a drive control device 151 that controls the drive of the image pickup device 110 and the moving device 120, a detection control device 152 that controls the operation of the motion detection device 130, and a display control device 153 that controls the operation of the display device 140.
  • the detection control device 152 controls the drive of the three infrared sensors 131 and processes the results of the three infrared sensors 131 detecting the infrared light from the four infrared markers 132, thereby detecting the three-dimensional positions and postures of the four infrared markers 132. That is, the detection control device 152 detects the position and posture of the head H of the user P by detecting the three-dimensional positions and postures of the infrared markers 132.
  • the detection control device 152 is an example of a processing device.
  • each of the three infrared sensors 131 receives infrared light emitted from the four infrared markers 132.
  • the infrared light emitted from each infrared marker 132 is associated with identification information such as an ID set in the infrared marker 132. Therefore, each infrared sensor 131 can detect the direction, intensity, intensity distribution, and the like of infrared light of each of the four infrared markers 132.
  • the detection control device 152 detects the three-dimensional positions of the four infrared markers 132 using information on the three-dimensional position and orientation of each infrared sensor 131 together with each sensor's detection results for the infrared light of the four infrared markers 132.
  • the detection control device 152 expresses these positions in the three-dimensional coordinate system set in the space where the three infrared sensors 131 and the robot operating device 300 are arranged. It then detects the three-dimensional position and posture of the head H of the user P from the positions of the four infrared markers 132, and expresses the posture using posture angles such as a rolling angle, a pitching angle, and a yawing angle.
  • the drive control device 151 controls the image pickup operation of the image pickup device 110. It also controls the operation of the moving device 120 so as to change the position and posture of the image pickup device 110 according to the movement of the head H of the user P detected by the detection control device 152, moving the image pickup device 110 by amounts of change in position and posture corresponding to the amounts of change in position and posture of the head H. For this control, the drive control device 151 uses, for example, the three-dimensional coordinate system set in the space where the moving device 120 is arranged and the three-dimensional coordinate system set on the moving device 120.
  • the relationship between the amount of change in the position and posture of the image pickup device 110 and the amount of change in the position and posture of the head H is arbitrary.
  • the amount of change in the position and posture of the image pickup device 110 may correspond one-to-one to the amount of change in the position and posture of the head H, or may correspond to a constant multiple of that amount of change.
  • alternatively, the amount of change in posture of the image pickup device 110 may correspond one-to-one to that of the head H while the amount of change in position corresponds to a constant multiple, or the amount of change in position may correspond one-to-one while the amount of change in posture corresponds to a constant multiple.
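The one-to-one and constant-multiple correspondences above reduce to two gains, one for position and one for posture. A minimal sketch, with the gain values as illustrative parameters rather than values from the disclosure:

```python
def camera_delta(head_dpos, head_dang, k_pos=1.0, k_ang=1.0):
    """Map a change in head position/posture to a change in camera
    position/posture.

    k_pos = k_ang = 1.0 gives the one-to-one correspondence; setting
    only one of the gains away from 1.0 gives the mixed variants
    described above.
    """
    dpos = tuple(k_pos * d for d in head_dpos)  # (dx, dy, dz)
    dang = tuple(k_ang * d for d in head_dang)  # (droll, dpitch, dyaw)
    return dpos, dang
```

For example, `k_pos=2.0, k_ang=1.0` moves the camera twice as far as the head while turning it by exactly the head's rotation.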
  • the display control device 153 causes the display device 140 to display the image captured by the image pickup device 110.
  • the display control device 153 may control the operation of the display drive device. Further, the display control device 153 may perform position processing for changing the position where the image of the image pickup device 110 is displayed on the display screen of the display device 140.
  • the display device 140 is a head-mounted display. Therefore, the display control device 153 causes the display device 140 to display the image captured by the image pickup device 110 without controlling the display drive device and performing position processing. For example, the display control device 153 positions the center of the image captured by the image pickup device 110 on the center of the display screen of the display device 140. As a result, the user P can always see the image captured by the image pickup device 110 in the vicinity of the front surface of the head H.
  • when the display device 140 includes the display drive device, the display control device 153 may control its operation so that the display moves following the movement of the head H of the user P. For example, the display control device 153 may perform this control so that the display screen is located near the front of the head H and/or faces the front of the head H. The display control device 153 may also perform image position processing on the display screen so that the center of the image captured by the image pickup device 110 moves following the movement of the head H of the user P. In this case, the center of the captured image and the center of the display screen do not always coincide.
  • the user P can see the image captured by the image pickup device 110 near the front of the head H.
  • the display control device 153 may also, in accordance with a command of the user P or the like, perform image position processing so that the center of the image captured by the image pickup device 110 does not follow the movement of the head H of the user P on the display screen. As a result, the user P can view, in front of the head H or the like, an image of the object captured by the image pickup device 110 from another direction.
  • the display control device 153 may thereafter perform image position processing so that the center of the image captured by the image pickup device 110 again moves on the display screen following the movement of the head H of the user P.
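The display-side behaviours described above amount to choosing where the image center lands on the display screen. A sketch of that position processing, with the pixels-per-radian scale as an assumed parameter, not a value from the disclosure:

```python
def image_center_on_screen(screen_w, screen_h, head_yaw, head_pitch,
                           px_per_rad=400.0, follow=True):
    """Compute where to place the center of the captured image on the
    display screen.

    With follow=True, the image center shifts with the head's yaw and
    pitch, so the image moves following the head as described above.
    With follow=False, the center stays pinned to the screen center,
    as in the head-mounted display case.
    """
    cx, cy = screen_w / 2.0, screen_h / 2.0
    if follow:
        cx += px_per_rad * head_yaw    # turning right shifts image right
        cy -= px_per_rad * head_pitch  # screen y grows downward
    return cx, cy
```

A real display controller would also clamp the result so the image stays on the screen; that step is omitted for brevity.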
  • the image pickup control device 150 as described above includes a computer. It may further include an electric circuit for controlling the electric power supplied to the image pickup device 110, the moving device 120, the motion detection device 130, and the display device 140. Devices other than the computer may be provided separately from the image pickup control device 150.
  • the computer of the robot control device 400 and the image pickup control device 150 includes a circuit or a processing circuit having a processor, a memory, and the like.
  • the circuit or processing circuit sends and receives commands, information, data, etc. to and from other devices.
  • the circuit or processing circuit inputs signals from various devices and outputs control signals to each controlled object.
  • the memory may include semiconductor memory, such as volatile and non-volatile memory, and storage devices such as a hard disk or an SSD (Solid State Drive).
  • the memory stores a program executed by a circuit or a processing circuit, various data, and the like.
  • the function of the circuit or processing circuit may be realized by a computer system consisting of a processor such as a CPU (Central Processing Unit), a volatile memory such as RAM (Random Access Memory), and a non-volatile memory such as ROM (Read-Only Memory).
  • the computer system may realize the function of the circuit or the processing circuit by the CPU using the RAM as a work area to execute the program recorded in the ROM.
  • some or all of the functions of the circuit or processing circuit may be realized by the above computer system, by a dedicated hardware circuit such as an electronic circuit or an integrated circuit, or by a combination of the computer system and such hardware circuits.
  • the robot control device 400 and the image pickup control device 150 may execute each process by centralized control by a single computer, or may execute each process by distributed control by cooperation of a plurality of computers.
  • each function of the robot control device 400 and the image pickup control device 150 may be realized by a microcontroller, an MPU (Micro Processing Unit), an LSI (Large Scale Integration), a system LSI, a PLC (Programmable Logic Controller), a logic circuit, or the like.
  • the plurality of functions of the robot control device 400 and the image pickup control device 150 may each be realized by an individual chip, or may be realized by a single chip that includes some or all of them.
  • each circuit may be a general-purpose circuit or a dedicated circuit. The dedicated circuit may be, for example, an FPGA (Field Programmable Gate Array), a reconfigurable processor in which the connections and/or settings of circuit cells inside the LSI can be reconfigured, or an ASIC (Application Specific Integrated Circuit) in which circuits for multiple functions are integrated for a specific application.
  • the image pickup control device 150 of the image pickup system 100 includes a drive control device 151, a detection control device 152, a display control device 153, and a storage unit 154.
  • the drive control device 151 includes an image pickup control unit 1511 and a first movement control unit 1512 as functional components.
  • the detection control device 152 includes a device control unit 1521 and a detection processing unit 1522 as functional components.
  • the display control device 153 includes a display control unit 1531, a second movement control unit 1532, and an image processing unit 1533 as functional components.
  • the function of the storage unit 154 is realized by the memory of the computer of the image pickup control device 150 or the like.
  • the functions of the functional components of the image pickup control device 150 other than the storage unit 154 are realized by a computer processor or the like.
  • the storage unit 154 stores various information and enables reading of the stored information.
  • the storage unit 154 may store a program, various data, and the like.
  • the storage unit 154 may store programs, data, information, and the like for operating each device of the image pickup system 100.
  • the storage unit 154 stores the coordinate system set in each device of the image pickup system 100.
  • the coordinate systems may include a three-dimensional coordinate system set in the space where the infrared sensors 131 and the robot operating device 300 are arranged (hereinafter also referred to as the “first coordinate system”), a three-dimensional coordinate system set in the space where the moving device 120 is arranged (hereinafter also referred to as the “second coordinate system”), and a three-dimensional coordinate system set on the moving device 120 (hereinafter also referred to as the “third coordinate system”).
  • the storage unit 154 stores information on the position and orientation of each infrared sensor 131, for example, the information in the first coordinate system.
  • the storage unit 154 may store the identification information of each infrared marker 132 and the information of the characteristics of the infrared light emitted from each infrared marker 132 in association with each other.
  • the storage unit 154 may store information on the position and posture of each infrared marker 132 on the head H of the user P.
  • the storage unit 154 stores information on the position and orientation of the moving device 120, for example, the information in the second coordinate system.
  • the storage unit 154 stores information on the position and orientation of the image pickup device 110 on the mobile device 120, for example, the information in the third coordinate system.
  • the storage unit 154 may store the relationship of various parameters for moving each device according to the movement of the head H of the user P.
  • the relationships between the parameters may include a first relationship between the amount of change in the position and posture of the head H and the amount of change in the position and posture of the image pickup device 110, a second relationship between the amount of change in the position and posture of the head H and the amount of change in the position and posture of the display of the display device 140, and a third relationship between the amount of change in the position and posture of the head H and the amount of change in the position of the reference point of the image on the display screen of the display device 140.
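The first, second, and third relationships can be pictured as mappings from a head-pose change to a change of the followed quantity. The sketch below models each as a single proportional gain; the class name, field names, and the proportional form are illustrative assumptions, since the text does not specify how the relationships are stored.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch: each stored relationship is reduced to one
# proportional gain between a change in the head pose and the change
# of the corresponding followed quantity.

@dataclass
class FollowRelationships:
    camera_gain: float = 1.0     # first relationship: head H -> image pickup device 110
    display_gain: float = 1.0    # second relationship: head H -> display of device 140
    ref_point_gain: float = 1.0  # third relationship: head H -> image reference point

    def camera_delta(self, head_delta: List[float]) -> List[float]:
        return [self.camera_gain * d for d in head_delta]

    def display_delta(self, head_delta: List[float]) -> List[float]:
        return [self.display_gain * d for d in head_delta]

    def ref_point_delta(self, head_delta: List[float]) -> List[float]:
        return [self.ref_point_gain * d for d in head_delta]
```

In practice each relationship could also be a nonlinear map or a lookup table; a scalar gain is only the simplest reading consistent with the text.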
  • the image pickup control unit 1511 controls the drive of the image pickup device 110.
  • the image pickup control unit 1511 controls the execution and stop of the image pickup operation of the image pickup device 110, and the zoom-up and zoom-back operations of the image pickup device 110.
  • the image pickup control unit 1511 may be configured to receive information, commands, and the like from the robot operation device 300, and may control the operation of the image pickup device 110 according to the commands and the like received from the robot operation device 300.
  • the first movement control unit 1512 controls the drive of the moving device 120. For example, when the first movement control unit 1512 receives operation information from the image pickup input device 160, it generates an operation command for performing the operation corresponding to the operation information and outputs the operation command to the moving device 120. As a result, the moving device 120 performs the operation corresponding to the operation information.
  • the operation information is information indicating the content of the operation input to the image pickup input device 160 by the user P in order to operate the mobile device 120.
  • when the first movement control unit 1512 receives the amount of change in the position and posture of the head H of the user P from the detection processing unit 1522 of the detection control device 152, the first movement control unit 1512 reads the first relationship from the storage unit 154.
  • based on the amount of change in the position and posture of the head H and the first relationship, the first movement control unit 1512 determines the amount of change in the position and posture of the image pickup device 110 for moving the image pickup device 110 so as to follow the movement of the head H.
  • these amounts of change in position and posture can be expressed in the second coordinate system.
  • the first movement control unit 1512 generates a command for the operation of the moving device 120 for moving the image pickup device 110 by the determined position and posture fluctuation amount, and outputs the command to the moving device 120.
  • the moving device 120 moves the image pickup device 110 so as to follow the movement of the head H.
  • the horizontal and vertical movements of the head H are associated with the horizontal and vertical movements of the image pickup apparatus 110, respectively.
  • the movement of the head H in the rolling direction, pitching direction, and yawing direction is associated with the movement of the image pickup apparatus 110 in the rolling direction, pitching direction, and yawing direction, respectively.
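Since the head H is measured in the first coordinate system while commands to the moving device 120 are expressed in the second coordinate system, a change of frame is implied. A minimal sketch, assuming the two frames differ by a known rotation `R_12` (an assumption; the text does not specify how the frames are related):

```python
import numpy as np

# Sketch only: the head's measured position change is a vector in the
# first coordinate system; the command to the moving device 120 must be
# a vector in the second coordinate system. R_12 is an assumed, known
# 3x3 rotation taking first-frame vectors into the second frame.

def to_second_frame(delta_pos_frame1, R_12):
    """Re-express a positional change in the second coordinate system."""
    return R_12 @ np.asarray(delta_pos_frame1, dtype=float)
```

A full implementation would also transform the orientation change and account for any translation offset between the frames; only the rotational part of the mapping is sketched here.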
  • the device control unit 1521 controls the drive of each infrared sensor 131 of the motion detection device 130.
  • the device control unit 1521 may control execution and stop of irradiation of infrared light of each infrared sensor 131.
  • the apparatus control unit 1521 may control operations such as execution and stop of irradiation of infrared light of each infrared marker 132.
  • the detection processing unit 1522 processes the detection result of the infrared light from the infrared marker 132 in each infrared sensor 131, and detects the position and posture of the head H of the user P.
  • the detection processing unit 1522 is an example of a detection device.
  • the detection processing unit 1522 reads the identification information of each infrared marker 132 and the characteristic information of its infrared light from the storage unit 154, and associates the infrared light detected by each infrared sensor 131 with the corresponding infrared marker 132.
  • the detection processing unit 1522 reads the position and orientation information of each infrared sensor 131 from the storage unit 154, and detects the three-dimensional position of each infrared marker 132 using that information together with the detection results of the infrared light of each infrared marker 132 at each infrared sensor 131.
  • the detection processing unit 1522 detects the three-dimensional position and posture of the head H of the user P from the three-dimensional positions of the four infrared markers 132.
  • the position and orientation of the head H can be represented in the first coordinate system.
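One standard way to recover the three-dimensional position and posture of the head H from the detected marker positions is a rigid-body fit between the stored positions of the markers on the head and their measured positions in the first coordinate system. The text does not name its method; the SVD-based fit below is offered only as an illustration.

```python
import numpy as np

def head_pose_from_markers(markers_head_frame, markers_world):
    """Both inputs: (4, 3) arrays of corresponding marker positions.
    Returns (R, t) with markers_world ~= markers_head_frame @ R.T + t."""
    P = np.asarray(markers_head_frame, dtype=float)
    Q = np.asarray(markers_world, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                # 3x3 cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

With four non-coplanar markers the fit is unique: the returned rotation R encodes the head posture and t the head position in the sensor-side frame.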
  • the detection processing unit 1522 detects the position and posture of the head H over time, and outputs the fluctuation amount of the position and posture of the head H to the first movement control unit 1512.
  • the amount of change in the position and posture of the head H may include at least one selected from the group consisting of: the amount of change between the position and posture of the head H before the change and after the change; the position and posture of the head H before the change together with the position and posture of the head H after the change; the speed of the change in the position and posture of the head H toward the position and posture after the change; and the acceleration of the change in the position and posture of the head H toward the position and posture after the change.
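The over-time detection can be sketched directly: from two successive head positions and the elapsed interval, both the amount of change and its rate (two of the quantities listed) follow. The function name and the position-only simplification are illustrative.

```python
from typing import List, Tuple

def pose_variation(pos_before: List[float], pos_after: List[float],
                   dt: float) -> Tuple[List[float], List[float]]:
    """Return the change in head position and its rate over interval dt [s]."""
    delta = [a - b for a, b in zip(pos_after, pos_before)]
    velocity = [d / dt for d in delta]
    return delta, velocity
```

The same differencing applies to the posture angles; a further difference of successive velocities would give the acceleration the text also mentions.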
  • the display control unit 1531 acquires the image data captured by the image pickup device 110 from the image pickup device 110, outputs the image data to the display device 140, and causes the display device 140 to display the image corresponding to the image data.
  • the display control unit 1531 may perform image processing on the image data acquired from the image pickup device 110 and output the image data after the image processing to the display device 140.
  • the second movement control unit 1532 controls the operation of the display drive device when the display device 140 includes the display drive device.
  • the second movement control unit 1532 acquires information on the amount of change in the position and posture of the head H of the user P from the detection processing unit 1522. Further, the second movement control unit 1532 reads the second relationship from the storage unit 154.
  • the second movement control unit 1532 determines, based on the amount of change in the position and posture of the head H and the second relationship, the amount of change in the position and posture of the display for moving the display so as to follow the movement of the head H.
  • the amount of change in the position and posture can be represented by the first coordinate system.
  • the second movement control unit 1532 generates a command for operating the display drive device for moving the display by the determined position and posture fluctuation amount, and outputs the command to the display drive device.
  • the display drive device moves the display following the movement of the head H so that the display is located near the front of the head H and / or faces the head H.
  • the second movement control unit 1532 may be omitted.
  • the image processing unit 1533 controls the position at which the image captured by the image pickup device 110 is displayed on the display screen of the display device 140. For example, when the display device 140 includes one display, the image processing unit 1533 may move the reference point of the image captured by the image pickup device 110 on the display screen of that display so as to follow the movement of the head H of the user P. For example, when the display device 140 includes a plurality of displays arranged so that the orientations of their display screens differ, the image processing unit 1533 may move the reference point of the image captured by the image pickup device 110 across the display screens of the plurality of displays so as to follow the movement of the head H of the user P.
  • the image processing unit 1533 acquires information on the amount of change in the position and posture of the head H of the user P from the detection processing unit 1522. Further, the image processing unit 1533 reads out the third relationship from the storage unit 154. The image processing unit 1533 determines the amount of change in the position of the reference point on the display screen that follows the movement of the head H, based on the amount of change in the position and posture of the head H and the third relationship. Further, the image processing unit 1533 generates a command for moving the reference point by the amount of fluctuation of the determined position, and outputs the command to the display control unit 1531. The display control unit 1531 displays the image on the display of the display device 140 so that the reference point of the image captured by the image pickup device 110 moves following the movement of the head H.
  • the user P can visually recognize the image captured by the image pickup device 110 at a position close to the front of the head H.
  • otherwise, the image processing unit 1533 maintains the position of the reference point of the image captured by the image pickup device 110 on the display screen of the display regardless of the movement of the head H of the user P.
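The reference-point control of the image processing unit 1533 can be sketched as a proportional mapping from a head orientation change to a pixel offset of the reference point Pf on the display screen. The names and the pixels-per-radian gain are assumptions for illustration, not taken from the text.

```python
# Illustrative sketch of the third relationship applied on the screen:
# a change in head yaw shifts the reference point Pf horizontally and a
# change in head pitch shifts it vertically, by an assumed gain.

def move_reference_point(ref_xy, head_yaw_delta, head_pitch_delta,
                         px_per_rad=500.0):
    """ref_xy: current (x, y) of Pf in pixels; angle deltas in radians."""
    x, y = ref_xy
    return (x + px_per_rad * head_yaw_delta,
            y - px_per_rad * head_pitch_delta)  # screen y grows downward
```

A full implementation would also clamp the result to the screen bounds or, as in the later modifications, hand off to an adjacent display when the point leaves the screen.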
  • FIG. 3 is a flowchart showing an example of the operation of the imaging system 100 according to the exemplary embodiment.
  • the user P shall attach the head-mounted display to the head H as the display device 140.
  • the image pickup control device 150 operates in the initial setting mode for determining the initial position and the initial posture of the image pickup device 110 and the head H of the user P, respectively. For example, the image pickup control device 150 starts the initial setting mode according to an activation command input to the image pickup input device 160 by the user P.
  • the image pickup control device 150 determines the initial position and initial posture of the image pickup device 110. Specifically, the user P operates the image pickup input device 160 while viewing the image captured by the image pickup device 110 on the display device 140, thereby operating the moving device 120 and changing the position and posture of the image pickup device 110. When the desired image is displayed on the display device 140, the user P inputs to the image pickup input device 160 a command to set the current position and posture of the image pickup device 110 as its initial position and initial posture. The image pickup control device 150 determines the commanded position and posture of the image pickup device 110 as the initial position and initial posture of the image pickup device 110.
  • the image pickup control device 150 determines the initial position and the initial posture of the head H of the user P. Specifically, when the position and posture of the head H becomes a desired position and posture, the user P inputs a command for determining the initial position and initial posture of the head H to the image pickup input device 160.
  • the image pickup control device 150 causes the three infrared sensors 131 of the motion detection device 130 to detect infrared light, processes the detection result of each infrared sensor 131, and detects the position and posture of the head H.
  • the image pickup control device 150 determines the detected position and posture of the head H as the initial position and initial posture of the head H.
  • in step S104, the image pickup control device 150 ends the initial setting mode and starts operation in the normal operation mode.
  • in step S105, the image pickup control device 150 causes the image pickup device 110 to start the image pickup operation.
  • the image pickup device 110 continuously captures a moving image and displays it on the display device 140.
  • in step S106, the image pickup control device 150 causes the three infrared sensors 131 to continuously detect the infrared light of the infrared markers 132 on the head H.
  • in step S107, the image pickup control device 150 processes the detection result of each infrared sensor 131 and detects the position and posture of the head H of the user P with respect to the initial position and initial posture.
  • the image pickup control device 150 detects the position and posture of the head H at predetermined time intervals, thereby detecting the amount of change in the position and posture of the head H at predetermined time intervals.
  • in step S108, the image pickup control device 150 determines the target position and posture of the image pickup device 110 with respect to its initial position and initial posture, based on the first relationship stored in the storage unit 154 and the position and posture of the head H with respect to the initial position and initial posture.
  • the image pickup control device 150 determines the target position and posture of the image pickup device 110 at predetermined time intervals, thereby determining the amount of change in the position and posture of the image pickup device 110 at predetermined time intervals.
  • in step S109, the image pickup control device 150 generates an operation command corresponding to the target position and posture of the image pickup device 110 and outputs the operation command to the moving device 120.
  • the image pickup control device 150 generates an operation command for operating the moving device 120 so that the position and posture of the image pickup device 110 satisfy the target position and posture.
  • in step S110, the moving device 120 operates according to the operation command and moves the image pickup device 110 to the target position and posture.
  • in step S111, the image pickup control device 150 determines whether or not a command to end the operation of the image pickup system 100 has been input to the image pickup input device 160 by the user P; if it has been input (Yes in step S111), the image pickup control device 150 ends the processing, and if it has not been input (No in step S111), the process returns to step S106.
  • the image pickup control device 150 can thus change the position and posture of the image pickup device 110 so as to follow the amount of change in the position and posture of the head H, based on the relationship between the initial position and initial posture of the head H of the user P and the initial position and initial posture of the image pickup device 110.
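The normal-operation cycle above (detect the head pose, derive the camera's target from the first relationship, command the moving device, repeat until an end command) can be sketched as below. Every callable is a placeholder for a device described in the text, and reducing the first relationship to a scalar gain is an assumed simplification.

```python
# Hedged sketch of steps S106-S111: all device interfaces are
# placeholder callables, not the actual system's API.

def follow_loop(detect_head_pose, first_relationship, command_moving_device,
                end_requested, head_init, camera_init):
    """One pass per cycle: detect (S106-S107), derive the target pose
    (S108), command the moving device (S109-S110), end check (S111)."""
    while not end_requested():                                # S111
        head_pose = detect_head_pose()                        # S106-S107
        head_delta = [h - h0 for h, h0 in zip(head_pose, head_init)]
        target = [c0 + first_relationship * d                 # S108
                  for c0, d in zip(camera_init, head_delta)]
        command_moving_device(target)                         # S109-S110
```

In the real system the detection, target determination, and commanding run at predetermined time intervals rather than as fast as the loop allows.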
  • Modification 1 of the exemplary embodiment differs from the exemplary embodiment in that the display device 140A comprises one display 141 and a display drive device 142 for moving the display 141.
  • the image pickup control device 150 controls the operation of the display drive device 142 to move the position and orientation of the display 141 in accordance with the movement of the head H of the user P.
  • modification 1 will be described mainly with respect to the points that differ from the exemplary embodiment, and description of the points common to the exemplary embodiment will be omitted as appropriate.
  • FIG. 4 is a side view showing an example of the configuration of the display device 140A according to the modified example 1 of the exemplary embodiment.
  • the display drive device 142 is configured to support the display 141 and to freely change the position and orientation of the display 141.
  • the display drive device 142 is a robot arm having a plurality of joints. The base of the robot arm is fixed to a support surface or the like, and a display 141 is attached to the tip of the robot arm.
  • the display drive device 142 can arbitrarily change the position and orientation of the display 141 in the three-dimensional direction.
  • the second movement control unit 1532 of the display control device 153 of the image pickup control device 150 controls the operation of the display drive device 142 so that the position and posture of the display 141 follow the movement of the head H of the user P. For example, the second movement control unit 1532 moves the display 141 upward and directs it downward when the head H is directed upward, and moves the display 141 to the left of the head H and directs it to the right when the head H is directed to the left. Further, the second movement control unit 1532 controls the operation of the display drive device 142 so as to change the position and posture of the display 141 in response to operations via the image pickup input device 160.
  • FIG. 5 is a flowchart showing an example of the operation of the image pickup system 100 according to the first modification.
  • steps S201 to S203 are the same as those of steps S101 to S103 in the exemplary embodiment, respectively.
  • in step S204, the image pickup control device 150 determines the initial position and initial posture of the display 141.
  • the user P operates the image pickup input device 160 to operate the display drive device 142, and changes the position and posture of the display 141 to a desired position and posture.
  • the user P inputs a command for determining the current position and orientation of the display 141 to the initial position and initial posture of the display 141 to the image pickup input device 160.
  • the image pickup control device 150 determines the commanded position and orientation of the display 141 as the initial position and initial posture of the display 141.
  • steps S205 to S211 are the same as those in steps S104 to S110 in the exemplary embodiment, respectively.
  • in step S212, the image pickup control device 150 determines the target position and posture of the display 141 with respect to its initial position and initial posture, based on the second relationship stored in the storage unit 154 and the position and posture of the head H with respect to the initial position and initial posture.
  • the image pickup control device 150 determines the target position and orientation of the display 141 at predetermined time intervals, thereby determining the amount of change in the position and orientation of the display 141 at predetermined time intervals.
  • in step S213, the image pickup control device 150 generates an operation command corresponding to the target position and posture of the display 141 and outputs the operation command to the display drive device 142.
  • the image pickup control device 150 generates an operation command for operating the display drive device 142 so that the position and posture of the display 141 satisfy the target position and posture.
  • in step S214, the display drive device 142 operates according to the operation command and moves the display 141 to the target position and posture.
  • in step S215, the image pickup control device 150 determines whether or not a command to end the operation of the image pickup system 100 has been input to the image pickup input device 160 by the user P; if it has been input (Yes in step S215), the image pickup control device 150 ends the processing, and if it has not been input (No in step S215), the process returns to step S207.
  • the image pickup control device 150 can thus change the positions and postures of the image pickup device 110 and the display 141 so as to follow the amount of change in the position and posture of the head H, based on the relationship among the initial position and initial posture of the head H of the user P, the initial position and initial posture of the image pickup device 110, and the initial position and initial posture of the display 141.
  • the image pickup control device 150 may execute the processes of steps S209 to S211 and the processes of steps S212 to S214 in parallel, or may execute the processes in the reverse order of the above.
  • the display drive device 142 is configured to move both the position and the orientation of the display 141, but may instead be configured to move only the position of the display 141 or only the orientation of the display 141.
  • modification 2 of the exemplary embodiment differs from the exemplary embodiment in that the display device 140B includes one display 143 having a display surface 143a that is curved so as to surround a part of the periphery of the user P.
  • the image pickup control device 150 changes the display position and display orientation of the image by moving the reference point of the image captured by the image pickup device 110 on the screen of the display surface 143a so as to follow the movement of the head H of the user P.
  • modification 2 will be described mainly with respect to the points that differ from the exemplary embodiment and modification 1, and description of the points common to the exemplary embodiment and modification 1 will be omitted as appropriate.
  • FIG. 6 is a side view showing an example of the configuration of the display device 140B according to the modified example 2 of the exemplary embodiment.
  • the display surface 143a of the display 143 surrounds the user P horizontally from both sides to the front of the user P and vertically from above and below to the front of the user P.
  • Such a display surface 143a surrounds the user P in the horizontal direction, the vertical direction, and the direction intersecting the horizontal direction and the vertical direction.
  • the display surface 143a has a curved surface shape similar to, for example, a part of a spherical surface or an ellipsoidal surface.
  • the display surface 143a may have a shape that surrounds a part of the periphery of the user P, and may have a shape that surrounds the entire circumference of the user P, for example.
  • the shape of the display surface 143a is not limited to a curved surface shape, and may be any shape that includes curving, bending, or both curving and bending.
  • the shape of the display surface 143a may be the same as at least a part of a cylindrical surface or of the surface of a polyhedron.
  • the "cylindrical surface” has a cross-sectional shape perpendicular to the axis of a circle, an ellipse, a shape close to a circle, a shape close to an ellipse, or two of these. It may include the surface of a columnar body, which is the above combination.
  • the image processing unit 1533 and the display control unit 1531 of the display control device 153 of the image pickup control device 150 vary the position of the reference point Pf of the image captured by the image pickup device 110 on the screen of the display surface 143a so as to follow the movement of the head H of the user P.
  • for example, the image processing unit 1533 and the display control unit 1531 move the reference point Pf upward on the screen of the display surface 143a when the head H faces upward, and move the reference point Pf to the left with respect to the head H on the screen of the display surface 143a when the head H faces left.
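As an illustration only (the text does not give the screen geometry), a change in head yaw and pitch can be mapped to a point on a cylindrical display surface of radius r centred on the user P: yaw becomes an arc length along the curve and pitch a height offset. The function name, the cylinder model, and the tangent mapping are assumptions.

```python
import math

def ref_point_on_cylinder(yaw_rad: float, pitch_rad: float,
                          radius_m: float, eye_height_m: float):
    """Map head yaw/pitch to (arc length, height) on a user-centred cylinder."""
    arc = radius_m * yaw_rad                                # along the curved screen
    height = eye_height_m + radius_m * math.tan(pitch_rad)  # vertical offset
    return arc, height
```

For a spherical display surface the same idea applies with pitch mapped to a second arc length instead of a tangent-plane height.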
  • FIG. 7 is a flowchart showing an example of the operation of the image pickup system 100 according to the second modification.
  • steps S301 to S303 are the same as those of steps S101 to S103 in the exemplary embodiment, respectively.
  • in step S304, the image pickup control device 150 determines the initial position, on the screen of the display surface 143a of the display 143, of the reference point Pf of the image captured by the image pickup device 110.
  • the reference point Pf is the center of the image.
  • the image pickup control device 150 determines the position of the reference point Pfa of the image captured by the image pickup device 110 when the initial position and the initial posture of the image pickup device 110 are determined in step S302 as the initial position.
  • steps S305 to S311 are the same as in steps S104 to S110 in the exemplary embodiment, respectively.
  • in step S312, the image pickup control device 150 determines the target position of the target reference point Pft on the screen of the display surface 143a with respect to the initial position, based on the third relationship stored in the storage unit 154 and the position and posture of the head H with respect to the initial position and initial posture.
  • the target reference point Pft is the destination reference point that follows the change in the position and posture of the head H.
  • the image pickup control device 150 determines the target position of the target reference point Pft at predetermined time intervals, thereby determining the amount of change in the position of the reference point Pf at predetermined time intervals.
  • in step S313, the image pickup control device 150 processes the image so that the position of the reference point Pf of the image captured by the image pickup device 110 coincides with the target position of the target reference point Pft on the screen of the display surface 143a, and outputs the processed image to the display 143. That is, the image pickup control device 150 executes image processing associated with the target reference point Pft.
  • in step S314, the display 143 displays the processed image on the screen of the display surface 143a.
  • step S315 is the same as step S215 in modification 1.
  • the image pickup control device 150 can thus change the position and posture of the image pickup device 110 and the display position and display orientation of the image so as to follow the amount of change in the position and posture of the head H, based on the relationship among the initial position and initial posture of the head H of the user P, the initial position and initial posture of the image pickup device 110, and the initial position on the display surface 143a of the display 143 of the reference point Pf of the image captured by the image pickup device 110.
  • the image pickup control device 150 may execute the processes of steps S309 to S311 and the processes of steps S312 to S314 in parallel, or may execute the processes in the reverse order of the above.
  • the image pickup control device 150 may be configured so that, upon receiving a command from the user P via the image pickup input device 160 or the like, it stops or releases the control that changes the position of the reference point Pf of the image captured by the image pickup device 110 so as to follow the amount of change in the position and posture of the head H. While this control is stopped, the position and posture of the image pickup device 110 change so as to follow the amount of change in the position and posture of the head H, but the position of the reference point Pf of the image on the display surface 143a does not change.
  • the user P can thereby view, on the display surface 143a, an image of a subject such as the object W captured from another direction. For example, if the user P changes the position of the reference point Pf of the image by moving the head H and then commands the image pickup input device 160 to stop the following, the user P can view the image displayed at a place other than in front of the head H.
  • the display 143 may be movable in the same manner as the modification 1.
  • the image pickup control device 150 may combine the processing of modification 2 with the processing of modification 1 so as to change both the display position and the orientation of the image following the movement of the head H of the user P.
  • the third modification of the exemplary embodiment is different from the second modification in that the display device 140C includes a plurality of displays 141 arranged so as to surround a part around the user P.
  • the plurality of displays 141 are arranged so that the positions and orientations of the respective display surfaces 141a are different.
  • the image pickup control device 150 causes each of the plurality of displays 141 to display a part of one image captured by the image pickup device 110, that is, causes the plurality of displays 141 to display the one image as a whole.
  • the image pickup control device 150 changes the display position and display orientation of the image by moving the reference point of the image captured by the image pickup device 110 across the screens of the display surfaces 141a of the plurality of displays 141 so as to follow the movement of the head H of the user P.
  • modification 3 will be described mainly with respect to the points that differ from the exemplary embodiment and modifications 1 and 2, and description of the points common to the exemplary embodiment and modifications 1 and 2 will be omitted as appropriate.
  • FIG. 8 is a side view showing an example of the configuration of the display device 140C according to the modified example 3 of the exemplary embodiment.
  • the plurality of displays 141 are arranged so as to surround a part of the periphery of the user P, horizontally from both sides of the user P to the front, and vertically from above and below to the front of the user P.
  • the plurality of displays 141 are arranged so that the display surface 141a forms a plurality of horizontal rows and a plurality of vertical columns.
  • the plurality of display surfaces 141a surround the user P in the horizontal direction, the vertical direction, and the directions intersecting the horizontal direction and the vertical direction.
  • the plurality of displays 141 are arranged so that their respective display surfaces 141a are arranged along a spherical surface or an ellipsoidal surface and are adjacent to each other. Each display surface 141a is directed towards the center or focal point of a spherical or ellipsoidal surface.
  • the plurality of displays 141 may be arranged so as to surround the entire periphery of the user P, may be arranged so as to surround the periphery of the user P in the horizontal direction, or may be arranged so as to surround the periphery of the user P in the vertical direction.
  • the plurality of displays 141 may be arranged in a cross-shaped array extending horizontally and vertically, in a cylindrical array extending and curving in the horizontal direction, in a cylindrical array extending and curving in the vertical direction, or the like.
  • the plurality of displays 141 are arranged adjacent to each other, but may be arranged at a distance from each other.
  • the image processing unit 1533 and the display control unit 1531 of the display control device 153 of the image pickup control device 150 vary the position of the reference point Pf of the image captured by the image pickup device 110 across the screens of the plurality of display surfaces 141a so as to follow the movement of the head H of the user P. For example, the image processing unit 1533 and the display control unit 1531 move the reference point Pf upward across the screens of the plurality of display surfaces 141a when the head H is directed upward, and move the reference point Pf to the left with respect to the head H across the screens of the display surfaces 141a when the head H is directed to the left.
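Moving the reference point Pf across several screens requires deciding which display 141 currently shows it. Assuming, purely for illustration, equal-sized screens tiled in rows and columns and a reference point given in the combined image's pixel coordinates:

```python
# Hypothetical sketch: map a global pixel position in the combined
# picture to (row, col) of the display 141 showing it plus the local
# pixel position on that display's screen 141a.

def locate_on_grid(ref_x, ref_y, screen_w, screen_h):
    col, local_x = divmod(ref_x, screen_w)
    row, local_y = divmod(ref_y, screen_h)
    return (int(row), int(col)), (local_x, local_y)
```

The returned (row, col) index selects the display and (local_x, local_y) the position on its display surface 141a; curved or non-grid layouts like those described above would need a layout-specific mapping instead.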
  • FIG. 9 is a flowchart showing an example of the operation of the image pickup system 100 according to the modified example 3.
  • steps S401 to S403 are the same as those of steps S301 to S303 in the second modification, respectively.
  • the image pickup control device 150 determines the initial position of the reference point Pf of the image captured by the image pickup device 110.
  • the image pickup control device 150 sets, as the initial position, the position of the reference point Pfa of the image captured by the image pickup device 110 at the time when the initial position and initial posture of the image pickup device 110 are determined in step S402.
  • the image pickup control device 150 determines the position of the display 141 that displays the reference point Pfa at the initial position and the position of the reference point Pfa on the screen of the display surface 141a of the display 141.
  • steps S405 to S411 are the same as those in steps S305 to S311 in the second modification, respectively.
  • in step S412, based on the third relationship stored in the storage unit 154 and on the position and posture of the head H relative to the initial position and initial posture, the image pickup control device 150 determines the display 141 that is to display the target reference point Pft together with the target position of the target reference point Pft on the screen of the display surface 141a of that display 141. The image pickup control device 150 executes this determination at predetermined time intervals.
  • in step S413, the image pickup control device 150 processes the image so that the position of the reference point Pf of the image captured by the image pickup device 110 coincides with the target position of the target reference point Pft on the screen of the display surface 141a of the determined display 141, and outputs the processed image to each display 141.
  • in step S414, the plurality of displays 141 together display the processed image across the screens of their display surfaces 141a.
  • step S415 is the same as step S315 in the second modification.
  • as described above, based on the relationship among the initial position and initial posture of the head H of the user P, the initial position and initial posture of the image pickup device 110, and the initial position of the reference point Pf of the image captured by the image pickup device 110, the image pickup control device 150 can change the position and posture of the image pickup device 110 and the display position and display direction of the image so as to follow the amount of variation in the position and posture of the head H.
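The follow-the-variation idea above can be condensed into a small sketch: the head's deviation from its initial pose is applied to the camera's initial pose, and its angular part to the reference point's initial position. The flat 6-tuple pose, the 2-tuple reference point, and the single gain are simplifying assumptions for illustration.

```python
def follow_head(head0, cam0, pf0, head_now, gain=1.0):
    """Apply the head's variation from its initial pose (head0) to the
    camera target pose and to the reference-point target, scaled by gain.
    Poses are (x, y, z, yaw, pitch, roll); pf is (yaw, pitch) on screen."""
    delta = tuple(h - h0 for h, h0 in zip(head_now, head0))
    cam_target = tuple(c + gain * d for c, d in zip(cam0, delta))
    # The reference point reacts only to the angular part (yaw, pitch).
    pf_target = (pf0[0] + gain * delta[3], pf0[1] + gain * delta[4])
    return cam_target, pf_target

# Head moved 0.1 m forward and turned 10 deg yaw, 5 deg pitch.
cam_t, pf_t = follow_head((0, 0, 0, 0, 0, 0), (1, 0, 0, 0, 0, 0),
                          (0, 0), (0.1, 0, 0, 10, 5, 0))
```

In the flowchart's terms, this single function stands in for the per-interval determinations of steps S409 to S414, with the stored "first" and "third" relationships reduced to an identity mapping plus a gain.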
  • the image pickup control device 150 may execute the processes of steps S409 to S411 and the processes of steps S412 to S414 in parallel, or may execute the processes in the reverse order of the above.
  • the image pickup control device 150 may be configured so that, upon receiving a command from the user P via the image pickup input device 160 or the like, the control that changes the position of the reference point Pf of the image captured by the image pickup device 110 so as to follow the amount of variation in the position and posture of the head H is stopped or released.
  • the image pickup control device 150 is configured to display one image using all of the plurality of displays 141 together, but the present invention is not limited to this.
  • the image pickup control device 150 may be configured to display the image captured by the image pickup device 110 on a part of the plurality of displays 141.
  • the image pickup control device 150 may be configured to select a display 141 for displaying an image so as to follow the fluctuation amount of the position and posture of the head H.
  • the plurality of displays 141 may be movable in the same manner as in the modified example 1.
  • the plurality of displays 141 may have display surfaces 141a that are curved, bent, or both curved and bent, as in the second modification.
  • in the processing of the third modification, the image pickup control device 150 may perform the processing of the first modification, the processing of the second modification, or both, thereby changing the display position and orientation of the image so as to follow the movement of the head H of the user P.
  • Modification example 4 of the exemplary embodiment differs from the exemplary embodiment in that the imaging system 100 includes a plurality of imaging devices 110 arranged at different positions and orientations.
  • the image pickup control device 150 changes the position and orientation of the image pickup device 110 according to the movement of the head H of the user P by switching the image pickup device 110 to be displayed on the display device 140.
  • the fourth modification will be described mainly with respect to the points that differ from the exemplary embodiment and the first to third modifications, and description of the points in common with them will be omitted as appropriate.
  • FIG. 10 is a perspective view showing an example of the configuration of the image pickup apparatus 110 according to the modified example 4 of the exemplary embodiment.
  • the plurality of image pickup devices 110 are arranged so as to surround at least a part of the periphery of the object W to be imaged.
  • the object W is a work target of the robot 200.
  • the plurality of image pickup devices 110 are arranged along a cylindrical surface having a vertical axis and separated from each other. The axis passes through the object W or the vicinity of the object W.
  • the plurality of image pickup devices 110 are arranged at the same position in the vertical direction. Each image pickup device 110 is directed toward the object W and is fixed to a stationary object such as a ceiling via a support.
  • the arrangement of the plurality of image pickup devices 110 is not limited to the above.
  • the plurality of image pickup devices 110 may be arranged so as to surround the object W, the robot arm 210, or both the object W and the robot arm 210.
  • the plurality of image pickup devices 110 may be arranged at different positions in the vertical direction.
  • the plurality of image pickup devices 110 may be arranged along a cylindrical surface having a horizontal axis, a cylindrical surface having a vertical axis, a spherical surface, an ellipsoidal surface, or a combination of two or more of these.
  • the plurality of image pickup devices 110 may be arranged along two or more horizontal circumferences at different vertical positions, along two or more vertical circumferences at different horizontal positions or orientations, or along a combination of horizontal and vertical circumferences.
  • the position and posture of each image pickup device 110 are stored in advance in the storage unit 154 of the image pickup control device 150 as parameters of the image pickup device 110.
  • the posture of the image pickup device 110 is the orientation angle of the center of its optical axis.
  • when the first movement control unit 1512 of the drive control device 151 of the image pickup control device 150 receives information on the position and posture of the head H of the user P, it uses the first relationship to determine the target position and target posture of the image pickup device 110 for following the movement of the head H.
  • the first movement control unit 1512 uses the parameters of the image pickup devices 110 stored in the storage unit 154 to select, from among the plurality of image pickup devices 110, the image pickup device 110 whose position and posture are closest to the target position and target posture.
  • the first movement control unit 1512 determines the zoom-up rate or zoom-back rate to be executed by the selected image pickup device 110 in order to compensate for the difference between the position and posture of that image pickup device 110 and the target position and target posture. For example, when the position of the image pickup device 110 lies in front of the target position along its pointing direction, the first movement control unit 1512 determines a zoom-back rate for causing the image pickup device 110 to perform zoom-back imaging. When the position of the image pickup device 110 lies behind the target position along the pointing direction, the first movement control unit 1512 determines a zoom-up rate for causing the image pickup device 110 to perform zoom-up imaging.
  • the first movement control unit 1512 outputs a command to the determined image pickup device 110 to execute the image pickup at the determined zoom-up rate or zoom-back rate to the image pickup control unit 1511.
  • the image pickup control unit 1511 causes the image pickup device 110 to take an image according to a command.
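A minimal 2-D sketch of this selection and zoom compensation, assuming each camera is stored as a ((x, y) position, yaw) pair and using an illustrative linear zoom law; the scoring weight and the zoom coefficient are assumptions, since the disclosure does not specify the exact relationship between the pose mismatch and the zoom rate.

```python
import math

def select_camera(cameras, target_pos, target_yaw_deg, angle_weight=0.01):
    """Pick the stored camera closest to the target pose, then derive a
    zoom rate compensating the offset along the camera's optical axis."""
    def score(cam):
        pos, yaw = cam
        return math.dist(pos, target_pos) + angle_weight * abs(yaw - target_yaw_deg)
    best = min(range(len(cameras)), key=lambda i: score(cameras[i]))
    pos, yaw = cameras[best]
    # Signed offset of the target along the pointing direction
    # (yaw 0 looks along +x in this 2-D sketch).
    axis = (math.cos(math.radians(yaw)), math.sin(math.radians(yaw)))
    offset = sum(a * (t - p) for a, t, p in zip(axis, target_pos, pos))
    # Camera ahead of the target -> rate < 1 (zoom back);
    # camera behind the target -> rate > 1 (zoom up).
    return best, 1.0 + 0.1 * offset

# The camera at x=5 overshoots the target at x=4.5, so it zooms back.
best, zoom = select_camera([((0, 0), 0), ((5, 0), 0)], (4.5, 0), 0)
```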
  • the first movement control unit 1512 is an example of a variable device.
  • as described above, the image pickup control device 150 selects the image pickup device 110 that is to perform imaging according to the movement of the head H of the user P and causes it to capture an image, so that an image whose imaging position and imaging direction vary to follow the movement of the head H can be displayed on the display device 140.
  • the configuration of the modified example 4 may be applied to the modified examples 1 to 3.
  • the present disclosure is not limited to the above exemplary embodiments and modifications. That is, various modifications and improvements are possible within the scope of the present disclosure.
  • the present disclosure also includes forms in which various changes are applied to the exemplary embodiment and the modifications, and forms constructed by combining components of different exemplary embodiments and modifications.
  • the motion detection device 130 has an infrared sensor 131 and an infrared marker 132 as sensors for detecting the position and posture of the head H from a position away from the head H of the user P.
  • the present invention is not limited to this, and any configuration that can detect the movement of the head H may be provided.
  • the motion detection device 130 may include an acceleration sensor and an angular velocity sensor mounted on the head H, and may detect the acceleration and the angular velocity of the head H in the 6-axis direction.
  • the image pickup control device 150 may be configured to receive the detection result from the acceleration sensor and the angular velocity sensor via wired communication or wireless communication. The image pickup control device 150 may detect the position and posture of the head H using the detection results of the acceleration and the angular velocity.
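A naive sketch of how such acceleration and angular-velocity detection results could be turned into a head pose by Euler integration; one axis each is shown for brevity, and a real implementation would integrate all six axes and correct for sensor drift.

```python
def integrate_imu(samples, dt):
    """Dead-reckon head yaw and position from (angular velocity, acceleration)
    samples on one axis each, by simple Euler integration."""
    yaw = 0.0   # degrees
    vel = 0.0   # m/s along one axis
    pos = 0.0   # m
    for (omega_deg_s, accel_m_s2) in samples:
        yaw += omega_deg_s * dt            # integrate angular velocity
        vel += accel_m_s2 * dt             # integrate acceleration once
        pos += vel * dt                    # and the velocity once more
    return yaw, pos

# One second of constant 10 deg/s rotation and 1 m/s^2 acceleration.
yaw, pos = integrate_imu([(10.0, 1.0)] * 10, 0.1)
```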
  • the motion detection device 130 may include a three-dimensional camera arranged at a position away from the head H and capture a three-dimensional image of the head H.
  • the pixel value of each pixel of the three-dimensional image indicates the distance value to the subject projected on the pixel.
  • the image pickup control device 150 may detect the image of the head H projected in the three-dimensional image and the posture of the head H by image processing such as a pattern matching method using a template of the head H, and may detect the position of the head H from the pixel values of the pixels of the three-dimensional image.
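Such pattern matching on a depth image can be sketched as a sliding-window sum-of-squared-differences search; the choice of SSD and the toy arrays below are assumptions for illustration, not the disclosed algorithm.

```python
def find_head(depth_image, template):
    """Slide the head template over a 2-D depth image and return the
    best-match offset (minimum SSD) plus the depth value at its center."""
    th, tw = len(template), len(template[0])
    ih, iw = len(depth_image), len(depth_image[0])
    best = None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum((depth_image[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if best is None or ssd < best[0]:
                best = (ssd, r, c)
    _, r, c = best
    # The matched pixel values are distances, giving the head's position.
    distance = depth_image[r + th // 2][c + tw // 2]
    return (r, c), distance

# Toy 4x4 depth image: background at 9 m, a 2x2 "head" patch at 1 m.
img = [[9] * 4 for _ in range(4)]
img[1][2] = img[1][3] = img[2][2] = img[2][3] = 1
loc, dist = find_head(img, [[1, 1], [1, 1]])
```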
  • the motion detection device 130 may include a plurality of three-dimensional cameras arranged at different positions and orientations from each other.
  • the image pickup control device 150 may generate a three-dimensional model of the head H by processing a three-dimensional image of each three-dimensional camera.
  • the image pickup control device 150 may detect the position and posture of the head H using a three-dimensional model of the head H.
  • the motion detection device 130 may include a magnetic field generator and a magnetic sensor mounted on the head H, and detect the position and orientation of the magnetic sensor.
  • the image pickup control device 150 may be configured to receive the detection result from the magnetic sensor via wired communication or wireless communication. The image pickup control device 150 may detect the position and posture of the head H by using the detection result of the position and posture of the magnetic sensor.
  • an image pickup system according to one aspect of the present disclosure includes an image pickup device; a detection device that detects the movement of a user's head; a variable device that changes the position and orientation of the image pickup device so as to follow the movement of the head detected by the detection device; and a display device that displays the image captured by the image pickup device to the user and changes at least one of the position at which the image is displayed and the direction in which the image is displayed according to the movement of the head.
  • according to the above aspect, the image pickup system can change the position and orientation of the image pickup device according to the movement of the head. Furthermore, on the display device, the image pickup system can change the position at which the image captured by the image pickup device is displayed, the direction in which it is displayed, or both, so as to follow the movement of the head. The user can therefore easily and reliably view an image captured at a position and orientation that follows the movement of the head while the image is displayed at a position and orientation corresponding to that movement. The image pickup system can thus make both the image pickup device and the display surface of the captured image follow the movement of the head.
  • the variable device may be equipped with the image pickup device, and may change the position and orientation of the image pickup device by moving the image pickup device.
  • the image pickup device can be moved to a position and a direction that follows the movement of the head. Therefore, the image pickup apparatus can capture an image from a position and orientation that faithfully follows the movement of the head. Further, the image pickup apparatus can capture an image that continuously changes according to the movement of the head.
  • the variable device may be equipped with the image pickup device, and may change the position and orientation of the image pickup device by moving the variable device itself.
  • since the variable device itself moves, the variable device can increase the range over which the position and orientation of the image pickup device can vary.
  • the user can have the image pickup device capture an image from a wide range of positions and orientations by the movement of the head and visually recognize the image.
  • the image pickup system may include a plurality of the image pickup devices arranged at different positions and orientations, and the variable device may change the position and orientation of the image pickup device by switching the image pickup device whose image is displayed on the display device.
  • since the variable device switches the image pickup device whose captured image is displayed, the image pickup device after the switch can capture the image from a position and direction that follow the movement of the head and display it on the display device. Because the position and orientation of the image pickup device that captures the image can be changed by switching, images can be captured from positions and orientations that follow even quick movements of the head.
  • the detection device may include a sensor that detects the position and posture of the head from a position away from the head.
  • the detection device can detect the position and posture of the head without contacting the head. Therefore, the user can move the head without being restricted by the detection device.
  • the detection device may include at least one infrared sensor, at least one infrared marker, and a processing device, and the processing device may detect the position and orientation of the head by processing the result of the at least one infrared sensor detecting infrared light from the at least one infrared marker.
  • the detection device can detect the position and posture of the head with high accuracy by a simple process by using an infrared sensor and an infrared marker.
  • the user can move his head without being restricted by the detector.
  • the display device may be a head-mounted display attached to the head, and may change the position and orientation at which the image is displayed by moving together with the head so as to follow its movement.
  • the display surface of the head-mounted display moves together with the head, and the position and orientation of the display surface follow the movement of the head. Therefore, the configuration for making the position and orientation of the display surface follow the movement of the head becomes simple.
  • the display device may be arranged so as to surround at least part of the periphery of the user in the vertical direction, and the image pickup system may change the position and orientation at which the image is displayed by moving the reference point of the image captured by the image pickup device in the vertical direction on the display surface of the display device so as to follow the vertical movement of the head. According to the above aspect, the position of the reference point and the direction in which the reference point lies in the image can correspond to the position and orientation of the head, so the user can view the image easily and reliably.
  • the display device may be arranged so as to surround at least part of the periphery of the user in the horizontal direction, and the image pickup system may change the position and orientation at which the image is displayed by moving the reference point of the image captured by the image pickup device in the horizontal direction on the display surface of the display device so as to follow the left-right movement of the head. According to the above aspect, the position of the reference point and the direction in which the reference point lies in the image can correspond to the position and orientation of the head, so the user can view the image easily and reliably.
  • the display device may have a plurality of display surfaces arranged in different orientations, and may change the position and orientation at which the image is displayed by moving the reference point of the image captured by the image pickup device across the plurality of display surfaces so as to follow the movement of the head.
  • the plurality of display surfaces of the display device can surround at least a part of the periphery of the user.
  • the position of the reference point of the image captured by the image pickup apparatus and the direction in which the reference point exists can correspond to the position and orientation of the head.
  • each of the plurality of display surfaces may be, for example, a flat surface.
  • the plurality of display surfaces may be display surfaces of a plurality of flat displays. This makes it possible to reduce the cost of the display device.
  • the display device may have a display surface that includes at least one of curving and bending so as to surround at least part of the periphery of the user.
  • the display device can have a display surface extending along the periphery so as to surround at least a part of the periphery of the user. This allows the display device to present the user with a continuous image.
  • the display surface of a single display device may be configured to surround at least part of the periphery of the user.
  • the display device may include a drive device that moves at least one of the position and the orientation of the display surface of the display device in accordance with the movement of the head.
  • the imaging system can move the position of the display surface of the display device, the orientation of the display surface, or both the position and orientation of the display surface according to the movement of the head.
  • the imaging system can maintain the position and orientation of the display surface with respect to the head even when the head moves. Therefore, the user can easily visually recognize the display surface.
  • a robot system according to one aspect of the present disclosure includes the image pickup system according to one aspect of the present disclosure and a robot that performs work on an object, and the image pickup device is placed at a position from which it can image at least one of the object and the robot. According to the above aspect, the same effects as those of the image pickup system according to one aspect of the present disclosure can be obtained.
  • the numbers used above, such as ordinal numbers and quantities, are all examples for concretely explaining the technology of the present disclosure, and the present disclosure is not limited to the illustrated numbers.
  • the connection relationship between the components is exemplified for concretely explaining the technique of the present disclosure, and the connection relationship for realizing the function of the present disclosure is not limited to this.
  • Robot system; 100 Imaging system; 110 Imaging device; 120 Mobile device (variable device); 130 Motion detection device (detection device); 131 Infrared sensor; 132 Infrared marker; 140, 140A, 140B, 140C Display device; 141a, 143a Display surface; 142 Display drive device (drive device); 1512 First movement control unit (variable device); 1522 Detection processing unit (detection device); 1531 Display control unit; 1532 Second movement control unit; 1533 Image processing unit; H Head; P User; W Object

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to an imaging system (100) comprising: an imaging device (110); a detection device (130) that detects the movement of a user's head; a variable device (120) that changes the position and orientation of the imaging device (110) so as to follow the movement of the head detected by the detection device (130); and a display device (140) that displays an image captured by the imaging device (110) to the user and varies the position at which the image is displayed and/or the orientation in which the image is displayed so as to follow the movement of the head.
PCT/JP2021/022669 2020-06-19 2021-06-15 Système d'imagerie et système de robot WO2021256463A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022531836A JP7478236B2 (ja) 2020-06-19 2021-06-15 撮像システム及びロボットシステム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-106026 2020-06-19
JP2020106026 2020-06-19

Publications (1)

Publication Number Publication Date
WO2021256463A1 true WO2021256463A1 (fr) 2021-12-23

Family

ID=79267954

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/022669 WO2021256463A1 (fr) 2020-06-19 2021-06-15 Système d'imagerie et système de robot

Country Status (2)

Country Link
JP (1) JP7478236B2 (fr)
WO (1) WO2021256463A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000176675A (ja) * 1998-12-17 2000-06-27 Kawasaki Heavy Ind Ltd Hmd付き溶接部モニタリング装置
WO2016189924A1 (fr) * 2015-05-28 2016-12-01 株式会社日立製作所 Dispositif et programme de fonctionnement de robot
WO2017033355A1 (fr) * 2015-08-25 2017-03-02 川崎重工業株式会社 Système manipulateur
JP2018202032A (ja) * 2017-06-08 2018-12-27 株式会社メディカロイド 医療器具の遠隔操作装置
JP2019179226A (ja) * 2018-03-30 2019-10-17 株式会社小松製作所 表示装置及び遠隔操作システム
JP2019202354A (ja) * 2018-05-21 2019-11-28 Telexistence株式会社 ロボット制御装置、ロボット制御方法及びロボット制御プログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5877463B2 (ja) 2011-12-07 2016-03-08 矢崎総業株式会社 シールドシェル

Also Published As

Publication number Publication date
JP7478236B2 (ja) 2024-05-02
JPWO2021256463A1 (fr) 2021-12-23

Similar Documents

Publication Publication Date Title
US9104981B2 (en) Robot teaching system and method using imaging based on training position
US20130338525A1 (en) Mobile Human Interface Robot
CN111614919B (zh) 影像记录装置以及头戴式显示器
SE504846C2 (sv) Styrutrustning med ett rörligt styrorgan
JP2006513504A (ja) プロジェクタによる位置および向きの読み取り
CN108536142B (zh) 基于数字光栅投影的工业机器人防撞预警***及方法
WO2017122270A1 (fr) Dispositif d'affichage d'image
US20230256606A1 (en) Robot System with Object Detecting Sensors
CN112008692A (zh) 示教方法
JP2009285737A (ja) 入力インタフェース
JP2001148025A (ja) 位置検出装置及びその方法、平面姿勢検出装置及びその方法
WO2021256463A1 (fr) Système d'imagerie et système de robot
EP3147752B1 (fr) Agencement pour fournir une interface utilisateur
WO2021256464A1 (fr) Système de capture d'image et système de robot
JPH05318361A (ja) 物体操作方式
WO2017086771A1 (fr) Système de surveillance visuelle avec une capacité de suivi ou de localisation de cible
WO2021073733A1 (fr) Procédé de commande d'un dispositif par un être humain
JP7224559B2 (ja) 遠隔制御マニピュレータシステムおよび遠隔制御支援システム
Yu et al. Efficiency and learnability comparison of the gesture-based and the mouse-based telerobotic systems
US20230190403A1 (en) System for monitoring a surgical luminaire assembly
JP6005496B2 (ja) 遠隔監視装置および遠隔監視方法
US20230214004A1 (en) Information processing apparatus, information processing method, and information processing program
KR20190091870A (ko) 모션센서와 vr을 활용한 로봇 제어 시스템
US20200057501A1 (en) System, devices, and methods for remote projection of haptic effects
JPH0430981A (ja) 遠隔操縦型ロボットのテレビカメラ制御装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21824947

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022531836

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21824947

Country of ref document: EP

Kind code of ref document: A1