WO2023047804A1 - Imaging device, imaging system, imaging method, and program - Google Patents
- Publication number
- WO2023047804A1 (PCT/JP2022/029384)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target area
- imaging device
- imaging
- area
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/40—Systems for automatic generation of focusing signals using time delay of the reflected waves, e.g. of ultrasonic waves
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
Definitions
- the present invention relates to an imaging device, an imaging system, an imaging method, and a program.
- An autofocus imaging device that automatically sets the focal position is known.
- Japanese Patent Application Laid-Open No. 2002-200002 describes that a predetermined position designated by a user is focused.
- the present embodiment aims to provide an imaging device, an imaging method, and a program capable of appropriately adjusting the focus.
- An imaging device according to one aspect of the present embodiment is an imaging device capable of imaging an object, and includes an imaging element; an object information acquisition unit that acquires position information of an object existing in an imaging area of the imaging element; a target area acquisition unit that sets a target area based on the position information of a reference object acquired by the object information acquisition unit; and a focal position control unit that, if an object other than the reference object exists in the target area, controls the focal position of the imaging device so as to focus on that object.
- An imaging method according to one aspect of the present embodiment is an imaging method for imaging an object, and includes a step of acquiring position information of an object existing in an imaging area; a step of setting a target area based on the position information of a reference object acquired in the acquiring step; and a step of, if an object other than the reference object exists within the target area, controlling the focal position of an imaging device so as to focus on that object.
- A program according to one aspect of the present embodiment is a program that causes a computer to execute an imaging method for imaging an object, and includes a step of acquiring position information of an object existing in an imaging area; a step of setting a target area based on the position information of a reference object acquired in the acquiring step; and a step of, if an object other than the reference object exists within the target area, controlling the focal position of an imaging device so as to focus on that object.
- the focus can be adjusted appropriately.
- FIG. 1 is a schematic block diagram of an imaging device according to the first embodiment.
- FIG. 2 is a schematic diagram for explaining an example of the target area.
- FIG. 3 is a schematic diagram for explaining an example of the target area.
- FIG. 4 is a schematic diagram showing an example when a plurality of reference objects are set.
- FIG. 5 is a flowchart for explaining the processing flow for setting the focal position.
- FIG. 6 is a schematic diagram for explaining an example of a target area according to the second embodiment.
- FIG. 7 is a flowchart for explaining the processing flow for setting the focal position.
- FIG. 8 is a schematic diagram illustrating an example in which motion of an object is set as a predetermined condition.
- FIG. 9 is a schematic block diagram of an imaging device according to the fourth embodiment.
- FIG. 10 is a schematic diagram for explaining an example of the target area.
- FIG. 11 is a schematic diagram for explaining an example of the target area.
- FIG. 12 is a flowchart for explaining the processing flow for setting the focal position.
- FIG. 13 is a schematic diagram for explaining an example of a target area according to the fifth embodiment.
- FIG. 14 is a schematic diagram showing an example when a plurality of reference objects are set.
- FIG. 15 is a flowchart for explaining the processing flow for setting the focal position.
- FIG. 16 is a schematic diagram for explaining an example in which motion of an object is set as a predetermined condition.
- FIG. 17 is a schematic block diagram of an imaging device according to the seventh embodiment.
- FIG. 18 is a schematic diagram for explaining an example of the target area.
- FIG. 19 is a schematic diagram for explaining an example of the target area.
- FIG. 20 is a schematic diagram showing another example of the target area.
- FIG. 21 is a schematic diagram showing another example of the target area.
- FIG. 22 is a schematic diagram showing an example of the target area when the first mode is set.
- FIG. 23 is a schematic diagram showing an example of the target area when the second mode is set.
- FIG. 24 is a flowchart for explaining the target area setting flow when the imaging device moves.
- FIG. 25 is a flowchart for explaining the processing flow for setting the focal position.
- FIG. 26 is a schematic block diagram of an imaging device according to the eighth embodiment.
- FIG. 27 is a schematic diagram showing an example of a target area in the eighth embodiment.
- FIG. 28 is a flowchart for explaining the alarm notification flow.
- FIG. 29 is a schematic diagram illustrating an example in which motion of an object is set as a predetermined condition.
- FIG. 30 is a schematic block diagram of an imaging device according to the tenth embodiment.
- FIG. 31 is a schematic diagram for explaining an example of the target area.
- FIG. 32 is a schematic diagram for explaining an example of the target area.
- FIG. 33 is a flowchart for explaining the setting flow of the target area.
- FIG. 34 is a schematic diagram for explaining setting of the focal position.
- FIG. 35 is a flowchart for explaining the processing flow for setting the focal position.
- FIG. 36 is a schematic diagram showing an example of focus position setting in the eleventh embodiment.
- FIG. 37 is a flowchart for explaining the processing flow for setting the focus position in the eleventh embodiment.
- FIG. 38 is a schematic diagram showing an example of focus position setting in the twelfth embodiment.
- FIG. 39 is a schematic diagram showing an example of focus position setting in another example of the twelfth embodiment.
- FIG. 40 is a flowchart for explaining the processing flow for setting the focus position in the twelfth embodiment.
- FIG. 41 is a schematic diagram illustrating an example in which motion of an object is set as a predetermined condition.
- FIG. 1 is a schematic block diagram of an imaging device according to the first embodiment.
- An imaging device 100 according to the first embodiment is an imaging device that images an object within an imaging range.
- the imaging device 100 is an autofocus camera capable of automatically setting a focal position.
- the imaging device 100 may be a video camera that captures a moving image by capturing each predetermined frame, or may be a camera that captures a still image.
- the imaging device 100 may be used for any purpose, and may be used, for example, as a monitoring camera set at a predetermined position inside a facility or outdoors.
- As shown in FIG. 1, the imaging device 100 includes an optical element 10, an imaging element 12, an image processing circuit 13, an object position measuring unit 14, an input unit 16, a display unit 18, a communication unit 20, a storage unit 22, and a control unit 24.
- the optical element 10 is an optical system element such as a lens.
- the number of optical elements 10 may be one or plural.
- the imaging element 12 is a device that converts light incident through the optical element 10 into an image signal, which is an electrical signal.
- the imaging element 12 is, for example, a CCD (Charge Coupled Device) sensor, a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like.
- the image processing circuit 13 generates image data for each frame from the image signal generated by the imaging device 12 .
- the image data is, for example, data including luminance and color information of each pixel in one frame, and may be data to which a gradation is assigned to each pixel.
- the object position measuring unit 14 is a sensor that measures the position of the object to be measured with respect to the imaging device 100 (relative position of the object).
- An object here may be any object, living or inanimate; the same applies hereinafter. The object here may refer to a movable object, but is not limited to that and may also refer to an immovable object.
- the object position measurement unit 14 measures the distance from the imaging device 100 to the object as the relative position of the object.
- the object position measuring unit 14 may be any sensor capable of measuring the relative position of an object, and may be, for example, a TOF (Time Of Flight) sensor.
- A TOF sensor includes, for example, a light-emitting element (e.g., an LED (Light Emitting Diode)) that emits light and a light-receiving unit that receives light.
- The distance to the object is measured from the time of flight of the light that is emitted by the light-emitting element, reflected by the object, and returned to the light-receiving unit.
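The time-of-flight relationship described here can be sketched in a few lines. This is an illustrative calculation only, not code from the patent:

```python
# Distance from a TOF measurement: light travels to the object and back,
# so the one-way distance is half the round trip at the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance(round_trip_time_s: float) -> float:
    """Distance (in metres) to the object from the measured time of flight (in seconds)."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A round trip of about 66.7 ns corresponds to roughly 10 m.
print(round(tof_distance(66.7e-9), 2))
```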
- In addition to the distance from the imaging device 100 to the object, the object position measuring unit 14 may also measure the direction in which the object exists with respect to the imaging device 100.
- For example, the object position measuring unit 14 may measure the position (coordinates) of the object in a coordinate system having the imaging device 100 as the origin, and treat that position as the relative position of the object.
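One way to realize such a camera-centred coordinate measurement is to combine the measured distance with the measured direction. The sketch below assumes the direction is given as azimuth and elevation angles, a convention not specified in the text:

```python
import math

def relative_position(distance: float, azimuth_rad: float, elevation_rad: float):
    """Convert a measured distance and direction into (x, y, z) coordinates
    in a coordinate system with the imaging device at the origin.
    The azimuth/elevation convention here is an illustrative assumption."""
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)

# An object 5 m away, straight ahead and level with the camera:
print(relative_position(5.0, 0.0, 0.0))  # → (5.0, 0.0, 0.0)
```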
- the input unit 16 is a mechanism that receives input (operation) from the user, and may be, for example, a button, keyboard, touch panel, or the like.
- the display unit 18 is a display panel that displays images.
- the display unit 18 may be capable of displaying an image for the user to set a target area AR, which will be described later, in addition to the image captured by the imaging device 100 .
- the communication unit 20 is a communication module that communicates with an external device, and may be, for example, an antenna or a Wi-Fi (registered trademark) module.
- the imaging device 100 communicates with an external device by wireless communication, but may be wired communication, and any communication method may be used.
- the storage unit 22 is a memory that stores various types of information, such as captured image data and the computation contents and programs of the control unit 24, and includes at least one of a main storage device such as a RAM (Random Access Memory) or a ROM (Read Only Memory) and an external storage device such as an HDD (Hard Disk Drive).
- the program for the control unit 24 stored in the storage unit 22 may be stored in a recording medium readable by the imaging device 100 .
- the control unit 24 is an arithmetic device and includes an arithmetic circuit such as a CPU (Central Processing Unit).
- the control unit 24 includes a target area acquisition unit 30 , an object information acquisition unit 32 , a focus position control unit 34 , an imaging control unit 36 and an image acquisition unit 38 .
- the control unit 24 reads out and executes a program (software) from the storage unit 22 to operate the target area acquisition unit 30, the object information acquisition unit 32, the focus position control unit 34, the imaging control unit 36, and the image acquisition unit 38. Realize and execute those processes.
- the control unit 24 may execute these processes by one CPU, or may be provided with a plurality of CPUs and may execute the processes by the plurality of CPUs.
- at least part of the processing of the target region acquisition unit 30, the object information acquisition unit 32, the focus position control unit 34, the imaging control unit 36, and the image acquisition unit 38 may be realized by hardware circuits.
- the object information acquisition unit 32 acquires position information of an object existing within the imaging area AR0.
- the object information acquisition unit 32 controls the object position measurement unit 14 to cause the object position measurement unit 14 to measure the relative position of the object with respect to the imaging device 100 .
- the object information acquisition unit 32 acquires the measurement result of the relative position of the object with respect to the imaging device 100 by the object position measurement unit 14 as the position information of the object.
- the object information acquisition unit 32 sequentially acquires the position information of the object by acquiring the position information of the object at predetermined time intervals.
- the object information acquisition unit 32 can also acquire information indicating the shape of the object (for example, the 3D shape of the object) based on the positional information of the object.
- the object information acquisition unit 32 can acquire the 3D shape of the object by accumulating a plurality of pieces of position information such as TOF image information.
- the target area acquisition unit 30 acquires information on the target area AR set within the imaging area of the imaging device 100 .
- the target area AR is an area set for automatically adjusting the focal position.
- the information of the target area AR is information indicating the position of the target area AR, that is, the position information of the target area AR.
- the target area AR will be described below.
- FIGS. 2 and 3 are schematic diagrams for explaining an example of the target area.
- FIG. 2 is a view of the imaging device 100 and the target area AR viewed from above in the vertical direction.
- FIG. 3 is a view of the imaging device 100 and the target region AR viewed from the horizontal direction.
- the direction Z is defined as a vertical direction
- the direction X is defined as one horizontal direction orthogonal to the direction Z
- the direction Y is defined as a direction orthogonal to the direction Z and the direction X (horizontal direction).
- the range in which an image can be captured by the imaging device 100 is defined as an imaging area AR0.
- the imaging area AR0 refers to an area (space) within the angle of view of the imaging device 12, in other words, refers to a range captured as an image in real space.
- the target area AR is an area (space) set within the range of the imaging area AR0.
- the target area acquisition unit 30 sets the target area AR based on the position information of the reference object B.
- the reference object B is an object positioned within the imaging area AR0, which serves as a reference for setting the position of the target area AR.
- the target area acquisition unit 30 sets the target area AR based on the position information of the reference object B acquired by the object information acquisition unit 32 .
- the reference object B may be an object positioned between the first position AX1 and the second position AX2 within the imaging area AR0.
- the first position AX1 is a position at a first distance L1 from the imaging device 100, and the second position AX2 is a position at a second distance L2, shorter than the first distance L1, from the imaging device 100.
- More specifically, the first position AX1 can be said to be a virtual plane that includes the positions (coordinates) at the first distance L1 from the imaging device 100 within the imaging area AR0.
- Similarly, the second position AX2 can be said to be a virtual plane that includes the positions (coordinates) at the second distance L2 from the imaging device 100 within the imaging area AR0. That is, the reference object B can be said to be located within the area (space) AR0a that is bounded by the virtual plane at the second distance L2 from the imaging device 100 and the virtual plane at the first distance L1 from the imaging device 100 within the imaging area AR0.
- However, the first position AX1 is not limited to a virtual plane in which all of its positions (coordinates) are at the first distance L1 from the imaging device 100; it may be a virtual plane in which at least some of its positions (coordinates) are at the first distance L1 from the imaging device 100.
- Similarly, the second position AX2 may be a virtual plane in which at least some of its positions (coordinates) are at the second distance L2 from the imaging device 100.
- the reference object B is not limited to being positioned between the first position AX1 and the second position AX2, and may be positioned at any position within the imaging area AR0.
- If the range in which the object position measurement unit 14 can perform distance measurement is defined as a distance measurement area (distance measurement space), the reference object B may be an object positioned within the distance measurement area.
- In that case, the imaging area AR0 in FIGS. 2 to 4 may be treated as the distance measurement area.
- the reference object B may be an object that is stationary within the imaging area AR0, that is, an object that does not move. That is, for example, the reference object B may be an object such as a facility whose position is fixed within the imaging area AR0.
- the reference object B may be set by any method; for example, the target area acquisition unit 30 may set the reference object B automatically. In that case, the target area acquisition unit 30 may select the reference object B by any method from among the objects positioned within the imaging area AR0. Alternatively, the reference object B may be set by the user. In that case, the user inputs information for selecting the reference object B to the input unit 16, and the target area acquisition unit 30 sets the reference object B based on the information specified by the user. For example, an image of the imaging area AR0 may be displayed on the display unit 18 in real time, and the user may input information for selecting the reference object B by choosing it from among the objects shown in that image.
- the target area acquisition unit 30 sets the target area AR based on the position information of the reference object B acquired by the object information acquisition unit 32 .
- the target area acquisition unit 30 sets, as the target area AR, an area (space) of a predetermined size around the reference object B, that is, an area of a predetermined size that includes the position of the reference object B. In the examples of FIGS. 2 and 3, the target area acquisition unit 30 sets a circle (here, a sphere) having a predetermined radius centered on the position of the reference object B as the target area AR.
- In the examples of FIGS. 2 and 3, the reference object B is positioned within the area AR0a between the first position AX1 and the second position AX2, so the target area AR is also positioned within the area AR0a between the first position AX1 and the second position AX2.
- However, the target area AR is not limited to a sphere centered on the position of the reference object B; it may be any area set based on the position information of the reference object B between the first position AX1 and the second position AX2.
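A minimal sketch of the spherical target area AR of FIGS. 2 and 3 and its membership test follows; the positions and radius are illustrative values, not taken from the text:

```python
import math

def set_target_area(reference_pos, radius):
    """Target area AR: a sphere of a predetermined radius centred on
    the position of the reference object B (as in FIGS. 2 and 3)."""
    return {"center": reference_pos, "radius": radius}

def in_target_area(area, obj_pos):
    """True if an object's measured position lies inside the target area."""
    return math.dist(area["center"], obj_pos) <= area["radius"]

ar = set_target_area((0.0, 10.0, 1.5), 3.0)  # sphere around reference object B
print(in_target_area(ar, (1.0, 11.0, 1.5)))  # inside the sphere → True
print(in_target_area(ar, (6.0, 14.0, 1.5)))  # outside the sphere → False
```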
- FIG. 4 is a schematic diagram showing an example when a plurality of reference objects are set.
- a plurality of reference objects B may be set.
- the target area acquisition unit 30 sets the target area AR based on the position information of the plurality of reference objects B.
- As shown in FIG. 4, the target area acquisition unit 30 may set an area (space) surrounded by the plurality of reference objects B as the target area AR.
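One plausible implementation of an area "surrounded by" the reference objects B, treating their horizontal positions as the vertices of a polygon, is a ray-casting containment test. The polygonal interpretation is an assumption for illustration, not the patent's stated method:

```python
def in_polygon(point, vertices):
    """Ray-casting point-in-polygon test. `vertices` are the horizontal
    (x, y) positions of the reference objects B, in order around the area."""
    x, y = point
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Count edges crossed by a horizontal ray extending to the right.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

refs = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]  # four reference objects B
print(in_polygon((2.0, 2.0), refs))  # inside the surrounded area → True
print(in_polygon((5.0, 2.0), refs))  # outside → False
```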
- the focus position control section 34 sets the focus position of the imaging device 100 .
- the focal position control section 34 controls the focal position by controlling the position of the optical element 10 , that is, by moving the position of the optical element 10 .
- the focal position control unit 34 adjusts the focal position to the object existing within the target area AR.
- the object here is an object other than the reference object B, and preferably refers to a moving object in the example of this embodiment.
- the focal position control unit 34 sets the focal position to the position of the object determined to exist within the target area AR.
- the focus position control unit 34 determines whether the object exists within the target area AR based on the position information of the object acquired by the object information acquisition unit 32 .
- When the focal position control unit 34 determines that an object exists within the target area AR, it focuses on the position of the object acquired by the object information acquisition unit 32.
- the focus position control unit 34 does not adjust the focus position for an object that does not exist within the target area AR.
- the focal position control unit 34 continues to focus on the object during the period in which the focused object exists within the target area AR. That is, the focal position control unit 34 determines, based on the position information of the object acquired by the object information acquisition unit 32 at predetermined time intervals, whether the object continues to exist within the target area AR, and keeps focusing on the object while it does. On the other hand, when the focused object moves out of the target area AR, that is, when it no longer exists within the target area AR, the focal position control unit 34 removes the focal position from the object and focuses on a position other than the object.
- Note that the focal position control unit 34 need not focus on an object that has existed within the target area AR since the start of operation of the imaging device 100 (the timing when imaging becomes possible). That is, the focal position control unit 34 may adjust the focal position only for an object that has entered the target area AR after the start of operation. In other words, for an object that exists within the target area AR at a certain timing but did not exist within the target area AR at an earlier timing, the focal position control unit 34 may adjust the focal position from that timing. Put differently, when an object moves from outside the target area AR into the target area AR, the object may be recognized as an object to be focused on by the focal position control unit 34; that is, the focal position control unit 34 may focus on an object that has moved from outside the target area AR into the target area AR.
- the focal position control unit 34 may adjust the focal position to a preset set position when an object does not exist within the target area AR.
- the set position may be set arbitrarily, but is preferably set within the target area AR, such as the center position of the target area AR.
- FIG. 2 shows an example in which the object A moves to a position A0 outside the target area AR, a position A1 inside the target area AR, and a position A2 outside the target area AR in this order.
- the focus position control unit 34 does not focus on the object A at the timing when the object A is present at the position A0, and instead focuses on the set position, for example. Then, the focus position control unit 34 focuses on the object A at the timing when the object A is positioned at the position A1, that is, at the timing when the object A enters the target area AR.
- the focus position control unit 34 continues to focus on the object A while the object A is positioned within the target area AR.
- When the object A moves to the position A2 outside the target area AR, the focal position control unit 34 removes the focal position from the object A and returns the focal position to the set position. That is, the focal position control unit 34 adjusts the focal position to the object A from the timing when the object A enters the target area AR, keeps focusing on the moving object A during the period in which the object A is moving within the target area AR, and removes the focal position from the object A at the timing when the object A moves out of the target area AR.
- the focal position may also be set by the user.
- In the auto mode, the focal position is set by the focal position control unit 34 as described above.
- In the manual mode, the user inputs an operation to set the focal position to the input unit 16, and the focal position control unit 34 sets the focal position according to the user's operation.
- the imaging control unit 36 controls imaging by the imaging device 100 to capture an image.
- the imaging control unit 36 controls, for example, the imaging element 12 and causes the imaging element 12 to acquire an image signal.
- the imaging control unit 36 may cause the imaging element 12 to automatically acquire an image signal, or may acquire an image signal according to a user's operation.
- the image acquisition unit 38 acquires the image data acquired by the imaging element 12.
- the image acquisition unit 38 controls the image processing circuit 13 to cause the image processing circuit 13 to generate image data from the image signal generated by the imaging element 12, and acquires the image data.
- the image acquisition unit 38 causes the storage unit 22 to store the image data.
- FIG. 5 is a flowchart for explaining the processing flow for setting the focal position.
- the control unit 24 acquires the position information of the reference object B with the object information acquisition unit 32 (step S10), and the target area acquisition unit 30 sets the target area AR based on the position information of the reference object B (step S12).
- the control unit 24 acquires the position information of the object by the object information acquisition unit 32 (step S14).
- the order in which steps S10, S12, and S14 are performed may be arbitrary.
- the control unit 24 uses the focal position control unit 34 to determine whether the object is positioned within the target area AR based on the position information of the object (step S16). If the object is not located within the target area AR (step S16; No), the process returns to step S14 to continue acquiring the position information of the object. On the other hand, if the object is located within the target area AR (step S16; Yes), the focal position control unit 34 focuses on the object (step S18). After that, the acquisition of the position information of the object is continued, and it is determined whether the object has moved outside the target area AR (step S20).
- If the object does not move outside the target area AR (step S20; No), that is, if the object continues to exist within the target area AR, the process returns to step S18 to continue focusing on the object. If the object has moved outside the target area AR (step S20; Yes), the focal position control unit 34 defocuses the object (step S22). Thereafter, if the processing is not to be ended (step S24; No), the process returns to step S14; if the processing is to be ended (step S24; Yes), this processing ends.
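The flow of FIG. 5 (steps S10 to S24) can be sketched as follows. The callables passed in stand in for the object position measurement unit 14 and the focal position control unit 34; their names and the demo values are illustrative assumptions:

```python
import math

def focus_loop(measure_reference, set_target_area, measure_object,
               in_area, focus_on, defocus):
    ref_pos = measure_reference()      # S10: position of reference object B
    area = set_target_area(ref_pos)    # S12: set the target area AR
    focused = False
    while True:
        obj_pos = measure_object()     # S14: position of the object
        if obj_pos is None:            # S24; Yes: end of processing
            return
        if in_area(area, obj_pos):     # S16 / S20: inside the target area?
            focus_on(obj_pos)          # S18: (keep) focusing on the object
            focused = True
        elif focused:
            defocus()                  # S22: object left AR, remove focus
            focused = False

# Demo: object A passes through a spherical target area of radius 3 m.
events = []
samples = iter([(10.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0), (10.0, 0.0, 0.0)])
focus_loop(
    measure_reference=lambda: (0.0, 0.0, 0.0),
    set_target_area=lambda p: {"center": p, "radius": 3.0},
    measure_object=lambda: next(samples, None),
    in_area=lambda a, p: math.dist(a["center"], p) <= a["radius"],
    focus_on=lambda p: events.append(("focus", p)),
    defocus=lambda: events.append(("defocus",)),
)
print(events)  # focus on entry, keep focusing while inside, defocus on exit
```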
- As described above, the imaging device 100 includes the imaging element 12, the object information acquisition unit 32, the target area acquisition unit 30, and the focal position control unit 34.
- the object information acquisition unit 32 acquires position information of an object existing in the imaging area AR0 of the image sensor 12 .
- the target area acquisition unit 30 sets the target area AR based on the position information of the reference object B acquired by the object information acquisition unit 32 .
- If an object other than the reference object B exists within the target area AR, the focal position control unit 34 controls the focal position of the imaging device 100 so as to focus on that object.
- That is, the imaging device 100 sets the target area AR based on the position of the reference object B, and if an object exists within the target area AR, controls the focal position of the imaging device 100 so as to focus on that object. Therefore, for example, when there is an object of interest in monitoring or the like, it is possible to set that object as the reference object B and appropriately focus on objects in its vicinity.
- the target area acquisition unit 30 may set the area around the reference object B as the target area AR. Therefore, objects in the vicinity of the reference object B of interest can be properly focused.
- the target area acquisition unit 30 may set an area surrounded by a plurality of reference objects B as the target area AR. Therefore, it is possible to appropriately focus on objects in the vicinity of a plurality of reference objects B of interest.
- the target area acquisition unit 30 may set the target area AR based on the position information of the reference object B that is stationary within the imaging area AR0. Therefore, it is possible to appropriately focus on an object in the vicinity of the stationary reference object B.
- the target area acquisition unit 30 may set the target area AR based on the position information of the reference object B positioned between a first position AX1 at a first distance L1 from the imaging device 100 and a second position AX2 at a second distance L2, shorter than the first distance L1, from the imaging device 100. By setting an object located between the first position AX1 and the second position AX2 as the reference object B, it becomes possible to appropriately focus on objects near a reference object B located at such a position.
- the second embodiment differs from the first embodiment in that the target area AR is set based on the position information of the moving reference object B.
- the target area acquisition unit 30 sets the target area AR based on the position information of the moving reference object B.
- the object information acquisition unit 32 sequentially acquires the position information of the reference object B.
- the target area acquisition unit 30 sets the target area AR such that the target area AR also moves as the reference object B moves, that is, as the position information of the reference object B changes.
- the target area acquisition unit 30 preferably sets the target area AR such that the target area AR is also moved while the position (relative position) of the target area AR with respect to the reference object B is kept the same. That is, it can be said that the target area acquiring unit 30 sequentially updates the position of the target area AR while keeping the same position of the target area AR with respect to the reference object B as the reference object B moves.
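The sequential update described above can be sketched in Python. This is an illustrative model only, not code from the disclosure: `TargetArea`, `offset`, and `follow` are assumed names, and the target area AR is modeled as a sphere around its reference coordinates, as in the embodiments.

```python
from dataclasses import dataclass

@dataclass
class TargetArea:
    center: tuple   # reference coordinates of the target area AR (x, y, z)
    radius: float   # AR modeled as a sphere of this radius around the center
    offset: tuple   # fixed relative position of AR with respect to the reference object B

    def follow(self, ref_pos):
        """Move AR so that its position relative to the reference object B stays the same."""
        self.center = tuple(p + o for p, o in zip(ref_pos, self.offset))

# The reference object B starts at the origin and moves to (2, 1, 0);
# the target area keeps the same relative offset from B.
ar = TargetArea(center=(1.0, 0.0, 2.0), radius=1.0, offset=(1.0, 0.0, 2.0))
ar.follow((2.0, 1.0, 0.0))
```

Holding the relative position fixed in `offset` means each call to `follow` reproduces the behaviour of keeping the same position of the target area AR with respect to the reference object B as B moves.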
- the focal position control unit 34 adjusts the focal position to the object existing within the target area AR.
- the focal position control unit 34 does not focus on a stationary object. That is, even if a stationary object is located within the target area AR, the focal position control unit 34 does not treat it as an object to be focused on and does not adjust the focal position to it.
- the focus position control unit 34 adjusts the focus position to the object. Whether or not the object is moving can be determined based on the position information of the object obtained by the object information acquisition unit 32 . That is, when the position information of an object that is continuous in time series changes, it can be determined that the object is moving.
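The time-series motion determination just described can be sketched as a small helper. This is an assumption-laden illustration, not patent text: position samples are taken to be (x, y, z) tuples from the object position measurement unit, and `eps` is a hypothetical noise tolerance not specified in the disclosure.

```python
import math

def is_moving(positions, eps=1e-3):
    """Judge whether an object is moving from time-series position samples:
    if consecutive measured positions change (beyond a noise tolerance eps),
    the object is treated as moving."""
    return any(math.dist(p, q) > eps for p, q in zip(positions, positions[1:]))
```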
- FIG. 6 is a schematic diagram for explaining an example of a target area according to the second embodiment.
- FIG. 6 exemplifies a case where the reference object B moves in the order of position B1, position B2, position B3, and position B4, and the target area AR is set as an area centered on the reference object B.
- the target area AR also moves as the reference object B moves.
- the position of the target area AR at the timing when the reference object B is at position B1 is assumed to be position AR1, the position at the timing when the reference object B is at position B2 is assumed to be position AR2, the position at the timing when the reference object B is at position B3 is assumed to be position AR3, and the position at the timing when the reference object B is at position B4 is assumed to be position AR4.
- when a stationary object is positioned within the target area AR, the focal position control unit 34 focuses on the set position instead of focusing on that object.
- in the example of FIG. 6, the object Aa is stationary, and as the target area AR moves, the stationary object Aa comes to be positioned within the target area AR.
- in this case, the focal position control unit 34 adjusts the focus position to the set position without adjusting it to the object Aa. Since the set position here is set with reference to the target area AR, the set position also moves along with the movement of the target area AR.
- the set position is preferably moved while maintaining the same position (relative position) with respect to the target area AR.
- when the moving object Ab is positioned within the target area AR, the focal position control unit 34 adjusts the focal position to the object Ab, and keeps the focal position on the object Ab during the period in which the object Ab is positioned within the target area AR.
- the object Ab is positioned outside the target area AR.
- the focus position control unit 34 removes the focus position from the object Ab at the timing when the object Ab is positioned outside the target area AR, and adjusts the focus position to the set position.
- FIG. 7 is a flowchart for explaining the processing flow for setting the focal position.
- the control unit 24 acquires the position information of the reference object B with the object information acquisition unit 32 (step S30), and the target area acquisition unit 30 sets the target area AR based on the position information of the reference object B (step S32).
- the control unit 24 acquires the position information of the object by the object information acquiring unit 32 (step S34).
- the order in which steps S30, S32, and S34 are performed may be arbitrary.
- the control unit 24 uses the focus position control unit 34 to determine whether the object is positioned within the target area AR based on the position information of the object (step S36). If the object is not located within the target area AR (step S36; No), the process returns to step S30 to continue acquiring the position information of the object while acquiring the position information of the reference object and updating the target area AR. On the other hand, if the object is located within the target area AR (step S36; Yes), the focus position control unit 34 determines whether the object is moving (step S38). If the object in the target area AR is not moving (step S38; No), the focus position control unit 34 returns to step S30 and continues to acquire the position information of the object while acquiring the position information of the reference object and updating the target area AR.
- If the object in the target area AR is moving (step S38; Yes), the focus position control unit 34 adjusts the focus position to the object (step S40). After that, while acquiring the position information of the reference object and updating the target area AR, the acquisition of the position information of the object is continued, and it is determined whether the object has moved outside the target area AR (step S42). If the object does not move outside the target area AR (step S42; No), that is, if the object continues to exist within the target area AR, the process returns to step S40 to continue focusing on the object. If the object has moved outside the target area AR (step S42; Yes), the focus position control unit 34 defocuses the object (step S44). Thereafter, if the process is not to be ended (step S46; No), the process returns to step S30, and if the process is to be ended (step S46; Yes), this process is ended.
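The decision portion of the FIG. 7 flow (steps S36 to S40) can be condensed into a single illustrative function. The sensor reads of steps S30 and S34 are replaced by arguments, the two-sample motion test is a simplification, and all names are assumptions rather than terms from the disclosure:

```python
import math

def focus_target(area_center, radius, obj_track, set_position):
    """Condensed decision of FIG. 7: focus on the object only if it is inside
    the target area AR (step S36) and moving (step S38); otherwise keep the
    focus on the preset set position. obj_track holds the object's recent
    time-series positions, newest last."""
    pos = obj_track[-1]
    inside = math.dist(pos, area_center) <= radius                   # step S36
    moving = len(obj_track) >= 2 and obj_track[-1] != obj_track[-2]  # step S38
    return pos if (inside and moving) else set_position              # step S40 or fallback
```

In a real loop this function would be called once per frame, with `area_center` re-derived from the reference object B each time, mirroring the update of the target area AR in steps S30 and S32.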
- the target area acquisition unit 30 sets the target area AR based on the position information of the moving reference object B.
- the target area acquisition unit 30 sets the target area AR such that the target area AR moves as the reference object B moves. Therefore, for example, when an object of interest moves in monitoring or the like, by moving the target area AR in accordance with the object, it is possible to appropriately focus on the vicinity of the moving object.
- the third embodiment differs from the first embodiment in that the focal position is adjusted to an object that exists within the target area AR and satisfies a predetermined condition.
- the focal position is adjusted to an object that exists within the target area AR and satisfies a predetermined condition.
- descriptions of parts that are common to the first embodiment will be omitted.
- the third embodiment can also be applied to the first embodiment.
- the focal position control unit 34 adjusts the focal position to an object that exists within the target area AR and satisfies a predetermined condition.
- the focal position control unit 34 does not focus on an object that does not satisfy at least one of being present in the target area AR and satisfying a predetermined condition.
- the focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted satisfies a predetermined condition and continues to exist within the target area AR.
- the focus position control unit 34 removes the focus position from the object when at least one of the existence of the object within the target area AR and the satisfaction of the predetermined condition is no longer satisfied.
- the focal position control unit 34 may determine whether or not the predetermined condition is satisfied by any method, for example based on the position information of the object or an image of the object. The position information of the object here may refer to the measurement result of the object position measurement unit 14, and the image of the object may refer to image data of the object captured by the imaging element 12.
- the predetermined condition here may be any condition other than that the object exists within the target area AR.
- the predetermined condition may be at least one of that the object is performing a predetermined motion, that the object has a predetermined shape, and that the object faces a predetermined direction. Also, any two of these may be used as the predetermined conditions, or all of them may be used as the predetermined conditions.
- the focus position control unit 34 determines that the predetermined conditions are satisfied when all the conditions are satisfied.
- the focus position control unit 34 determines whether the object is moving in a predetermined manner based on the position information of the object that is continuously acquired in time series.
- the focal position control unit 34 adjusts the focal position with respect to an object existing within the target area AR and performing a predetermined motion.
- the focal position control unit 34 does not focus on an object that does not meet at least one of being within the target area AR and performing a predetermined movement.
- the focal position control unit 34 keeps the focal position on the object while the object on which the focal position is adjusted exists in the target area AR and continues a predetermined movement.
- the focus position control unit 34 removes the focus position from the object when the object no longer satisfies at least one of being present in the target area AR and performing a predetermined movement.
- the motion of the object here refers to the mode of movement of the object, and may refer to, for example, the direction and speed of movement of the object.
- for example, if the predetermined motion means moving vertically downward at a speed of 10 m/h or more, the focal position control unit 34 focuses on an object within the target area AR that is moving vertically downward at a speed of 10 m/h or more.
- the motion of an object is not limited to indicating the moving direction and moving speed of an object, and may refer to any mode of movement.
- motion of an object may refer to at least one of the direction and speed of movement of the object.
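A hedged sketch of such a motion condition, checking both the moving direction and a minimum speed from two consecutive position measurements. The 0.9 direction-alignment threshold and all names are illustrative choices, not values from the disclosure:

```python
import math

def satisfies_motion(p_prev, p_curr, dt, min_speed, direction=(0.0, 0.0, -1.0)):
    """Check a 'predetermined motion' of the kind described: moving along a
    given direction (vertically downward by default, with z vertical) at a
    speed of at least min_speed, estimated from two consecutive positions."""
    v = tuple((c - p) / dt for p, c in zip(p_prev, p_curr))
    speed = math.hypot(*v)
    if speed == 0.0 or speed < min_speed:
        return False
    # velocity must point mostly along the required direction (assumed unit vector)
    dot = sum(vi * di for vi, di in zip(v, direction))
    return dot / speed > 0.9
```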
- FIG. 8 is a schematic diagram illustrating an example in which the motion of an object is set as a predetermined condition.
- the predetermined condition is that the object moves downward in the vertical direction (the direction opposite to the Z direction), that is, the moving direction of the object.
- the object A moves downward in the vertical direction from position A0a through positions A1a and A2a to position A3a, and stops at position A3a.
- the position A0a is outside the target area AR
- the positions A1a, A2a, and A3a are inside the target area AR.
- the focus position control unit 34 does not focus on the object A because the object A is outside the target area AR at the timing when the object A exists at the position A0a.
- the focal position control unit 34 focuses on the object A at the timing when the object A is present at the position A1a, that is, at the timing when the object A enters the target area AR while moving downward in the vertical direction.
- the focus position control unit 34 continues to focus on the object A even at the timing when the object A is present at the position A2a, and at the timing when the object A moves to the position A3a and stops, removes the focus position from the object A and returns the focal position to the set position.
- the focus position control section 34 determines whether the object has a predetermined shape based on the image data showing the object.
- the focal position is adjusted to an object that exists within the target area AR and has a predetermined shape.
- the focal position control unit 34 does not focus on an object that does not satisfy at least one of being within the target area AR and having a predetermined shape.
- the focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted has a predetermined shape and continues to exist within the target area AR.
- the focus position control unit 34 removes the focus position from the object when the object no longer satisfies at least one of being in the target area AR and having a predetermined shape.
- the shape of the object here may be, for example, at least one of the size of the object and the outline of the object.
- the focus position control unit 34 adjusts the focus position to an object of a predetermined size or more that exists within the target area AR.
- 3D shape information acquired by the object information acquiring unit 32 may be used to acquire the shape information of the object.
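As one illustrative reading of the size condition, the object's size could be approximated from its accumulated 3D points (e.g. the TOF-based shape information mentioned above). The function and threshold below are assumptions for illustration, not from the disclosure:

```python
def meets_size_condition(points, min_extent):
    """Approximate the object's size as its largest axis-aligned extent over
    the measured 3D points, and require it to reach min_extent (same units
    as the coordinates)."""
    extents = [max(c) - min(c) for c in zip(*points)]
    return max(extents) >= min_extent
```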
- the focal position control unit 34 determines whether the object faces a predetermined direction based on the image data showing the object. A focal position is adjusted to an object that exists within the target area AR and faces in a predetermined direction. The focal position control unit 34 does not focus on an object that does not satisfy at least one of being in the target area AR and being oriented in a predetermined direction. The focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted continues to exist within the target area AR while facing the predetermined direction.
- the focus position control unit 34 removes the focus position from the object when the object no longer satisfies at least one of being within the target area AR and facing the predetermined direction.
- 3D shape information acquired by the object information acquisition unit 32 may be used to acquire information on the orientation of the object.
- the predetermined condition may be set by any method, for example, it may be set in advance.
- the focal position control unit 34 may read out information indicating the predetermined condition (for example, moving direction and moving speed) from the storage unit 22, or may acquire the information indicating the predetermined condition from another device via the communication unit 20.
- the focus position control section 34 may automatically set the predetermined condition.
- the user may set the predetermined condition. In this case, for example, the user inputs information specifying the predetermined condition (for example, moving direction and moving speed) to the input unit 16, and the focal position control unit 34 may set the predetermined condition based on the information specified by the user.
- the focal position control unit 34 may focus on an object that is present in the target area AR and is performing a predetermined movement.
- the focus position control unit 34 keeps the focus position on the object while the object continues the predetermined movement, and removes the focus position from the object when the object stops performing the predetermined movement.
- in this way, in addition to being within the target area AR, performing the predetermined motion is also a condition for focusing, so that an object that is moving in a specific manner can be tracked and the focus position can be adjusted appropriately.
- the focus position control unit 34 may focus on an object that exists in the target area AR and has a predetermined shape.
- in this way, in addition to being within the target area AR, having a predetermined shape is also a condition for adjusting the focus position, so that an object with a specific shape can be tracked and the focus position can be appropriately adjusted.
- the focal position control unit 34 may focus on an object that exists in the target area AR and faces in a predetermined direction. In this way, in addition to being within the target area AR, facing a predetermined direction is also a condition for adjusting the focus position, so that an object facing a specific direction can be tracked and the focus position can be appropriately adjusted.
- FIG. 9 is a schematic block diagram of an imaging device according to the fourth embodiment.
- An imaging device 100 according to the fourth embodiment is an imaging device that images an object within an imaging range.
- the imaging device 100 is an autofocus camera capable of automatically setting a focal position.
- the imaging device 100 may be a video camera that captures a moving image by capturing each predetermined frame, or may be a camera that captures a still image.
- the imaging device 100 may be used for any purpose, and may be used, for example, as a monitoring camera set at a predetermined position inside a facility or outdoors.
- the imaging device 100 includes an optical element 10, an imaging element 12, an image processing circuit 13, an object position measurement unit 14, an input unit 16, a display unit 18, a communication unit 20, a storage unit 22, and a control unit 24.
- the optical element 10 is an optical system element such as a lens.
- the number of optical elements 10 may be one or plural.
- the imaging element 12 is a device that converts light incident through the optical element 10 into an image signal, which is an electrical signal.
- the imaging element 12 is, for example, a CCD (Charge Coupled Device) sensor, a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like.
- the image processing circuit 13 generates image data for each frame from the image signal generated by the imaging element 12.
- the image data is, for example, data including luminance and color information of each pixel in one frame, and may be data to which a gradation is assigned to each pixel.
- the object position measuring unit 14 is a sensor that measures the position of the object to be measured with respect to the imaging device 100 (relative position of the object).
- An object here may be any object, and may be a living thing or an inanimate object, and the same shall apply hereinafter. Also, the object here may refer to a movable object, but is not limited to that and may refer to an immovable object.
- the object position measurement unit 14 measures the distance from the imaging device 100 to the object as the relative position of the object.
- the object position measuring unit 14 may be any sensor capable of measuring the relative position of an object, and may be, for example, a TOF (Time Of Flight) sensor.
- a TOF sensor is provided with, for example, a light emitting element (e.g., an LED (Light Emitting Diode)) that emits light and a light receiving unit that receives light, and measures the distance to the object from the time of flight of the light that is emitted by the light emitting element, reflected by the object, and returned to the light receiving unit.
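The underlying TOF relation is that the measured flight time covers the round trip to the object, so the one-way distance is half the flight time multiplied by the speed of light. A minimal sketch (function name assumed):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s):
    """Distance from the measured time of flight: the light travels to the
    object and back, so the one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

For instance, a round trip of about 20 ns corresponds to roughly 3 m.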
- the object position measuring unit 14 measures the distance from the imaging device 100 to the object as the relative position of the object.
- the direction in which the object exists with respect to the imaging device 100 may also be measured.
- the object position measurement unit 14 may measure the position (coordinates) of the object in a coordinate system having the imaging device 100 as the origin, and treat those coordinates as the relative position of the object.
- the input unit 16 is a mechanism that receives input (operation) from the user, and may be, for example, a button, keyboard, touch panel, or the like.
- the display unit 18 is a display panel that displays images.
- the display unit 18 may be capable of displaying an image for the user to set a target area AR, which will be described later, in addition to the image captured by the imaging device 100 .
- the communication unit 20 is a communication module that communicates with an external device, and may be, for example, an antenna or a Wi-Fi (registered trademark) module.
- the imaging device 100 communicates with an external device by wireless communication, but may be wired communication, and any communication method may be used.
- the storage unit 22 is a memory that stores various types of information such as captured image data and the calculation contents and programs of the control unit 24, and includes, for example, at least one of a main storage device such as a RAM (Random Access Memory) or a ROM (Read Only Memory) and an external storage device such as an HDD (Hard Disk Drive).
- the program for the control unit 24 stored in the storage unit 22 may be stored in a recording medium readable by the imaging device 100 .
- the control unit 24 is an arithmetic device and includes an arithmetic circuit such as a CPU (Central Processing Unit).
- the control unit 24 includes a target area acquisition unit 30 , an object information acquisition unit 32 , a focus position control unit 34 , an imaging control unit 36 and an image acquisition unit 38 .
- the control unit 24 reads out and executes a program (software) from the storage unit 22 to operate the target area acquisition unit 30, the object information acquisition unit 32, the focus position control unit 34, the imaging control unit 36, and the image acquisition unit 38. Realize and execute those processes.
- the control unit 24 may execute these processes by one CPU, or may be provided with a plurality of CPUs and may execute the processes by the plurality of CPUs.
- at least part of the processing of the target region acquisition unit 30, the object information acquisition unit 32, the focus position control unit 34, the imaging control unit 36, and the image acquisition unit 38 may be realized by hardware circuits.
- the target area acquisition unit 30 acquires information on the target area AR set within the imaging area of the imaging device 100 .
- the target area AR is an area set for automatically adjusting the focal position.
- the information of the target area AR refers to the shape information and movement information of the target area AR, which will be described later in detail.
- the target area AR will be described below.
- FIGS. 10 and 11 are schematic diagrams for explaining an example of the target area.
- FIG. 10 is a view of the imaging device 100 and the target area AR viewed from above in the vertical direction
- FIG. 11 is a view of the imaging device 100 and the target region AR viewed from the horizontal direction.
- the direction Z is defined as a vertical direction
- the direction X is defined as one horizontal direction orthogonal to the direction Z
- the direction Y is defined as a direction orthogonal to the direction Z and the direction X (horizontal direction).
- the range in which an image can be captured by the imaging device 100 is defined as an imaging area AR0.
- the imaging area AR0 refers to an area (space) within the angle of view of the imaging device 12, in other words, refers to a range captured as an image in real space.
- the target area AR is an area (space) set within the range of the imaging area AR0.
- the target area acquisition unit 30 sets the target area AR such that the target area AR moves within the imaging area AR0.
- the target area acquisition unit 30 acquires shape information and movement information of the target area AR, and based on the shape information and movement information of the target area AR, moves the target area AR so that the target area AR moves within the imaging area AR0.
- the shape information of the target area AR is information indicating the shape and size of the target area AR
- the movement information of the target area AR is information indicating how the target area AR moves.
- the movement information of the target area AR may be, for example, the position of the reference coordinates of the target area AR for each time, or the initial position of the reference coordinates of the target area AR together with the movement direction and movement speed of the reference coordinates of the target area AR.
- the reference coordinates here are the center coordinates of the target area AR, and the target area AR is set as a circular (spherical) area having a predetermined radius around the reference coordinates.
- the target area acquisition unit 30 preferably moves the target area AR so that the shape and size of the target area AR are kept the same.
- the target area AR is preferably set to be positioned within the imaging area AR0 and within the area between the first position AX1 and the second position AX2.
- the target area AR is set so as to move within the imaging area AR0 and within the area between the first position AX1 and the second position AX2 (in other words, so as not to go out of this area).
- the first position AX1 is a position at the first distance L1 from the imaging device 100, and the second position AX2 is a position at the second distance L2, which is shorter than the first distance L1, from the imaging device 100.
- as shown in FIGS. 10 and 11, the first position AX1 can be said to be a virtual plane including positions (coordinates) at the first distance L1 from the imaging device 100 within the imaging area AR0.
- the second position AX2 can be said to be a virtual plane that includes positions (coordinates) within the imaging region AR0 that are the second distance L2 from the imaging device 100. That is, the target area AR may be set so as to move within the space surrounded by the virtual plane at the second distance L2 from the imaging device 100 and the virtual plane at the first distance L1 from the imaging device 100 in the imaging region AR0.
- first position AX1 is not limited to a virtual plane in which all positions (coordinates) included in the first position AX1 are at the first distance L1 from the imaging device 100, and may be a virtual plane in which at least a part of the positions (coordinates) are at the first distance L1 from the imaging device 100.
- second position AX2 may be a virtual plane in which at least some positions (coordinates) included in the second position AX2 are the second distance L2 from the imaging device 100 .
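Under the simplest reading, where AX1 and AX2 are modeled as surfaces of constant distance L1 and L2 from the device and the target area AR must lie entirely between them, the constraint can be sketched as follows (names assumed; the relaxation described above, in which only part of AX1 or AX2 is at the exact distance, is not modeled):

```python
import math

def area_within_band(center, radius, l1, l2, device=(0.0, 0.0, 0.0)):
    """Check that a spherical target area AR lies entirely between the surface
    at the second distance L2 and the surface at the first distance L1 from
    the imaging device: every point of the sphere must fall in that band."""
    d = math.dist(device, center)
    return (d - radius) >= l2 and (d + radius) <= l1
```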
- the size and shape of the target area AR are not limited to those described above and may be arbitrary, and may not be a circular (spherical) area having a predetermined radius centered on the reference coordinates.
- the target area AR is an area set within the imaging area AR0, but it is not limited to this.
- the target area AR may be an area set within the ranging area.
- the imaging area AR0 in FIGS. 10 and 11 may be treated as the ranging area.
- the target area acquisition unit 30 may acquire the shape information and movement information of the target area AR by any method.
- shape information and movement information of the target area AR may be set in advance.
- the target area acquisition unit 30 may read the preset shape information and movement information of the target area AR from the storage unit 22, or may acquire the shape information and movement information of the target area AR from another device via the communication unit 20.
- the target area acquisition unit 30 may automatically set the shape information and movement information of the target area AR.
- the user may set the shape information and movement information of the target area AR.
- for example, the user inputs the shape information and movement information of the target area AR (for example, the size of the target area AR and the position of the reference coordinates for each time) to the input unit 16, and the target area acquisition unit 30 may set the target area AR based on the shape information and movement information of the target area AR specified by the user.
- the object information acquisition unit 32 acquires position information of an object existing within the imaging area AR0.
- the object information acquisition unit 32 controls the object position measurement unit 14 to cause the object position measurement unit 14 to measure the relative position of the object with respect to the imaging device 100 .
- the object information acquisition unit 32 acquires the measurement result of the relative position of the object with respect to the imaging device 100 by the object position measurement unit 14 as the position information of the object.
- the object information acquisition unit 32 sequentially acquires the position information of the object by acquiring the position information of the object at predetermined time intervals.
- the object information acquisition unit 32 can also acquire information indicating the shape of the object (for example, the 3D shape of the object) based on the positional information of the object.
- the object information acquisition unit 32 can acquire the 3D shape of the object by accumulating a plurality of pieces of position information such as TOF image information.
- the focus position control section 34 sets the focus position of the imaging device 100 .
- the focal position control section 34 controls the focal position by controlling the position of the optical element 10 , that is, by moving the position of the optical element 10 .
- the focal position control unit 34 adjusts the focal position to the object existing within the target area AR.
- An object here preferably refers to a moving object.
- the focal position control unit 34 sets the focal position to the position of the object determined to exist within the target area AR.
- the focus position control unit 34 determines whether the object exists within the target area AR based on the position information of the object acquired by the object information acquisition unit 32. If the position of the object acquired by the object information acquisition unit 32 overlaps with the position of the target area AR at that timing, the focus position control unit 34 determines that the object exists within the target area AR and adjusts the focal position to the position of the object acquired by the object information acquisition unit 32. On the other hand, the focus position control unit 34 does not adjust the focus position for an object that does not exist within the target area AR.
- the focus position control unit 34 continues to focus on the object during the period when the focused object exists within the target area AR. That is, the focal position control unit 34 determines whether the object continues to exist within the target area AR based on the position information of the object obtained by the object information obtaining unit 32 at predetermined time intervals, and determines whether the object continues to exist within the target area AR. During the period in which the object continues to exist within the target area AR, the object continues to be focused. On the other hand, when the focused object moves out of the target area AR, that is, when it no longer exists within the target area AR, the focus position control unit 34 removes the focus position from the object. , focus on a position other than the object.
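The hold/release behaviour just described can be sketched as a small stateful controller: focus stays on the object while it is judged to be inside the target area AR, and returns to the preset set position once it leaves. Illustrative only; the class and method names are assumptions:

```python
class FocusController:
    """Hold/release sketch: once an object inside the target area AR is
    focused, focus stays on it while it remains inside; when it leaves,
    focus is released and falls back to the preset set position."""

    def __init__(self, set_position):
        self.set_position = set_position
        self.focused = False

    def update(self, obj_pos, in_area):
        if in_area:
            self.focused = True
            return obj_pos          # keep or place focus on the object
        self.focused = False
        return self.set_position    # release and return to the set position
```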
- the focal position control unit 34 need not focus on an object that has existed within the target area AR since the start of operation of the imaging device 100 (the timing at which imaging becomes possible). That is, the focal position control section 34 may adjust the focal position with respect to an object that has entered the target area AR after the start of operation. In other words, for an object that exists within the target area AR at a certain timing but did not exist within the target area AR at an earlier timing, the focal position control unit 34 may adjust the focal position from that timing. In other words, when an object moves from outside the target area AR into the target area AR, the object may be recognized as an object to be focused on by the focus position control section 34. That is, the focal position control unit 34 may focus on an object that has moved from outside the target area AR into the target area AR.
- the focal position control unit 34 may adjust the focal position to a preset set position when an object does not exist within the target area AR.
- the set position may be set arbitrarily, but is preferably set within the target area AR, such as the center position of the target area AR.
- the focus position control unit 34 does not focus on a stationary object, but focuses on a moving object. More specifically, when a stationary (non-moving) object comes to be positioned within the target area AR as a result of the movement of the target area AR, the focal position control unit 34 does not focus on it. That is, even if a stationary object is located within the target area AR, the focal position control unit 34 does not treat it as an object to be focused on, and does not adjust the focal position to it. On the other hand, when a moving object is positioned within the target area AR, that is, when the moving object reaches the target area AR, the focus position control unit 34 adjusts the focus position to the object. Whether or not the object is moving can be determined based on the position information of the object obtained by the object information acquisition unit 32. That is, when the position information of an object changes between time-series samples, it can be determined that the object is moving.
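The moving/stationary decision described above — an object counts as moving when its time-series position samples change — can be sketched as follows; the distance tolerance `eps` and the three-dimensional tuple representation are illustrative assumptions, not details fixed by the embodiment.

```python
from math import dist

def is_moving(positions, eps=1e-3):
    """Return True when any pair of consecutive time-series position
    samples differs by more than the tolerance `eps` (assumed value)."""
    return any(dist(p, q) > eps for p, q in zip(positions, positions[1:]))
```

With samples taken at predetermined time intervals by the object information acquisition unit 32, a stationary object yields identical samples and is therefore never treated as a focus target.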
- FIG. 10 exemplifies a case where the target area AR moves in order of position AR1, position AR2, position AR3, and position AR4.
- since there is no object within the target area AR at the position AR1, the focus position control unit 34 does not focus on an object at the timing when the target area AR is at the position AR1, but focuses on the set position.
- the stationary object Aa is positioned within the target area AR. In this case, the object Aa is stationary, and it comes to be positioned within the target area AR only because the target area AR has moved, so the focus position control unit 34 keeps the focus position at the set position without adjusting it to the object Aa.
- the set position also moves along with the movement of the target area AR.
- the set position is preferably moved while maintaining the same position (relative position) with respect to the target area AR.
- the moving object Ab is positioned within the target area AR. In this case, the moving object Ab has entered the target area AR, so the focus position control unit 34 adjusts the focus position to the object Ab and continues to focus on the object Ab while it remains within the target area AR.
- the object Ab moves out of the target area AR.
- the focus position control unit 34 removes the focus position from the object Ab at the timing when the object Ab moves out of the target area AR, and adjusts the focus position to the set position.
- the focus position control unit 34 focuses on the object Ab from the timing when the moving object Ab enters the target area AR, and during the period when the object Ab is moving within the target area AR, The focus position is moved in accordance with the moving object Ab, and the focus position is removed from the object Ab at the timing when the object Ab moves out of the target area AR.
- the focus position may be set by the user.
- the focal position is set by the focal position control section 34 as described above.
- in the manual mode, the user inputs an operation to set the focal position to the input unit 16, and the focal position control unit 34 sets the focal position according to the user's operation.
- the imaging control unit 36 controls imaging by the imaging device 100 to capture an image.
- the imaging control unit 36 controls, for example, the imaging element 12 and causes the imaging element 12 to acquire an image signal.
- the imaging control unit 36 may cause the imaging element 12 to automatically acquire an image signal, or may acquire an image signal according to a user's operation.
- the image acquisition unit 38 acquires image data acquired by the imaging device 12 .
- the image acquisition unit 38 controls the image processing circuit 13 to cause the image processing circuit 13 to generate image data from the image signal generated by the imaging element 12, and acquires the image data.
- the image acquisition unit 38 causes the storage unit 22 to store the image data.
- FIG. 12 is a flowchart for explaining the processing flow for setting the focal position.
- the control unit 24 sets the target area AR so that the target area AR moves by the target area acquiring unit 30 (step S10). Then, the control unit 24 acquires the position information of the object using the object information acquisition unit 32 (step S12). The order in which steps S10 and S12 are performed may be arbitrary.
- the control unit 24 uses the focus position control unit 34 to determine whether the object is positioned within the target area AR based on the position information of the object (step S14).
- if the object is not located within the target area AR (step S14; No), the process returns to step S10 to continue acquiring the position information of the object while updating the target area AR (that is, moving the target area AR).
- on the other hand, if the object is located within the target area AR (step S14; Yes), the focus position control unit 34 determines whether the object is moving (step S16). If the object within the target area AR has not moved (step S16; No), the focus position control unit 34 returns to step S10 without focusing on the object, updates the target area AR, and continues to acquire the position information of the object. If the object within the target area AR is moving (step S16; Yes), the focus position control unit 34 adjusts the focus position to the object (step S18).
- after that, it is determined whether the object has moved outside the target area AR (step S20). If the object has not moved outside the target area AR (step S20; No), that is, if the object continues to exist within the target area AR, the process returns to step S18 to continue focusing on the object. If the object has moved outside the target area AR (step S20; Yes), the focus position control unit 34 defocuses the object (step S22). After that, if the process is not to be ended (step S24; No), the process returns to step S10, and if the process is to be ended (step S24; Yes), this process is ended.
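The branch structure of steps S14 to S22 might be condensed into a single per-iteration decision as below; representing the target area AR as a membership predicate and the helper name `focus_step` are assumptions made for illustration only.

```python
def focus_step(target_area, obj_pos, prev_obj_pos, focused):
    """One pass of the FIG. 12 flow. `target_area` is a callable that
    tests whether a position lies inside the (already-updated) area AR.
    Returns whether the focal position should be on the object."""
    inside = obj_pos is not None and target_area(obj_pos)
    moving = prev_obj_pos is not None and obj_pos != prev_obj_pos
    if focused:
        # S20/S22: keep focus only while the object stays inside AR.
        return inside
    # S14/S16/S18: start focusing only on a moving object inside AR.
    return inside and moving
```

Called once per loop iteration after the target area has been updated (step S10) and the latest object position acquired (step S12), the returned flag drives whether the focal position tracks the object.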
- the imaging device 100 has the imaging element 12, the object information acquisition section 32, the target area acquisition section 30, and the focus position control section 34.
- the object information acquisition unit 32 acquires position information of an object existing in the imaging area AR0 of the image sensor 12 .
- the target area acquisition unit 30 sets the target area AR so as to move within the imaging area AR0.
- the focus position control unit 34 controls the focus position of the imaging device 100 so that the focus position is aligned with the object.
- the imaging apparatus 100 sets the target area AR so that it moves, and, if an object exists within the target area AR, controls the focal position of the imaging apparatus 100 so as to adjust the focal position to the object. Therefore, when an area to be noticed changes in time series, for example in monitoring, it is possible to adjust the focus appropriately according to the change.
- the focus position control unit 34 does not adjust the focus position to an object that does not move even when it is located within the target area AR, and can focus on a moving object when it is located within the target area AR. Therefore, it is possible to avoid focusing on a stationary object that has entered the target area AR as a result of the movement of the target area AR, and to appropriately adjust the focus position to a moving object that has entered the target area AR.
- the target area acquisition unit 30 may set the target area AR so as to be located between a first position AX1 at a first distance L1 from the imaging device 100 and a second position AX2 at a second distance L2 from the imaging device 100 that is shorter than the first distance L1. Therefore, it is possible to properly focus on an object that has entered the target area AR.
- the fifth embodiment differs from the fourth embodiment in that the target area AR is set based on the position information of the moving reference object B.
- FIG. 13 is a schematic diagram for explaining an example of the target area according to the fifth embodiment.
- the target area acquisition unit 30 sets the target area AR based on the position information of the reference object B that moves.
- the reference object B is an object positioned within the imaging area AR0 that serves as a reference for setting the position of the target area AR, and is a moving object.
- the reference object B may be an object positioned between the first position AX1 and the second position AX2 within the imaging area AR0.
- the reference object B is not limited to being positioned between the first position AX1 and the second position AX2, and may be positioned at any position.
- the reference object B may be set by any method; for example, the target area acquisition unit 30 may automatically set the reference object B. In this case, for example, the target area acquiring unit 30 may select the reference object B by any method from among the objects positioned within the imaging area AR0. Further, for example, the reference object B may be set by the user. In this case, for example, the user inputs information for selecting the reference object B to the input unit 16, and the target region acquisition unit 30 may set the reference object B based on the information specified by the user. In this case, for example, an image of the imaging area AR0 is displayed on the display unit 18 in real time, and the user may input the information for selecting the reference object B by selecting the reference object B from among the objects shown in the image.
- the target area acquisition unit 30 sets the target area AR based on the position information of the reference object B acquired by the object information acquisition unit 32 .
- the target area acquiring unit 30 sets a predetermined-size area (space) around the reference object B, that is, a predetermined-size area including the position of the reference object B, as the target area AR. In the example of FIG. 13, the target area acquiring unit 30 sets a circle (here, a sphere) having a predetermined radius centered on the position of the reference object B as the target area AR.
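A membership test for such a spherical target area can be written directly; the radius value and the three-dimensional coordinate representation are assumptions for illustration.

```python
from math import dist

def in_spherical_target_area(point, ref_pos, radius):
    """True when `point` lies within the sphere of the given (assumed)
    predetermined radius centred on reference object B at `ref_pos`."""
    return dist(point, ref_pos) <= radius
```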
- the reference object B is positioned within the area AR0a between the first position AX1 and the second position AX2, so the target area AR is also positioned within the area AR0a between the first position AX1 and the second position AX2.
- the target area AR is not limited to being set in a spherical shape centered on the position of the reference object B, and may be an area arbitrarily set based on the position information of the reference object B.
- the target area AR does not have to be positioned within the area AR0a between the first position AX1 and the second position AX2.
- FIG. 14 is a schematic diagram showing an example when multiple reference objects are set.
- a plurality of reference objects B may be set.
- the target area acquisition unit 30 sets the target area AR based on the position information of the plurality of reference objects B.
- in the example of FIG. 14, the target area acquisition unit 30 may set an area (space) surrounded by the plurality of reference objects B as the target area AR.
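One simple reading of "an area surrounded by a plurality of reference objects B" is the axis-aligned bounding box of their positions; this particular construction is an assumption made for illustration, since the embodiment does not fix one.

```python
def in_reference_hull(point, refs):
    """Approximate the space surrounded by several reference objects B
    with the axis-aligned bounding box of their positions (a simplifying
    assumption), and test whether `point` lies inside it."""
    lo = [min(r[i] for r in refs) for i in range(3)]
    hi = [max(r[i] for r in refs) for i in range(3)]
    return all(lo[i] <= point[i] <= hi[i] for i in range(3))
```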
- the object information acquisition unit 32 sequentially acquires the position information of the reference object B.
- the target area acquisition unit 30 sequentially sets the target area AR based on the position information of the reference object B acquired by the object information acquisition unit 32 .
- the target area acquisition unit 30 sets the target area AR such that the target area AR also moves as the reference object B moves, that is, as the positional information of the reference object B changes.
- the target area acquisition unit 30 preferably sets the target area AR such that the target area AR is also moved while the position (relative position) of the target area AR with respect to the reference object B is kept the same. That is, it can be said that the target area acquiring unit 30 sequentially updates the position of the target area AR while keeping the same position of the target area AR with respect to the reference object B as the reference object B moves.
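Keeping the relative position of the target area AR to the reference object B constant amounts to carrying a fixed offset along with B; the offset representation is an illustrative assumption (an offset of (0, 0, 0) reproduces the sphere-centred-on-B case of FIG. 13).

```python
def update_target_area_center(ref_pos, offset):
    """New centre of target area AR after reference object B moves to
    `ref_pos`, preserving the fixed relative position `offset`."""
    return tuple(r + o for r, o in zip(ref_pos, offset))
```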
- FIG. 13 exemplifies a case where the reference object B moves in the order of position B1, position B2, position B3, and position B4, and the target area AR is set as an area centered on the reference object B.
- the target area AR also moves as the reference object B moves.
- the position of the target area AR at the timing when the reference object B is at the position B1 is assumed to be the position AR1, the position at the timing when the reference object B is at the position B2 is the position AR2, the position at the timing when the reference object B is at the position B3 is the position AR3, and the position at the timing when the reference object B is at the position B4 is the position AR4.
- since there is no object within the target area AR at the position AR1, the focus position control unit 34 does not focus on an object at that timing, but focuses on the set position.
- the stationary object Aa is positioned within the target area AR. In this case, the object Aa is stationary, and as the target area AR moves, the stationary object Aa comes to be positioned within the target area AR. Therefore, the focus position control unit 34 adjusts the focus position to the set position without adjusting it to the object Aa. Since the set position here is set with reference to the target area AR, the set position also moves along with the movement of the target area AR.
- the set position is preferably moved while maintaining the same position (relative position) with respect to the target area AR.
- the moving object Ab is positioned within the target area AR.
- the focal position control unit 34 adjusts the focal position to the object Ab, and keeps the focal position on the object Ab during the period in which the object Ab is positioned within the target area AR.
- the object Ab is positioned outside the target area AR.
- the focus position control unit 34 removes the focus position from the object Ab at the timing when the object Ab is positioned outside the target area AR, and adjusts the focus position to the set position.
- FIG. 15 is a flowchart for explaining the processing flow for setting the focal position.
- the control unit 24 acquires the position information of the reference object B by the object information acquisition unit 32 (step S30), and the target area acquisition unit 30 sets the target area AR based on the position information of the reference object B (step S32).
- the control unit 24 acquires the position information of the object by the object information acquiring unit 32 (step S34).
- the order in which steps S30, S32, and S34 are performed may be arbitrary.
- the control unit 24 uses the focus position control unit 34 to determine whether the object is positioned within the target area AR based on the position information of the object (step S36). If the object is not located within the target area AR (step S36; No), the process returns to step S30 to continue acquiring the position information of the object while acquiring the position information of the reference object B and updating the target area AR. On the other hand, if the object is located within the target area AR (step S36; Yes), the focus position control unit 34 determines whether the object is moving (step S38). If the object in the target area AR has not moved (step S38; No), the focus position control unit 34 returns to step S30, acquires the position information of the reference object B, updates the target area AR, and continues to acquire the position information of the object.
- if the object within the target area AR is moving (step S38; Yes), the focus position control unit 34 adjusts the focus position to the object (step S40). After that, while acquiring the position information of the reference object B and updating the target area AR, the acquisition of the position information of the object is continued, and it is determined whether the object has moved outside the target area AR (step S42). If the object has not moved outside the target area AR (step S42; No), that is, if the object continues to exist within the target area AR, the process returns to step S40 to continue focusing on the object. If the object has moved outside the target area AR (step S42; Yes), the focus position control unit 34 defocuses the object (step S44). Thereafter, if the process is not to be ended (step S46; No), the process returns to step S30, and if the process is to be ended (step S46; Yes), this process is ended.
- the target area acquisition unit 30 sets the target area AR to move based on the position information of the moving reference object B acquired by the object information acquisition unit 32. . That is, the target area acquiring unit 30 sets the target area AR such that the target area AR moves as the reference object B moves. Therefore, for example, when an object of interest moves in monitoring or the like, by moving the target area AR in accordance with the object, it is possible to appropriately focus on the vicinity of the moving object.
- the target area acquisition unit 30 sets the target area AR to move as the reference object B moves so that the position of the target area AR with respect to the reference object B remains the same. Therefore, by appropriately moving the target area AR in accordance with the object of interest, it is possible to more appropriately focus on the vicinity of the moving object.
- the sixth embodiment differs from the fourth embodiment in that the focal position is adjusted to an object that exists within the target area AR and satisfies a predetermined condition.
- the sixth embodiment descriptions of parts that are common to the fourth embodiment will be omitted.
- the sixth embodiment is also applicable to the fifth embodiment.
- the focal position control unit 34 adjusts the focal position to an object that exists within the target area AR and satisfies a predetermined condition.
- the focal position control unit 34 does not focus on an object that does not satisfy at least one of being present in the target area AR and satisfying a predetermined condition.
- the focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted satisfies a predetermined condition and continues to exist within the target area AR.
- the focus position control unit 34 removes the focus position from the object when at least one of the existence of the object within the target area AR and the satisfaction of the predetermined condition is no longer satisfied.
- for example, even when the object continues to exist within the target area AR, the focal position control unit 34 removes the focus position from the object at the timing when it detects that the object no longer satisfies the predetermined condition.
- the focal position control unit 34 may determine whether or not the predetermined condition is satisfied by any method; for example, it may make the determination based on at least one of the position information of the object and an image of the object. The position information of the object here may refer to the measurement result of the object position measuring unit 14, and the image of the object may refer to image data of the object captured by the imaging element 12.
- the predetermined condition here may be any condition other than that the object exists within the target area AR.
- the predetermined condition may be at least one of that the object is performing a predetermined motion, that the object has a predetermined shape, and that the object faces a predetermined direction. Also, any two of these may be used as the predetermined conditions, or all of them may be used as the predetermined conditions.
- the focus position control unit 34 determines that the predetermined conditions are satisfied when all the conditions are satisfied.
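Combining several of the optional conditions under the "all must be satisfied" rule is plain predicate conjunction; the dictionary-based object representation in the sketch below is an assumption made for illustration.

```python
def satisfies_predetermined_condition(obj, conditions):
    """True only when every configured condition (motion, shape,
    orientation, ...) holds for the object, per the all-conditions rule."""
    return all(cond(obj) for cond in conditions)
```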
- the focus position control unit 34 determines whether the object is moving in a predetermined manner based on the position information of the object that is continuously acquired in time series.
- the focal position control unit 34 adjusts the focal position with respect to an object existing within the target area AR and performing a predetermined motion.
- the focal position control unit 34 does not focus on an object that does not meet at least one of being within the target area AR and performing a predetermined movement.
- the focal position control unit 34 keeps the focal position on the object while the object on which the focal position is adjusted exists in the target area AR and continues a predetermined movement.
- the focus position control unit 34 removes the focus position from the object when the object no longer satisfies at least one of being present in the target area AR and performing a predetermined movement.
- the motion of the object here refers to the mode of movement of the object, and may refer to, for example, the direction and speed of movement of the object.
- for example, when the predetermined motion is moving vertically downward at a speed of 10 m/h or more, the focal position control unit 34 focuses on an object within the target area AR that is moving vertically downward at a speed of 10 m/h or more.
- the motion of an object is not limited to indicating the moving direction and moving speed of an object, and may refer to any mode of movement.
- motion of an object may refer to at least one of the direction and speed of movement of the object.
- FIG. 16 is a schematic diagram illustrating an example in which the motion of an object is set as a predetermined condition.
- the predetermined condition is that the object moves downward in the vertical direction (the direction opposite to the Z direction), that is, the moving direction of the object.
- the object A moves vertically downward from position A0a through positions A1a and A2a to position A3a and stops at position A3a.
- the position A0a is outside the target area AR, and the positions A1a, A2a, and A3a are inside the target area AR.
- the focus position control unit 34 does not focus on the object A because the object A is outside the target area AR at the timing when the object A exists at the position A0a.
- the focal position control unit 34 focuses on the object A at the timing when the object A is present at the position A1a, that is, at the timing when the object A enters the target area AR while moving downward in the vertical direction.
- the focus position control unit 34 continues to focus on the object A at the timing when the object A is present at the position A2a, and, at the timing when the object A moves to the position A3a and stops, removes the focus position from the object A and returns the focal position to the set position.
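The example condition of FIG. 16 — moving vertically downward at 10 m/h or more — could be checked from two consecutive position samples as below; the sampling-interval parameter and the Z-up coordinate convention are assumptions, not details fixed by the embodiment.

```python
def is_descending(prev_pos, cur_pos, dt_hours, min_speed_mh=10.0):
    """True when the object moved downward (negative Z) between the two
    samples at a vertical speed of at least `min_speed_mh` metres/hour."""
    dz = cur_pos[2] - prev_pos[2]
    return dz < 0 and abs(dz) / dt_hours >= min_speed_mh
```

An object that stops, as object A does at position A3a, yields `dz == 0` on subsequent samples and so stops satisfying the condition.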
- the focus position control section 34 determines whether the object has a predetermined shape based on the image data showing the object.
- that is, the focal position control unit 34 adjusts the focal position to an object that exists within the target area AR and has a predetermined shape.
- the focal position control unit 34 does not focus on an object that does not satisfy at least one of being within the target area AR and having a predetermined shape.
- the focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted has a predetermined shape and continues to exist within the target area AR.
- the focus position control unit 34 removes the focus position from the object when the object no longer satisfies at least one of being in the target area AR and having a predetermined shape.
- the shape of the object here may be, for example, at least one of the size of the object and the outline of the object.
- the focus position control unit 34 adjusts the focus position to an object of a predetermined size or more that exists within the target area AR.
- 3D shape information acquired by the object information acquiring unit 32 may be used to acquire the shape information of the object.
- the focal position control unit 34 determines whether the object faces a predetermined direction based on the image data showing the object, and adjusts the focal position to an object that exists within the target area AR and faces the predetermined direction. The focal position control unit 34 does not focus on an object that does not satisfy at least one of being in the target area AR and being oriented in the predetermined direction, and continues to focus on the object while the object on which the focal position is adjusted continues to exist within the target area AR while facing the predetermined direction.
- when the object no longer satisfies at least one of being within the target area AR and facing the predetermined direction, the focus position control unit 34 removes the focus position from the object.
- 3D shape information acquired by the object information acquisition unit 32 may be used to acquire information on the orientation of the object.
- the predetermined condition may be set by any method, for example, it may be set in advance.
- the focal position control unit 34 may read out information indicating the predetermined condition (for example, moving direction and moving speed) from the storage unit 22, or may acquire the predetermined condition from another device via the communication unit 20.
- the focus position control section 34 may automatically set the predetermined condition.
- the user may set a predetermined condition. In this case, for example, the user inputs information specifying a predetermined condition (for example, moving direction and moving speed) to the input unit 16, and the focal position control unit 34 controls the predetermined condition based on the information specified by the user. may be set.
- the focal position control section 34 may focus on an object that is present in the target area AR and is performing a predetermined movement.
- the focal point control unit 34 keeps the focal point on the object while the object is moving, and removes the focal point from the object when the object stops moving. In this way, by making the performance of a predetermined motion, in addition to presence within the target area AR, a condition for adjusting the focal position, it becomes possible to track an object performing a specific motion and adjust the focal position appropriately.
- the focus position control unit 34 may focus on an object that exists in the target area AR and has a predetermined shape.
- in this way, by making the possession of a predetermined shape, in addition to presence within the target area AR, a condition for adjusting the focal position, it becomes possible to track an object with a specific shape and adjust the focal position appropriately.
- the focal position control unit 34 may focus on an object that exists in the target area AR and faces in a predetermined direction. In this way, in addition to being within the target area AR, by setting the focal position to be oriented in a predetermined direction as a condition for adjusting the focal position, an object in a specific orientation is tracked and the focal position is adjusted appropriately. becomes possible.
- FIG. 17 is a schematic block diagram of an imaging device according to the seventh embodiment.
- An imaging device 100 according to the seventh embodiment is an imaging device that images an object within an imaging range.
- the imaging device 100 is an autofocus camera capable of automatically setting a focal position.
- the imaging device 100 may be a video camera that captures a moving image by capturing each predetermined frame, or may be a camera that captures a still image.
- the imaging device 100 may be used for any purpose, and may be used, for example, as a monitoring camera set at a predetermined position inside a facility or outdoors.
- the imaging device 100 includes an optical element 10, an imaging element 12, an image processing circuit 13, an object position measurement unit 14, a self-position measurement unit 15, an input unit 16, a display unit 18, a communication unit 20, a storage unit 22, and a control unit 24.
- the optical element 10 is an optical system element such as a lens.
- the number of optical elements 10 may be one or plural.
- the imaging element 12 is a device that converts light incident through the optical element 10 into an image signal, which is an electrical signal.
- the imaging element 12 is, for example, a CCD (Charge Coupled Device) sensor, a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like.
- the image processing circuit 13 generates image data for each frame from the image signal generated by the imaging device 12 .
- the image data is, for example, data including luminance and color information of each pixel in one frame, and may be data to which a gradation is assigned to each pixel.
- the object position measuring unit 14 is a sensor that measures the position of the object to be measured with respect to the imaging device 100 (relative position of the object).
- An object here may be any object, and may be a living thing or an inanimate object, and the same shall apply hereinafter. Also, the object here may refer to a movable object, but is not limited to that and may refer to an immovable object.
- the object position measurement unit 14 measures the distance from the imaging device 100 to the object as the relative position of the object.
- the object position measuring unit 14 may be any sensor capable of measuring the relative position of an object, and may be, for example, a TOF (Time Of Flight) sensor.
- a TOF sensor is provided with, for example, a light emitting element (e.g., an LED (Light Emitting Diode)) that emits light and a light receiving unit that receives light; the object is irradiated with light from the light emitting element, and the distance to the object is measured from the time of flight of the light that returns to the light receiving unit.
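The time-of-flight principle reduces to one formula: the emitted light covers the camera-object distance twice, so the one-way distance is c·t/2. A minimal sketch (the function name is illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds):
    """One-way distance to the object from the measured round-trip
    time of the emitted light: d = c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```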
- the object position measuring unit 14 is not limited to measuring only the distance from the imaging device 100 to the object as the relative position of the object; the direction in which the object exists with respect to the imaging device 100 may also be measured.
- the object position measurement unit 14 may measure the position (coordinates) of the object in a coordinate system having the imaging device 100 as the origin, and treat those coordinates as the relative position of the object.
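When both a distance and a direction are measured, the relative position in a camera-origin coordinate system follows from spherical-to-Cartesian conversion; the axis convention below (X forward, Z up, angles in degrees) is an assumption, not something the embodiment specifies.

```python
from math import cos, sin, radians

def relative_position(distance, azimuth_deg, elevation_deg):
    """Coordinates of the object in a frame with the imaging device at
    the origin, from measured distance and direction (assumed convention:
    X forward, Y left, Z up, angles in degrees)."""
    az, el = radians(azimuth_deg), radians(elevation_deg)
    return (distance * cos(el) * cos(az),
            distance * cos(el) * sin(az),
            distance * sin(el))
```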
- the self-position measuring unit 15 is a sensor that measures the position of the imaging device 100 (that is, the self-position). In this embodiment, the self-localization unit 15 measures the position (coordinates) and orientation (orientation) of the imaging device 100 .
- the self-localization unit 15 may be any sensor capable of measuring the position and orientation of the imaging device 100; for example, it may be a three-dimensional acceleration sensor or a gyro sensor that measures acceleration about the three axes of the imaging device 100. For example, by measuring the acceleration about the three axes of the imaging device 100, the self-localization unit 15 can measure the position and orientation of the imaging device 100, that is, the position and orientation of the imaging device 100 before movement and the position and orientation of the imaging device 100 after movement. However, the self-localization unit 15 is not limited to measuring both the position and orientation of the imaging device 100, and may measure at least one of the position and orientation of the imaging device 100.
- the input unit 16 is a mechanism that receives input (operation) from the user, and may be, for example, a button, keyboard, touch panel, or the like.
- the display unit 18 is a display panel that displays images.
- the display unit 18 may be capable of displaying an image for the user to set a target area AR, which will be described later, in addition to the image captured by the imaging device 100 .
- the communication unit 20 is a communication module that communicates with an external device, and may be, for example, an antenna or a Wi-Fi (registered trademark) module.
- in this embodiment, the imaging device 100 communicates with an external device by wireless communication, but wired communication may also be used, and any communication method may be used.
- the storage unit 22 is a memory that stores various types of information, such as captured image data, the computation contents of the control unit 24, and programs; it includes at least one of a main storage device such as a RAM (Random Access Memory) or ROM (Read Only Memory) and an external storage device such as an HDD (Hard Disk Drive).
- the program for the control unit 24 stored in the storage unit 22 may be stored in a recording medium readable by the imaging device 100 .
- the control unit 24 is an arithmetic device and includes an arithmetic circuit such as a CPU (Central Processing Unit).
- the control unit 24 includes a self-position acquisition unit 28 , a target area acquisition unit 30 , an object information acquisition unit 32 , a focus position control unit 34 , an imaging control unit 36 and an image acquisition unit 38 .
- the control unit 24 reads and executes a program (software) from the storage unit 22 to implement the self-position acquisition unit 28, the target area acquisition unit 30, the object information acquisition unit 32, the focus position control unit 34, the imaging control unit 36, and the image acquisition unit 38, and executes their processes.
- the control unit 24 may execute these processes by one CPU, or may be provided with a plurality of CPUs and may execute the processes by the plurality of CPUs.
- at least part of the processing of the self-position acquisition unit 28, the target area acquisition unit 30, the object information acquisition unit 32, the focus position control unit 34, the imaging control unit 36, and the image acquisition unit 38 may be realized by a hardware circuit.
- the self-position acquisition unit 28 acquires position information of the imaging device 100 .
- the position information of the imaging device 100 is information indicating the position (coordinates) and orientation (orientation) of the imaging device 100 .
- the self-position acquisition unit 28 controls the self-position measurement unit 15 to cause the self-position measurement unit 15 to measure the position and orientation of the imaging device 100 .
- the self-position acquisition unit 28 acquires the measurement result of the position and orientation of the imaging device 100 by the self-position measurement unit 15 as position information of the imaging device 100 .
- the self-position acquisition unit 28 sequentially acquires the position information of the imaging device 100 by acquiring it at predetermined time intervals. Note that the position information of the imaging device 100 is not limited to both the position and orientation of the imaging device 100, and may be information indicating at least one of the position and the orientation of the imaging device 100.
- the self-position acquisition unit 28 determines whether the imaging device 100 has moved based on the position information of the imaging device 100 .
- the self-position acquisition unit 28 determines that the imaging device 100 has moved when the position information of the imaging device 100 has changed, and determines that the imaging device 100 has not moved when the position information of the imaging device 100 has not changed.
- here, the position information of the imaging device 100 is regarded as having changed when the difference between the position information of the imaging device 100 acquired immediately before (here, at least one of the position and the orientation) and the position information of the imaging device 100 acquired this time is equal to or greater than a predetermined value.
- the self-position acquisition unit 28 may acquire the degree of movement of the imaging device 100 when determining that the imaging device 100 has moved.
- the degree of movement of the imaging device 100 here refers to the direction and amount of movement of the position (coordinates) and orientation (orientation) of the imaging device 100 .
- the self-position acquisition unit 28 may calculate, as the degree of movement of the imaging device 100, the direction and amount of change from the position and orientation of the imaging device 100 acquired immediately before to the position and orientation of the imaging device 100 acquired this time.
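A minimal sketch of the movement determination and the degree of movement described above, assuming a simple Euclidean threshold (the threshold value, function names, and pose representation are illustrative assumptions, not from the embodiment):

```python
import math

# Illustrative threshold for deciding "moved"; the embodiment only says
# "a predetermined value".
MOVE_THRESHOLD = 0.05

def movement_degree(prev_pos, prev_yaw, cur_pos, cur_yaw):
    """Direction/amount of change from the previous to the current measurement."""
    delta = tuple(c - p for c, p in zip(cur_pos, prev_pos))
    return delta, cur_yaw - prev_yaw

def has_moved(prev_pos, prev_yaw, cur_pos, cur_yaw):
    """Movement is detected when the position or the orientation changes by
    at least the predetermined value."""
    delta, dyaw = movement_degree(prev_pos, prev_yaw, cur_pos, cur_yaw)
    return math.hypot(*delta) >= MOVE_THRESHOLD or abs(dyaw) >= MOVE_THRESHOLD
```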
- the target area acquisition unit 30 acquires information on the target area AR set within the imaging area of the imaging device 100 .
- the target area AR is an area set for automatically adjusting the focal position.
- the information of the target area AR is information indicating the position of the target area AR, that is, the position information of the target area AR.
- the target area AR will be described below.
- FIGS. 18 and 19 are schematic diagrams for explaining an example of the target area.
- FIG. 18 is a view of the imaging device 100 and the target area AR viewed from above in the vertical direction.
- FIG. 19 is a view of the imaging device 100 and the target region AR viewed from the horizontal direction.
- the direction Z is defined as a vertical direction
- the direction X is defined as one horizontal direction orthogonal to the direction Z
- the direction Y is defined as a direction orthogonal to the direction Z and the direction X (horizontal direction).
- the range in which an image can be captured by the imaging device 100 is defined as an imaging area AR0.
- the imaging area AR0 refers to an area (space) within the angle of view of the image sensor 12, in other words, refers to a range captured as an image in real space.
- the target area AR is an area (space) set within the range of the imaging area AR0.
- the target area AR is an area within the imaging area AR0 and between the first position AX1 and the second position AX2.
- the first position AX1 is a position that is the first distance L1 from the imaging device 100
- the second position AX2 is a position that is a second distance L2, shorter than the first distance L1, from the imaging device 100.
- more specifically, the first position AX1 can be said to be a virtual plane including positions (coordinates) within the imaging area AR0 that are the first distance L1 from the imaging device 100.
- similarly, the second position AX2 can be said to be a virtual plane including positions (coordinates) within the imaging area AR0 that are the second distance L2 from the imaging device 100. That is, the target area AR can be said to be the space in the imaging area AR0 that is surrounded by the virtual plane at the second distance L2 from the imaging device 100 and the virtual plane at the first distance L1 from the imaging device 100.
- however, the first position AX1 is not limited to a virtual plane in which all of the positions (coordinates) included in the first position AX1 are the first distance L1 from the imaging device 100; it may be a virtual plane in which at least some of the positions (coordinates) are the first distance L1 from the imaging device 100.
- the second position AX2 may be a virtual plane in which at least some positions (coordinates) included in the second position AX2 are the second distance L2 from the imaging device 100 .
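The membership test implied by the first position AX1 and the second position AX2 can be sketched as follows (a simplification that treats both positions as exact spherical distances; the function and parameter names are illustrative assumptions):

```python
import math

def in_target_area(obj_pos, device_pos, l1, l2):
    """True if the object's distance from the imaging device lies between
    the second distance L2 (near bound) and the first distance L1 (far bound)."""
    d = math.dist(obj_pos, device_pos)
    return l2 <= d <= l1
```

For the variant of FIGS. 20 and 21, a further check of the radial distance from the optical axis against the third distance L3 would be added.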
- FIGS. 20 and 21 are schematic diagrams showing other examples of the target area.
- the target area AR is separated from the imaging area AR0 by the first position AX1 and the second position AX2 in the optical axis direction of the imaging device 100 (the depth direction of the image).
- on the other hand, in the radial direction with respect to the optical axis of the imaging device 100 (the direction in which the angle of view spreads), the target area AR is not partitioned off from the imaging area AR0.
- the end surface of the target area AR in the widening direction of the angle of view matches the end surface of the imaging area AR0 in the widening direction of the angle of view.
- the target area AR may be separated from the imaging area AR0 also in the widening direction of the angle of view.
- the target area AR may be separated from the imaging area AR0 by the third position AX3 also in the widening direction of the angle of view.
- the third position AX3 is a virtual surface (here, a closed curved surface of a cylindrical side surface) including a position (coordinates) at a predetermined distance radially outward from the optical axis LX of the imaging device 100.
- the target area AR is an area (space) surrounded by the first position AX1, the second position AX2, and the third position AX3.
- however, the third position AX3 is not limited to a virtual surface in which all of the positions (coordinates) included in the third position AX3 are the third distance L3 from the optical axis LX; it may be a virtual surface in which at least some of the positions (coordinates) are the third distance L3 from the optical axis LX.
- the third position AX3 may be a virtual plane that spreads outward in the radial direction (horizontal direction and elevation angle direction) at a predetermined angle as it moves away from the imaging device 100 along the optical axis direction.
- the size and shape of the target area AR are not limited to those described above and may be arbitrary.
- the position of the target area AR is not limited to the above description and may be arbitrary.
- the target area AR is not limited to being positioned between the first position AX1 and the second position AX2.
- the target area AR is an area set within the imaging area AR0, but it is not limited to this.
- the target area AR may be an area set within the distance measurement area.
- the imaging area AR0 in FIGS. 18 to 21 may be treated as the ranging area.
- the target area acquisition unit 30 may acquire information on the target area AR by any method.
- the position of the target area AR may be set in advance.
- the target area acquisition unit 30 may read the preset position information of the target area AR from the storage unit 22, or may acquire the position information of the target area AR from another device via the communication unit 20.
- the target area acquiring unit 30 may automatically set the position of the target area AR.
- the user may set the position of the target area AR.
- the user inputs information specifying the position of the target area AR (for example, the values of the first distance L1, the second distance L2, and the third distance L3) to the input unit 16, and the target area acquisition unit 30 may set the target area AR based on the position information of the target area AR specified by the user.
- the target area AR may be set by specifying coordinates. For example, in the example of FIG. 18, the coordinates P1, P2, P3, and P4 that are the vertex positions of the target area AR may be designated, and the area surrounded by the coordinates P1 to P4 may be set as the target area AR.
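Where the target area AR is specified by vertex coordinates such as P1 to P4, containment in the top view of FIG. 18 can be sketched with a standard ray-casting test (a generic technique, not taken from the embodiment; names are illustrative):

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: is the 2D point pt inside the polygon given as a
    list of (x, y) vertices, e.g. the coordinates P1 to P4?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from pt cross the edge (x1, y1)-(x2, y2)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside
```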
- Target area when the imaging device moves: it is required to appropriately set the target area AR even when the imaging device 100 moves.
- in this embodiment, when the imaging device 100 moves, the target area acquisition unit 30 fixes the position of the target area AR and does not move the target area AR. By fixing the target area AR in this way, it is possible to suppress an unintentional change of the area of interest caused by the movement of the imaging device 100.
- more specifically, in this embodiment, the target area acquisition unit 30 acquires mode information indicating whether or not to move the target area AR, and when the self-position acquisition unit 28 determines that the imaging device 100 has moved, determines whether or not to move the target area AR based on the mode information.
- the target area acquisition unit 30 acquires mode information.
- Mode information is information indicating whether or not to move the target area AR when the imaging device 100 moves. Either information indicating the first mode or information indicating the second mode is assigned to the mode information.
- the first mode is a mode in which the target area AR is not moved (the position of the target area AR is fixed) when the imaging device 100 moves, and the second mode is a mode in which the target area AR is moved when the imaging device 100 moves.
- FIG. 22 is a schematic diagram showing an example of the target area when the first mode is set.
- when the target area acquisition unit 30 acquires mode information indicating the first mode, it sets the first mode, in which the target area AR is fixed; in this case, even when the self-position acquisition unit 28 determines that the imaging device 100 has moved, the target area AR is not moved. That is, in the first mode, the position of the target area AR is fixed regardless of the position of the imaging device 100.
- FIG. 22 exemplifies a case where the imaging device 100 moves from the position 100a to the position 100b. In this case, the position of the imaging area AR0 moves from the position AR0a to the position AR0b.
- the position of the target area AR remains fixed and does not move. Since the range-finding area also moves along with the movement of the imaging device 100, it can be said that the range-finding area moves from the position AR0a to the position AR0b.
- FIG. 23 is a schematic diagram showing an example of the target area when the second mode is set.
- when the target area acquisition unit 30 acquires mode information indicating the second mode, it sets the second mode, which permits movement of the target area AR; in this case, when the self-position acquisition unit 28 determines that the imaging device 100 has moved, the target area AR is moved.
- the target area acquisition unit 30 preferably moves the target area AR so that the shape and size of the target area AR are kept the same.
- the target area acquisition unit 30 preferably moves the target area AR based on the degree of movement of the imaging device 100 so that the position (relative position) of the target area AR with respect to the imaging device 100 is kept the same.
- FIG. 23 exemplifies a case where the imaging device 100 moves from the position 100a to the position 100b.
- the position of the imaging area AR0 moves from the position AR0a to the position AR0b.
- when the second mode is set, the position of the target area AR moves from the position ARa to the position ARb. That is, when the imaging device 100 is at the position 100a, the target area AR is positioned at the position ARa, and when the imaging device 100 moves to the position 100b, the target area AR moves to the position ARb.
- note that when the imaging device 100 has not moved, the target area acquisition unit 30 preferably does not move the position of the target area AR in either the first mode or the second mode.
- the target area acquisition unit 30 may acquire the mode information by any method. For example, the mode information, that is, whether to set the first mode or the second mode, may be set in advance. In this case, the target area acquisition unit 30 may read preset mode information from the storage unit 22 or acquire the mode information from another device via the communication unit 20. Further, for example, when the mode information is not set in advance, the target area acquisition unit 30 may automatically set the mode information. Alternatively, for example, the user may set the mode information. In this case, for example, the user inputs information designating the mode (information designating whether the first mode or the second mode is used) to the input unit 16, and the target area acquisition unit 30 may set the mode based on the mode information designated by the user.
- the target area acquisition unit 30 may switch between the first mode and the second mode. In this case, when the target area acquiring section 30 acquires the mode information indicating that the mode is to be switched, the target area acquiring section 30 may switch the mode based on the mode information.
- FIG. 24 is a flowchart for explaining the target area setting flow when the imaging device moves.
- the control unit 24 acquires the mode information and the information of the target area AR with the target area acquisition unit 30 (step S10), sets the mode based on the mode information, and sets the target area AR based on the information of the target area AR.
- the control unit 24 uses the self-position acquisition unit 28 to determine whether the imaging device 100 has moved (step S12). Based on the position information of the imaging device 100, the self-position acquisition unit 28 determines whether the imaging device 100 has moved.
- when it is determined that the imaging device 100 has moved (step S12; Yes) and the second mode is set (step S14; Yes), the target area acquisition unit 30 moves the target area AR (step S16). Thereafter, if the process is not to be ended (step S18; No), the process returns to step S12; if the process is to be ended (step S18; Yes), this process ends. On the other hand, when it is determined that the imaging device 100 has not moved (step S12; No), the process proceeds to step S18 without moving the target area AR. Likewise, when it is determined that the imaging device 100 has moved but the second mode is not set (step S14; No), the process proceeds to step S18 without moving the target area AR.
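The mode branch of FIG. 24 (steps S12 to S16) can be sketched as follows, assuming the target area is represented by an origin point and the second mode translates it by the device's movement delta (the names and representation are illustrative assumptions):

```python
FIRST_MODE, SECOND_MODE = "first", "second"

def update_target_area(area_origin, mode, device_moved, move_delta):
    """One pass of the FIG. 24 flow (steps S12-S16): shift the target area
    only when the device has moved and the second mode is set."""
    if device_moved and mode == SECOND_MODE:
        # Second mode: keep the area's relative position to the device.
        return tuple(a + d for a, d in zip(area_origin, move_delta))
    # First mode, or no movement: the target area stays fixed.
    return area_origin
```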
- in the seventh embodiment, it is not essential to set either the first mode or the second mode as described above; it is sufficient if the position of the target area AR can be set so as not to move even when the imaging device 100 moves.
- the object information acquisition unit 32 acquires position information of an object existing within the imaging area AR0.
- the object information acquisition unit 32 controls the object position measurement unit 14 to cause the object position measurement unit 14 to measure the relative position of the object with respect to the imaging device 100 .
- the object information acquisition unit 32 acquires the measurement result of the relative position of the object with respect to the imaging device 100 by the object position measurement unit 14 as the position information of the object.
- the object information acquisition unit 32 sequentially acquires the position information of the object by acquiring the position information of the object at predetermined time intervals.
- the object information acquisition unit 32 can also acquire information indicating the shape of the object (for example, the 3D shape of the object) based on the positional information of the object.
- the object information acquisition unit 32 can acquire the 3D shape of the object by accumulating a plurality of pieces of position information such as TOF image information.
- the focus position control section 34 sets the focus position of the imaging device 100 .
- the focal position control section 34 controls the focal position by controlling the position of the optical element 10 , that is, by moving the position of the optical element 10 .
- the focal position control unit 34 adjusts the focal position to an object existing within the target area AR. In other words, the focal position control unit 34 sets the focal position to the position of an object determined to exist within the target area AR. In this embodiment, the focus position control unit 34 determines whether the object exists within the target area AR based on the position information of the object acquired by the object information acquisition unit 32. When the position of the object acquired by the object information acquisition unit 32 overlaps the position of the target area AR, the focus position control unit 34 determines that the object exists within the target area AR and adjusts the focal position to the position of the object acquired by the object information acquisition unit 32.
- that is, for example, when the distance from the imaging device 100 to the object is equal to or less than the first distance L1 and equal to or greater than the second distance L2, the focus position control unit 34 determines that the object exists within the target area AR and focuses on the object.
- on the other hand, the focus position control unit 34 does not adjust the focal position to an object that does not exist within the target area AR. That is, for example, when the distance from the imaging device 100 to the object is longer than the first distance L1 or shorter than the second distance L2, the focus position control unit 34 determines that the object does not exist within the target area AR and does not focus on that object.
- the focus position control unit 34 continues to focus on the object during the period in which the focused object exists within the target area AR. That is, the focal position control unit 34 determines, based on the position information of the object acquired by the object information acquisition unit 32 at predetermined time intervals, whether the object continues to exist within the target area AR, and continues to focus on the object during the period in which it does. On the other hand, when the focused object moves out of the target area AR, that is, when it no longer exists within the target area AR, the focus position control unit 34 removes the focal position from the object and focuses on a position other than the object.
- the focal position control unit 34 need not focus on an object that has existed within the target area AR since the start of operation of the imaging device 100 (the timing at which imaging becomes possible). That is, the focal position control unit 34 may adjust the focal position to an object that has entered the target area AR after the start of operation. In other words, the focal position control unit 34 may adjust the focal position, from a certain timing, to an object that exists within the target area AR at that timing but did not exist within the target area AR at a prior timing. That is, when an object moves from outside the target area AR into the target area AR, the object may be recognized by the focus position control unit 34 as an object to be focused on, and the focal position control unit 34 may focus on an object that has moved from outside the target area AR into the target area AR.
- the focal position control unit 34 may adjust the focal position to a preset set position when an object does not exist within the target area AR.
- the set position may be set arbitrarily, but is preferably set within the target area AR, such as the center position of the target area AR.
- in this embodiment, the focus position control unit 34 does not focus on a stationary object, but focuses on a moving object. More specifically, when the second mode is set, the target area AR moves along with the movement of the imaging device 100, so a stationary (non-moving) object may come to be located within the target area AR; in that case, the focus position control unit 34 does not focus on that object. That is, the focal position control unit 34 does not treat a stationary object as an object to be focused on, and does not adjust the focal position to it even if it is located within the target area AR. On the other hand, when a moving object is positioned within the target area AR, that is, when a moving object reaches the target area AR, the focus position control unit 34 adjusts the focal position to that object. Whether or not an object is moving can be determined based on the position information of the object acquired by the object information acquisition unit 32: when the position information of an object that is continuous in time series changes, it can be determined that the object is moving.
- FIG. 18 shows an example in which the object A moves from the position A0, through the positions A1 and A2, to the position A3, toward the imaging device 100.
- the position A0 is farther from the imaging device 100 than the first distance L1 and is outside the target area AR.
- the positions A1 and A2 are within the target area AR because the distances to the imaging device 100 are equal to or less than the first distance L1 and equal to or more than the second distance L2.
- the position A3 is closer to the imaging device 100 than the second distance L2 and is outside the target area AR.
- the focus position control unit 34 does not focus on the object A at the timing when the object A is present at the position A0, and instead focuses on the set position, for example. Then, the focus position control unit 34 focuses on the object A at the timing when the object A is present at the position A1, that is, at the timing when the object A enters the target area AR. The focus position control unit 34 continues to focus on the object A even at the timing when the object A is present at the position A2, and at the timing when the object A moves to the position A3, that is, when the object A moves out of the target area AR. At this timing, the focal position is removed from the object A, and the focal position is returned to the set position.
- the focus position control unit 34 adjusts the focus position to the object A from the timing when the object A enters the target area AR, and during the period when the object A is moving within the target area AR, the moving object A , and the focal position is removed from the object A at the timing when the object A moves out of the target area AR.
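The behavior traced for object A above can be sketched as a focus-selection rule: focus on the object while its distance lies between L2 and L1, otherwise fall back to the set position (the function name and the distances below are illustrative, not taken from FIG. 18):

```python
def focus_target(distance, l1, l2, set_position_distance):
    """Return the distance to focus at: the object while it is inside the
    target area, otherwise the preset set position."""
    if l2 <= distance <= l1:
        return distance
    return set_position_distance

# Trace mirroring object A in FIG. 18 (illustrative distances):
# A0 outside (far), A1/A2 inside, A3 outside (near)
trace = [12.0, 9.0, 5.0, 1.0]
print([focus_target(d, l1=10.0, l2=2.0, set_position_distance=6.0) for d in trace])
# → [6.0, 9.0, 5.0, 6.0]
```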
- the focal position may also be set by the user. In that case, the imaging device 100 may be switchable between a mode in which the focal position is set by the focal position control unit 34 as described above and a manual mode. In the manual mode, the user inputs an operation to set the focal position to the input unit 16, and the focal position control unit 34 sets the focal position according to the user's operation.
- the imaging control unit 36 controls imaging by the imaging device 100 to capture an image.
- the imaging control unit 36 controls, for example, the imaging element 12 and causes the imaging element 12 to acquire an image signal.
- the imaging control unit 36 may cause the imaging element 12 to automatically acquire an image signal, or may acquire an image signal according to a user's operation.
- the image acquisition unit 38 acquires the image data acquired by the imaging element 12.
- the image acquisition unit 38 controls the image processing circuit 13 to cause the image processing circuit 13 to generate image data from the image signal generated by the imaging element 12, and acquires the image data.
- the image acquisition unit 38 causes the storage unit 22 to store the image data.
- FIG. 25 is a flowchart for explaining the processing flow for setting the focal position.
- the control unit 24 acquires information on the target area AR using the target area acquisition unit 30 (step S20), and acquires the position information of the object using the object information acquisition unit 32 (step S22).
- the order in which steps S20 and S22 are performed may be arbitrary.
- the control unit 24 uses the focus position control unit 34 to determine whether the object is positioned within the target area AR based on the position information of the object (step S24). If the object is not located within the target area AR (step S24; No), the process returns to step S22 to continue acquiring the position information of the object.
- if the object is located within the target area AR (step S24; Yes), the focus position control unit 34 focuses on the object (step S26). After that, the acquisition of the position information of the object is continued, and it is determined whether the object has moved outside the target area AR (step S28). If the object has not moved outside the target area AR (step S28; No), that is, if the object continues to exist within the target area AR, the process returns to step S26 to continue focusing on the object. If the object has moved outside the target area AR (step S28; Yes), the focus position control unit 34 removes the focus from the object (step S30). Thereafter, if the process is not to be ended (step S32; No), the process returns to step S22; if the process is to be ended (step S32; Yes), this process ends.
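The loop of FIG. 25 (steps S22 to S30) can be summarized as a small state machine over successive object positions; this is a sketch with illustrative names, using a caller-supplied membership predicate:

```python
def run_focus_flow(positions, in_area):
    """Simulate the FIG. 25 loop: emit a 'focus' event while the object is
    in the target area (step S26) and a 'defocus' event when it leaves
    (step S30)."""
    events = []
    focused = False
    for p in positions:
        if in_area(p):
            events.append(("focus", p))    # step S26
            focused = True
        elif focused:
            events.append(("defocus", p))  # step S30
            focused = False
    return events
```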
- the imaging device 100 includes the imaging element 12, the self-position acquisition unit 28, the object information acquisition unit 32, the target area acquisition unit 30, and the focal position control unit 34.
- the self position acquisition unit 28 acquires position information of the imaging device 100
- the object information acquisition unit 32 acquires position information of an object existing in the imaging area AR0 of the imaging device 12
- the target area acquisition unit 30 acquires information on the target area AR set within the imaging area AR0, and when an object exists within the target area AR, the focal position control unit 34 controls the focal position of the imaging device 100 so as to focus on the object.
- the target area acquisition unit 30 fixes the position of the target area AR when the self-position acquisition unit 28 determines that the imaging device 100 has moved.
- the imaging device 100 according to the present embodiment controls its focal position so that the focal position is aligned with an object when the object exists within the target area AR, and fixes the position of the target area AR even when the imaging device 100 moves. Therefore, according to the present embodiment, the position of the target area AR can be fixed even when the imaging area AR0 moves, and it is possible to appropriately focus on an object in the target area while changing the imaging area AR0.
- in the first mode, when the self-position acquisition unit 28 determines that the imaging device 100 has moved, the target area acquisition unit 30 fixes the position of the target area AR.
- in the second mode, when the self-position acquisition unit 28 determines that the imaging device 100 has moved, the target area acquisition unit 30 changes the position of the target area AR. Therefore, according to the present embodiment, whether the focused area follows the imaging area AR0 or stays fixed can be selected depending on the situation, so that the focal position can be adjusted appropriately.
- the target area acquisition unit 30 may set the target area AR so as to be located between a first position AX1, at which the distance from the imaging device 100 is the first distance L1, and a second position AX2, at which the distance from the imaging device 100 is the second distance L2 shorter than the first distance L1. Therefore, it is possible to properly focus on an object that has entered the target area AR.
- FIG. 26 is a schematic block diagram of an imaging device according to the eighth embodiment. As shown in FIG. 26, an imaging device 100A according to the eighth embodiment includes a notification control unit 40 in the control unit 24.
- the notification control unit 40 calculates the distance D when the position of the target area AR does not move even if the imaging device 100A moves (that is, when the first mode is set).
- a distance D is the distance between the boundary B of the imaging area AR0 and the target area AR.
- the boundary B of the imaging area AR0 refers to the boundary position between the inside of the imaging area AR0 and the outside of the imaging area AR0, in other words, the edge of the imaging area AR0.
- the notification control unit 40 calculates, as a distance D, the shortest distance between the boundary B of the imaging area AR0 and the target area AR when the target area AR is located within the imaging area AR0.
- that is, the notification control unit 40 regards, as the distance D, the length of the shortest straight line among the straight lines connecting each point on the periphery of the target area AR to each point on the boundary B of the imaging area AR0.
- however, the distance D may be calculated by any method; for example, the distance D may be calculated based on the position of the set target area AR.
- the notification control unit 40 determines whether the distance D is less than a predetermined distance. That is, the notification control unit 40 determines whether the distance D has become less than the predetermined distance due to the movement of the imaging device 100A.
- the predetermined distance here may be set arbitrarily.
- the notification control unit 40 causes the imaging device 100A to output an alarm when the distance D is less than the predetermined distance.
- the warning here is information indicating that the distance D is less than a predetermined distance, and may be of any content.
- the notification control unit 40 may cause the display unit 18 to display arbitrary information (for example, characters or symbols indicating a warning) indicating that the distance D is less than a predetermined distance.
- alternatively, any sound (for example, an alarm sound) indicating that the distance D is less than the predetermined distance may be output from a sound output unit (speaker) (not shown), or any tactile stimulus (for example, a vibration) indicating that the distance D is less than the predetermined distance may be output from a tactile stimulus device (not shown) provided in the imaging device 100A.
- the notification control unit 40 does not output an alarm when the distance D is not less than the predetermined distance, that is, when the distance D is equal to or greater than the predetermined distance.
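- the alarm decision above can be sketched in code. The following is a minimal 2-D sketch, assuming the imaging area AR0 and the target area AR are axis-aligned rectangles with the target area nested inside; the function names are illustrative and not taken from the original:

```python
def boundary_distance(imaging, target):
    # Shortest distance D between the boundary B of the imaging area AR0
    # and a target area AR nested inside it.
    # Rectangles are given as (xmin, ymin, xmax, ymax).
    ix0, iy0, ix1, iy1 = imaging
    tx0, ty0, tx1, ty1 = target
    return min(tx0 - ix0, ix1 - tx1, ty0 - iy0, iy1 - ty1)

def should_alarm(imaging, target, predetermined_distance):
    # Alarm when the distance D has become less than the predetermined distance.
    return boundary_distance(imaging, target) < predetermined_distance
```

- for example, with a fixed target area (4, 4, 6, 6) and an imaging area that moves from (0, 0, 10, 10) to (3, 0, 13, 10), the distance D drops from 4 to 1, so a threshold of 2 triggers the alarm only after the move.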
- FIG. 27 is a schematic diagram showing an example of a target area in the eighth embodiment.
- FIG. 27 exemplifies a case where the imaging device 100A moves from the position 100Aa to the position 100Ab.
- the position of the imaging area AR0 moves from the position AR0a to the position AR0b.
- the position of the target area AR is fixed and does not move.
- the notification control unit 40 does not output an alarm when the imaging device 100A is at the position 100Aa, and outputs an alarm when the imaging device 100A is at the position 100Ab. Since the ranging area also moves along with the movement of the imaging device 100A, the ranging area moves from the position AR0a to the position AR0b, and the boundary of the ranging area can also be treated as the boundary B.
- in this way, the user can be notified in advance that, for example, the target area AR may move out of the range of the imaging area AR0 due to the movement of the imaging device 100A. Therefore, it is possible to prevent the target area AR from moving out of the range of the imaging area AR0 as the imaging device 100A moves further.
- in the above description, the mode is set to either the first mode or the second mode, and in the first mode, an alarm is issued when the distance between the boundary of the imaging area AR0 and the target area AR becomes less than the predetermined distance. However, neither the first mode nor the second mode needs to be set, as long as the position of the target area AR can be set so as not to move even if the imaging device 100A moves. That is, it is sufficient that the imaging device 100A according to the eighth embodiment is set so that the position of the target area AR does not move even if the imaging device 100A moves, and issues an alarm when the distance between the boundary of the imaging area AR0 and the target area AR becomes less than the predetermined distance.
- FIG. 28 is a flowchart for explaining the alarm notification flow.
- the control unit 24 acquires information on the target area AR using the target area acquiring unit 30 (step S40), and sets the target area AR. Then, the control unit 24 uses the self-position acquisition unit 28 to determine whether the imaging device 100A has moved (step S42). If it is determined that the imaging device 100A has moved (step S42; Yes), the control unit 24 causes the notification control unit 40 to determine whether the distance D between the target area AR and the boundary B is less than a predetermined distance (step S44).
- if the distance D between the target area AR and the boundary B is less than the predetermined distance (step S44; Yes), the notification control unit 40 outputs an alarm (step S46). Thereafter, if the process is not to be ended (step S48; No), the process returns to step S42, and if the process is to be ended (step S48; Yes), this process ends. On the other hand, when it is determined that the imaging device 100A has not moved (step S42; No), the process also proceeds to step S48. Further, when it is determined that the imaging device 100A has moved but the distance D between the target area AR and the boundary B is not less than the predetermined distance (step S44; No), that is, when the distance D is equal to or greater than the predetermined distance, the process proceeds to step S48 without an alarm being output.
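- the flow of steps S40 to S48 can be sketched as a loop. The callback names below are hypothetical stand-ins for the target area acquisition unit 30 (S40), the self-position acquisition unit 28 (S42), and the notification control unit 40 (S44/S46):

```python
def alarm_flow(get_target_area, device_moved, distance_d,
               predetermined_distance, output_alarm, should_end):
    target = get_target_area()                               # step S40: set the target area AR
    while True:
        if device_moved():                                   # step S42
            if distance_d(target) < predetermined_distance:  # step S44
                output_alarm()                               # step S46
        if should_end():                                     # step S48
            return
```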
- in this way, when the distance D becomes less than the predetermined distance, the notification control unit 40 outputs an alarm.
- as described above, the user can be notified in advance that, for example, the target area AR may move out of the range of the imaging area AR0 due to the movement of the imaging device 100A. Therefore, it is possible to prevent the target area AR from moving out of the range of the imaging area AR0 as the imaging device 100A moves further.
- note that, when the distance D becomes less than the predetermined distance, the control unit 24 may control the moving mechanism to stop the movement of the imaging device 100A in the direction in which the distance D is further shortened. This movement stop processing may be performed together with the output of the alarm, or may be performed instead of the output of the alarm.
- the ninth embodiment differs from the seventh embodiment in that the focal position is adjusted to an object that exists within the target area AR and satisfies a predetermined condition.
- in the ninth embodiment, descriptions of parts common to the seventh embodiment are omitted.
- the ninth embodiment is also applicable to the eighth embodiment.
- the focal position control unit 34 adjusts the focal position to an object that exists within the target area AR and satisfies a predetermined condition.
- the focal position control unit 34 does not focus on an object that does not satisfy at least one of being present in the target area AR and satisfying a predetermined condition.
- the focus position control unit 34 continues to focus on the object during the period in which the focused object satisfies a predetermined condition and continues to exist within the target area AR.
- the focus position control unit 34 removes the focus position from the object when at least one of the existence of the object within the target area AR and the satisfaction of the predetermined condition is no longer satisfied.
- the focal position control unit 34 may determine, by any method, whether the object exists within the target area AR and whether the object satisfies the predetermined condition, for example based on the position information of the object and an image of the object. The position information of the object here may refer to the measurement result of the object position measuring unit 14, and the image of the object may refer to image data of the object captured by the imaging element 12.
- the predetermined condition here may be any condition other than that the object exists within the target area AR.
- the predetermined condition may be at least one of that the object is performing a predetermined motion, that the object has a predetermined shape, and that the object faces a predetermined direction. Also, any two of these may be used as the predetermined conditions, or all of them may be used as the predetermined conditions.
- the focus position control unit 34 determines that the predetermined conditions are satisfied when all the conditions are satisfied.
- the focus position control unit 34 determines whether the object is moving in a predetermined manner based on the position information of the object that is continuously acquired in time series.
- the focal position control unit 34 adjusts the focal position with respect to an object existing within the target area AR and performing a predetermined motion.
- the focal position control unit 34 does not focus on an object that does not satisfy at least one of being in the target area AR and performing a predetermined movement.
- the focal position control unit 34 keeps the focal position on the object while the object on which the focal position is adjusted exists in the target area AR and continues a predetermined movement.
- the focal position control unit 34 removes the focal position from the object when at least one of the existence of the object within the target area AR and the movement of the object is not satisfied.
- the motion of the object here refers to the mode of movement of the object, and may refer to, for example, the direction and speed of movement of the object.
- for example, when the predetermined motion is defined as moving vertically downward at a speed of 10 m/h or more, the focal position control unit 34 focuses on an object in the target area AR that is moving vertically downward at a speed of 10 m/h or more.
- the motion of an object is not limited to indicating the moving direction and moving speed of an object, and may refer to any mode of movement.
- motion of an object may refer to at least one of the direction and speed of movement of the object.
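- as an illustration of the example condition above (moving vertically downward at 10 m/h or more), the following sketch derives the vertical velocity from two time-stamped position measurements; the coordinate convention (+z vertically upward, positions in metres) and the default threshold are assumptions:

```python
def performs_predetermined_motion(pos_prev, pos_now, dt_s,
                                  min_speed_m_s=10.0 / 3600.0):
    # Positions are (x, y, z) in metres with +z vertically upward.
    # The condition holds when the object moves downward at 10 m/h
    # (about 0.0028 m/s) or faster.
    vz = (pos_now[2] - pos_prev[2]) / dt_s
    return vz <= -min_speed_m_s
```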
- FIG. 29 is a schematic diagram illustrating an example in which the motion of an object is set as a predetermined condition.
- in this example, the predetermined condition is that the moving direction of the object is vertically downward (the direction opposite to the Z direction).
- the example of FIG. 29 shows the case where the object A moves vertically downward from position A0a through positions A1a and A2a to position A3a and stops at position A3a.
- the position A0a is outside the target area AR, and the positions A1a, A2a, and A3a are inside the target area AR.
- the focal position control unit 34 does not focus on the object A at the timing when the object A exists at the position A0a, because the object A is outside the target area AR.
- the focal position control unit 34 focuses on the object A at the timing when the object A exists at the position A1a, that is, at the timing when the object A enters the target area AR while moving downward in the vertical direction.
- the focal position control unit 34 continues to focus on the object A even when the object A exists at the position A2a, and, when the object A moves to the position A3a and stops, removes the focal position from the object A and returns the focal position to the set position.
- the focus position control section 34 determines whether the object has a predetermined shape based on the image data showing the object.
- in this case, the focal position control unit 34 adjusts the focal position to an object that exists within the target area AR and has a predetermined shape.
- the focal position control unit 34 does not focus on an object that does not satisfy at least one of being within the target area AR and having a predetermined shape.
- the focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted has a predetermined shape and continues to exist within the target area AR.
- the focus position control unit 34 removes the focus position from the object when the object no longer satisfies at least one of being in the target area AR and having a predetermined shape.
- the shape of the object here may be, for example, at least one of the size of the object and the outline of the object.
- the focus position control unit 34 adjusts the focus position to an object of a predetermined size or more that exists within the target area AR.
- 3D shape information acquired by the object information acquiring unit 32 may be used to acquire the shape information of the object.
- the focal position control unit 34 determines whether the object faces a predetermined direction based on the image data showing the object. A focal position is adjusted to an object that exists within the target area AR and faces in a predetermined direction. The focal position control unit 34 does not focus on an object that does not satisfy at least one of being in the target area AR and being oriented in a predetermined direction. The focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted continues to exist within the target area AR while facing the predetermined direction.
- when the object no longer satisfies at least one of being within the target area AR and facing the predetermined direction, the focal position control unit 34 removes the focal position from the object.
- 3D shape information acquired by the object information acquisition unit 32 may be used to acquire information on the orientation of the object.
- the predetermined condition may be set by any method, for example, it may be set in advance.
- for example, the focal position control unit 34 may read out information indicating a preset predetermined condition (for example, a moving direction and a moving speed) from the storage unit 22, or may acquire information indicating the predetermined condition from another device via the communication unit 20.
- the focus position control section 34 may automatically set the predetermined condition.
- alternatively, the user may set the predetermined condition. In this case, for example, the user inputs information specifying the predetermined condition (for example, a moving direction and a moving speed) to the input unit 16, and the focal position control unit 34 may set the predetermined condition based on the information specified by the user.
- the focal position control unit 34 may focus on an object that is present in the target area AR and is performing a predetermined movement.
- the focal position control unit 34 keeps the focal position on the object while the object continues the predetermined movement, and removes the focal position from the object when the object stops performing the predetermined movement.
- in this way, in addition to being within the target area AR, performing the predetermined motion is also set as a condition for adjusting the focal position, so that an object moving in a specific manner can be tracked and the focal position can be adjusted appropriately. For example, a fall within the target area AR can be detected.
- the focus position control unit 34 may focus on an object that exists in the target area AR and has a predetermined shape.
- in this way, in addition to being within the target area AR, having a predetermined shape is also set as a condition for adjusting the focal position, so that an object with a specific shape can be tracked and the focal position can be adjusted appropriately.
- the focal position control unit 34 may focus on an object that exists in the target area AR and faces in a predetermined direction. In this way, in addition to being within the target area AR, facing a predetermined direction is also set as a condition for adjusting the focal position, so that an object facing a specific direction can be tracked and the focal position can be adjusted appropriately.
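- the focusing rule of the ninth embodiment, namely being within the target area AR and satisfying every configured predetermined condition (motion, shape, orientation, or any subset of them), can be sketched as a single predicate; the object representation and condition callbacks below are illustrative assumptions:

```python
def should_focus(obj, in_target_area, conditions):
    # Focus only when the object is inside the target area AR and
    # all set predetermined conditions hold; the focal position is
    # removed as soon as any of them stops holding.
    return in_target_area(obj) and all(cond(obj) for cond in conditions)
```

- for instance, `conditions` could hold one predicate for a minimum size and one for a required orientation; when `conditions` is empty, the rule degenerates to the seventh embodiment's area-only test.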
- FIG. 30 is a schematic block diagram of an imaging device according to the tenth embodiment.
- An imaging device 100 according to the tenth embodiment is an imaging device that images an object within an imaging range.
- the imaging device 100 is an autofocus camera capable of automatically setting a focal position.
- the imaging device 100 may be a video camera that captures a moving image by capturing each predetermined frame, or may be a camera that captures a still image.
- the imaging device 100 may be used for any purpose, and may be used, for example, as a monitoring camera set at a predetermined position inside a facility or outdoors.
- the imaging device 100 includes an optical element 10, an imaging element 12, an image processing circuit 13, an object position measuring unit 14, an input unit 16, a display unit 18, a communication unit 20, a storage unit 22, and a control unit 24.
- the optical element 10 is an optical system element such as a lens. One or more optical elements 10 may be provided.
- the imaging element 12 is a device that converts light incident through the optical element 10 into an image signal, which is an electrical signal.
- the imaging element 12 is, for example, a CCD (Charge Coupled Device) sensor, a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like.
- the image processing circuit 13 generates image data for each frame from the image signal generated by the imaging element 12.
- the image data is, for example, data including luminance and color information of each pixel in one frame, and may be data to which a gradation is assigned to each pixel.
- the object position measuring unit 14 is a sensor that measures the position of the object to be measured with respect to the imaging device 100 (relative position of the object).
- An object here may be any object, and may be a living thing or an inanimate object, and the same shall apply hereinafter. Also, the object here may refer to a movable object, but is not limited to that and may refer to an immovable object.
- the object position measurement unit 14 measures the distance from the imaging device 100 to the object as the relative position of the object.
- the object position measuring unit 14 may be any sensor capable of measuring the relative position of an object, and may be, for example, a TOF (Time Of Flight) sensor.
- a TOF sensor includes, for example, a light-emitting element (for example, an LED (Light Emitting Diode)) that emits light and a light-receiving unit that receives light, and measures the distance to the object from the time of flight of the light that is emitted from the light-emitting element, reflected by the object, and returned to the light-receiving unit.
- the object position measuring unit 14 measures the distance from the imaging device 100 to the object as the relative position of the object, but may also measure the direction in which the object exists with respect to the imaging device 100.
- that is, the object position measuring unit 14 may measure, as the relative position of the object, the position (coordinates) of the object in a coordinate system having the imaging device 100 as the origin.
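- the TOF principle above reduces to distance = (speed of light × round-trip time) / 2; combining the distance with a measured direction gives the object's coordinates relative to the device. The angle convention (azimuth/elevation about a +x viewing axis) is an assumption for illustration:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s):
    # The emitted light travels to the object and back, so halve the path.
    return C * round_trip_s / 2.0

def relative_position(distance, azimuth_rad, elevation_rad):
    # Coordinates of the object in a frame with the imaging device at the origin.
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)
```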
- the input unit 16 is a mechanism that receives input (operation) from the user, and may be, for example, a button, keyboard, touch panel, or the like.
- the display unit 18 is a display panel that displays images.
- the display unit 18 may be capable of displaying an image for the user to set a target area AR, which will be described later, in addition to the image captured by the imaging device 100 .
- the communication unit 20 is a communication module that communicates with an external device, and may be, for example, an antenna or a Wi-Fi (registered trademark) module.
- the imaging device 100 communicates with an external device by wireless communication, but may be wired communication, and any communication method may be used.
- the storage unit 22 is a memory that stores various types of information such as captured image data and the calculation contents and programs of the control unit 24, and includes, for example, at least one of a main storage device such as a RAM (Random Access Memory) or a ROM (Read Only Memory) and an external storage device such as an HDD (Hard Disk Drive).
- the program for the control unit 24 stored in the storage unit 22 may be stored in a recording medium readable by the imaging device 100 .
- the control unit 24 is an arithmetic device and includes an arithmetic circuit such as a CPU (Central Processing Unit).
- the control unit 24 includes a target area acquisition unit 30 , an object information acquisition unit 32 , an area position information acquisition unit 33 , a focus position control unit 34 , an imaging control unit 36 and an image acquisition unit 38 .
- the control unit 24 reads a program (software) from the storage unit 22 and executes it, thereby obtaining a target area acquisition unit 30, an object information acquisition unit 32, an area position information acquisition unit 33, a focus position control unit 34, and an imaging control unit 36. , and the image acquisition unit 38, and execute those processes.
- the control unit 24 may execute these processes by one CPU, or may be provided with a plurality of CPUs and execute the processes by the plurality of CPUs. Moreover, at least part of the processing of the target area acquisition unit 30, the object information acquisition unit 32, the area position information acquisition unit 33, the focal position control unit 34, the imaging control unit 36, and the image acquisition unit 38 may be realized by hardware circuits.
- the target area acquisition unit 30 acquires information of the target area AR for setting the focal position, the object information acquisition unit 32 acquires position information of the object to be imaged, the area position information acquisition unit 33 acquires area position information for synchronizing the target areas AR among a plurality of imaging devices 100, the focal position control unit 34 controls the focal position of the imaging device 100, the imaging control unit 36 controls imaging by the imaging device 100, and the image acquisition unit 38 acquires the image captured by the imaging device 100.
- (Imaging system) In the present embodiment, a plurality of imaging devices 100 capture images, and the target areas AR of the respective imaging devices 100 are set so as to partially overlap each other.
- An imaging system having a plurality of imaging devices 100 is hereinafter referred to as an imaging system 1 .
- the imaging system 1 includes a first imaging device 100a and a second imaging device 100b.
- the number of imaging devices 100 included in the imaging system 1 is not limited to two, and may be any number of three or more.
- Each imaging device 100 sets a target area AR.
- a method of setting the target area AR will be described below.
- the target area AR of the first imaging device 100a is referred to as a first target area ARa
- the target area AR of the second imaging device 100b is referred to as a second target area ARb.
- when the first target area ARa and the second target area ARb are not distinguished, they are simply referred to as the target area AR.
- the first imaging device 100a acquires the information of the target area AR (first target area ARa) set within the imaging area AR0 of the first imaging device 100a by the target area acquisition unit 30.
- the target area AR is an area set for automatically adjusting the focal position.
- the information of the target area AR is information indicating the position of the target area AR, that is, the position information of the target area AR.
- FIGS. 31 and 32 are schematic diagrams for explaining an example of the target area.
- FIG. 31 is a view of the imaging device 100 and the target area AR viewed from above in the vertical direction, and FIG. 32 is a view of the imaging device 100 and the target area AR viewed from the horizontal direction.
- the direction Z is defined as a vertical direction
- the direction X is defined as one horizontal direction orthogonal to the direction Z
- the direction Y is defined as a direction orthogonal to the direction Z and the direction X (horizontal direction).
- the range in which an image can be captured by the imaging device 100 is defined as an imaging area AR0.
- the imaging area AR0 refers to an area (space) within the angle of view of the imaging device 12, in other words, refers to a range captured as an image in real space.
- the target area AR is an area (space) set within the range of the imaging area AR0.
- the first target area ARa is positioned between the first position AX1 and the second position AX2 within the imaging area AR0 of the first imaging device 100a.
- the first position AX1 is a position where the distance from the first imaging device 100a is the first distance L1
- the second position AX2 is a position where the distance from the first imaging device 100a is a second distance L2, which is shorter than the first distance L1.
- the first position AX1 can be said to be a virtual plane including positions (coordinates) located at the first distance L1 from the first imaging device 100a within the imaging area AR0 of the first imaging device 100a.
- the second position AX2 is a virtual plane that includes positions (coordinates) at a second distance L2 from the first imaging device 100a within the imaging area AR0 of the first imaging device 100a.
- the first target area ARa can be said to be a space occupying at least a part of the space surrounded by the virtual plane at the second distance L2 from the first imaging device 100a and the virtual plane at the first distance L1 from the first imaging device 100a, within the imaging area AR0 of the first imaging device 100a.
- the first position AX1 is not limited to a virtual plane in which all positions (coordinates) included in the first position AX1 are at the first distance L1 from the first imaging device 100a; it may be a virtual plane in which at least some of the positions (coordinates) are at the first distance L1 from the first imaging device 100a.
- the second position AX2 may be a virtual plane in which at least some positions (coordinates) included in the second position AX2 are at the second distance L2 from the first imaging device 100a.
- the size and shape of the first target area ARa are not limited to those described above and may be arbitrary.
- the position of the first target area ARa is not limited to the above description and may be arbitrary.
- the first target area ARa is not limited to being positioned between the first position AX1 and the second position AX2.
- the first target area ARa is an area set within the imaging area AR0 of the first imaging device 100a, but it is not limited to this.
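- membership in a target area of this shape can be sketched as a pair of range and angle tests. In the sketch below the angle of view is simplified to a cone of half-angle `half_fov` about the device's +x viewing axis (an assumption, since the real area is bounded by the rectangular angle of view of the imaging element):

```python
import math

def in_target_area(obj_pos, cam_pos, l1, l2, half_fov):
    # Inside the target area AR: between the second distance L2 and the
    # first distance L1 from the device (L2 < L1), and within the
    # simplified conical angle of view.
    dx = obj_pos[0] - cam_pos[0]
    dy = obj_pos[1] - cam_pos[1]
    dz = obj_pos[2] - cam_pos[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if not (l2 <= dist <= l1):
        return False
    angle = math.acos(dx / dist)  # angle from the +x viewing axis
    return angle <= half_fov
```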
- the imaging area AR0 in FIGS. 31 to 32 may be treated as the ranging area.
- the target area acquisition unit 30 of the first imaging device 100a may acquire information on the first target area ARa by any method.
- the position of the first target area ARa may be set in advance.
- the target area acquisition unit 30 of the first imaging device 100a may read the preset position information of the first target area ARa from the storage unit 22, or may acquire the position information of the first target area ARa from another device via the communication unit 20.
- the target area acquisition unit 30 may automatically set the position of the first target area ARa.
- the user may set the position of the first target area ARa.
- for example, the user inputs information specifying the position of the first target area ARa (for example, the values of the first distance L1 and the second distance L2) to the input unit 16 of the first imaging device 100a, and the target area acquisition unit 30 may set the first target area ARa based on the position information of the first target area ARa designated by the user.
- the first target area ARa may be set by specifying coordinates. For example, in the example of FIG. 31, the coordinates P1, P2, P3, and P4 that are the vertex positions of the target area AR may be designated, and the area surrounded by the coordinates P1 to P4 may be set as the first target area ARa.
- the first imaging device 100a acquires the region position information by the region position information acquisition section 33 .
- the area position information is information indicating the position (relative position) of the first target area ARa with respect to the reference object B.
- specifically, the first imaging device 100a uses the object information acquisition unit 32 to acquire the position information of the reference object B existing within the imaging area AR0.
- the object information acquisition unit 32 controls the object position measurement unit 14 to cause the object position measurement unit 14 to measure the relative position of the reference object B with respect to the first imaging device 100a.
- the object information acquisition unit 32 acquires the measurement result of the relative position of the reference object B with respect to the first imaging device 100a by the object position measurement unit 14 as the position information of the reference object B.
- the area position information acquisition unit 33 of the first imaging device 100a obtains the position (relative position) of the first target area ARa with respect to the reference object B based on the position information of the reference object B and the position information of the first target area ARa. Then, the position of the first target area ARa with respect to the reference object B is obtained as area position information.
- the area position information can be said to be information indicating the coordinates of the first target area ARa based on the position (coordinates) of the reference object B, and can also be said to be information indicating the deviation of the position of the first target area ARa from the position of the reference object B.
- the position of the first target area ARa with respect to the reference object B may refer to, for example, the position of a reference point (for example, the center point) of the first target area ARa with respect to the reference object B, or the position of each vertex of the first target area ARa with respect to the reference object B.
- the region position information acquisition unit 33 of the first imaging device 100a acquires region position information using three or more reference objects B as references. That is, the area position information acquisition unit 33 acquires information indicating the position of the first target area ARa with respect to each of the three or more reference objects B as area position information.
- in the example of FIG. 31, three reference objects Ba, Bb, and Bc are set, and the position of the first target area ARa with respect to the reference object Ba, the position of the first target area ARa with respect to the reference object Bb, and the position of the first target area ARa with respect to the reference object Bc are acquired as area position information.
- the number of reference objects B used for area position information is not limited to three or more, and may be one or any number of two or more.
- the area position information acquisition unit 33 of the first imaging device 100a may select an object as the reference object B by any method.
- the area position information acquisition unit 33 may automatically select the reference object B.
- for example, the area position information acquisition unit 33 may use the position information of the objects acquired by the object information acquisition unit 32 to extract objects located within the imaging areas AR0 (or the ranging areas) of both the first imaging device 100a and the second imaging device 100b, and select the reference object B from among the extracted objects. Further, for example, the user may specify the reference object B.
- the user inputs information specifying the reference object B to the input unit 16 of the first imaging device 100a based on the image within the imaging area AR0 displayed on the display unit 18 (for example, by touching the object on the image).
- the target region acquiring unit 30 may set the object specified by the user as the reference object B.
- the area position information, which is the relative position between the first target area ARa and the reference object B, is calculated.
- the first target area ARa may be set based on the position of the reference object B and the area position information after the reference object B and the area position information are set.
- the region position information acquisition unit 33 of the first imaging device 100a transmits the acquired region position information to the second imaging device 100b via the communication unit 20. Note that since the second imaging device 100b does not set the region position information, the region position information acquisition unit 33 shown in FIG. 30 may not be included.
- the target area acquisition unit 30 of the second imaging device 100b acquires the area position information from the first imaging device 100a via the communication unit 20 .
- the second imaging device 100b sets the second target region ARb by the target region acquisition unit 30 based on the region position information acquired from the first imaging device 100a. A specific description will be given below.
- the target area acquisition unit 30 of the second imaging device 100b acquires information on the reference object B.
- information about the reference object B refers to information about the reference object B used by the first imaging device 100a to acquire the area position information. It can be said to be information indicating which of the objects existing within the imaging area AR0 (or the range-finding area) is the reference object B.
- the target area acquisition unit 30 of the second imaging device 100b may acquire the information on the reference object B by any method. For example, the information on the reference object B may be transmitted from the first imaging device 100a together with the area position information, and the target area acquisition unit 30 of the second imaging device 100b may acquire it from the first imaging device 100a. Further, for example, the user may input the information on the reference object B.
- the user, who recognizes the information of the reference object B in advance, inputs information specifying the reference object B to the input unit 16 of the second imaging device 100b based on the image within the imaging area AR0 displayed on the display unit 18 (for example, by touching the object on the image), and the target area acquisition unit 30 may set the object specified by the user as the reference object B.
- the object information acquisition unit 32 of the second imaging device 100b acquires the position information of the reference object B specified in the information of the reference object B acquired by the target area acquisition unit 30.
- the object information acquisition unit 32 controls the object position measurement unit 14 to measure the relative position of the reference object B with respect to the second imaging device 100b.
- the object information acquisition unit 32 acquires the measurement result of the relative position of the reference object B with respect to the second imaging device 100b by the object position measurement unit 14 as the position information of the reference object B.
- the target area acquisition unit 30 of the second imaging device 100b sets the second target area ARb based on the position information of the reference object B and the area position information.
- the target area acquisition unit 30 sets, as the second target area ARb, a position shifted from the position of the reference object B with respect to the second imaging device 100b by the deviation of the position of the first target area ARa with respect to the reference object B indicated by the area position information.
- since the first target area ARa and the second target area ARb are set at positions shifted from the common reference object B by the same amount of deviation, they are set so as to overlap each other.
- the first target area ARa and the second target area ARb are set so as to completely overlap; that is, the entire first target area ARa and the entire second target area ARb overlap without any deviation.
- the method of setting the first target area ARa and the second target area ARb is not limited to the above description and is arbitrary, and is not limited to being set using the reference object B or the area position information.
- the first target area ARa and the second target area ARb may be set in any manner as long as at least a partial area (space) of the first target area ARa and at least a partial area (space) of the second target area ARb overlap.
- the second target area ARb is set to be positioned within the imaging area AR0 of the second imaging device 100b, and is set to be positioned within the ranging area of the second imaging device 100b.
- the second target area ARb may be located between a first position at a first distance L1 from the second imaging device 100b and a second position at a second distance L2 from the second imaging device 100b.
- FIG. 33 is a flowchart for explaining the setting flow of the target area.
- the first imaging device 100a sets the first target area ARa using the target area acquisition unit 30 (step S10), and the object information acquisition unit 32 acquires the position information of the reference object B (information on the relative position of the reference object B with respect to the first imaging device 100a) (step S12).
- the execution order of steps S10 and S12 may be arbitrary.
- the first imaging device 100a acquires the region position information based on the position information of the reference object B by the region position information acquisition unit 33 (step S14), and transmits the region position information to the second imaging device 100b (step S16).
- the second imaging device 100b acquires the region position information from the first imaging device 100a (step S18), and the object information acquisition unit 32 acquires the position information of the reference object B (information on the relative position of the reference object B with respect to the second imaging device 100b) (step S20).
- the second imaging device 100b uses the target area acquisition unit 30 to set the second target area ARb based on the position information and the area position information of the reference object B (step S22).
- the target area AR may be set by the same method when there are three or more imaging devices 100. That is, for example, when a third imaging device is provided, the third imaging device may, like the second imaging device 100b, set its own target area AR based on the position information of the reference object B and the area position information.
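The sharing flow of steps S10 to S22 can be sketched in Python. This is only an illustrative sketch under the simplifying assumption that the two devices' coordinate frames differ by a pure translation (in general a rotation between the frames would also have to be accounted for); all function and variable names are hypothetical, not the patent's structures.

```python
def area_position_info(target_center, refs_in_cam1):
    """Step S14: offset of the first target area ARa's reference point from
    each reference object B, in the first imaging device's frame."""
    return {name: tuple(t - r for t, r in zip(target_center, pos))
            for name, pos in refs_in_cam1.items()}

def set_second_target_area(refs_in_cam2, offsets):
    """Step S22: apply the shared offsets to the second imaging device's own
    measured reference positions and average the resulting estimates."""
    estimates = [tuple(r + o for r, o in zip(refs_in_cam2[name], off))
                 for name, off in offsets.items()]
    k = len(estimates)
    return tuple(sum(c) / k for c in zip(*estimates))
```

With three or more reference objects B the per-reference estimates are averaged, which also damps individual range-measurement errors; that is one plausible benefit of using three or more references, though the patent does not state a reason.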
- the object information acquisition unit 32 acquires position information of an object existing within the imaging area AR0.
- the object information acquisition unit 32 controls the object position measurement unit 14 to cause the object position measurement unit 14 to measure the relative position of the object with respect to the imaging device 100 .
- the object information acquisition unit 32 acquires the measurement result of the relative position of the object with respect to the imaging device 100 by the object position measurement unit 14 as the position information of the object.
- the object information acquisition unit 32 sequentially acquires the position information of the object by acquiring the position information of the object at predetermined time intervals.
- the object information acquisition unit 32 can also acquire information indicating the shape of the object (for example, the 3D shape of the object) based on the positional information of the object.
- the object information acquisition unit 32 can acquire the 3D shape of the object by accumulating a plurality of pieces of position information such as TOF image information.
- the focus position control unit 34 sets the focus position of the imaging device 100.
- the focal position control section 34 controls the focal position by controlling the position of the optical element 10, that is, by moving the position of the optical element 10.
- the focal position control unit 34 adjusts the focal position to an object existing within the target area AR. In other words, the focal position control unit 34 sets the focal position to the position of an object determined to exist within the target area AR. In this embodiment, the focal position control unit 34 determines whether the object exists within the target area AR based on the position information of the object acquired by the object information acquisition unit 32. When the position of the object acquired by the object information acquisition unit 32 overlaps the position of the target area AR, the focal position control unit 34 determines that the object exists within the target area AR and focuses on the position of the object acquired by the object information acquisition unit 32. On the other hand, the focal position control unit 34 does not adjust the focal position for an object that does not exist within the target area AR.
- the focal position control unit 34 continues to focus on the object during the period in which the focused object exists within the target area AR. That is, the focal position control unit 34 determines, based on the position information of the object acquired by the object information acquisition unit 32 at predetermined time intervals, whether the object continues to exist within the target area AR, and continues to focus on the object during the period in which it does. On the other hand, when the focused object moves out of the target area AR, that is, when it no longer exists within the target area AR, the focal position control unit 34 removes the focal position from the object and focuses on a position other than the object.
- the focal position control unit 34 need not focus on an object that has existed within the target area AR since the start of operation of the imaging device 100 (the timing at which imaging becomes possible). That is, the focal position control unit 34 may adjust the focal position with respect to an object that has entered the target area AR after the start of operation. In other words, for an object that exists within the target area AR at a certain timing but did not exist within the target area AR at an earlier timing, the focal position control unit 34 may adjust the focal position from that timing. That is, when an object moves from outside the target area AR into the target area AR, the object may be recognized by the focal position control unit 34 as an object to be focused on, and the focal position control unit 34 may focus on an object that has moved from outside the target area AR into the target area AR.
- the focal position control unit 34 may adjust the focal position to a preset set position when an object does not exist within the target area AR.
- the set position may be set arbitrarily, but is preferably set within the target area AR, such as the center position of the target area AR.
- FIG. 34 is a schematic diagram for explaining setting of the focal position.
- FIG. 34 shows an example where object A is moving from position A0 through position A1 to position A2.
- the position A0 is outside the first target area ARa and the second target area ARb
- the position A1 is inside the first target area ARa and the second target area ARb
- the position A2 is outside the range of the first target area ARa and the second target area ARb.
- the focal position control units 34 of the first imaging device 100a and the second imaging device 100b do not focus on the object A at the timing when the object A is present at the position A0, and instead focus on, for example, the set position.
- the focal position control units 34 of the first imaging device 100a and the second imaging device 100b focus on the object A at the timing when the object A is present at the position A1, that is, when the object A is within the range of the first target area ARa and the second target area ARb.
- the focal position control unit 34 continues to focus on the object A while the object A is positioned within the range of the first target area ARa and the second target area ARb, and at the timing when the object A moves to the position A2, that is, at the timing when the object A goes out of the range of the first target area ARa and the second target area ARb, removes the focal position from the object A and returns the focal position to the set position.
- the focal position control unit 34 adjusts the focal position to the object A from the timing when the object A enters the target area AR, continues to focus on the moving object A during the period in which the object A is moving within the target area AR, and removes the focal position from the object A at the timing when the object A moves out of the target area AR.
- when the object A is within the range of the first target area ARa and outside the range of the second target area ARb, the first imaging device 100a focuses on the object A, but the second imaging device 100b does not focus on the object A.
- conversely, when the object A is within the range of the second target area ARb and outside the range of the first target area ARa, the second imaging device 100b focuses on the object A, but the first imaging device 100a does not focus on the object A.
- the focus position may be set by the user.
- in the automatic mode, the focal position is set by the focal position control section 34 as described above.
- in the manual mode, the user inputs an operation to set the focal position to the input unit 16, and the focal position control unit 34 sets the focal position according to the user's operation.
- the imaging control unit 36 controls imaging by the imaging device 100 to capture an image.
- the imaging control unit 36 controls, for example, the imaging element 12 and causes the imaging element 12 to acquire an image signal.
- the imaging control unit 36 may cause the imaging element 12 to automatically acquire an image signal, or may acquire an image signal according to a user's operation.
- the image acquisition unit 38 acquires image data acquired by the imaging element 12.
- the image acquisition unit 38 controls the image processing circuit 13 to cause the image processing circuit 13 to generate image data from the image signal generated by the imaging element 12, and acquires the image data.
- the image acquisition unit 38 causes the storage unit 22 to store the image data.
- FIG. 35 is a flowchart for explaining the processing flow for setting the focal position.
- the control unit 24 acquires information about the target area AR using the target area acquisition unit 30 (step S30), and acquires the position information of the object using the object information acquisition unit 32 (step S32).
- the order in which steps S30 and S32 are performed may be arbitrary.
- the control unit 24 uses the focus position control unit 34 to determine whether the object is positioned within the target area AR based on the position information of the object (step S34). If the object is not located within the target area AR (step S34; No), the process returns to step S32 to continue acquiring the position information of the object.
- in step S34, if the object is located within the target area AR (step S34; Yes), the focus position control unit 34 focuses on the object (step S36). After that, the acquisition of the position information of the object is continued, and it is determined whether the object has moved outside the target area AR (step S38). If the object has not moved outside the target area AR (step S38; No), that is, if the object continues to exist within the target area AR, the process returns to step S36 to continue focusing on the object. If the object has moved outside the target area AR (step S38; Yes), the focus position control unit 34 removes the focus from the object (step S40). Thereafter, if the process is not to be ended (step S42; No), the process returns to step S32, and if the process is to be ended (step S42; Yes), this process is ended.
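Per control cycle, the decision of steps S30 to S42 reduces to choosing between the object's measured position and the preset set position. A minimal sketch, assuming an axis-aligned box as the target area AR; the `TargetArea` class and `focus_target` helper are illustrative names, not the patent's structures:

```python
from dataclasses import dataclass

@dataclass
class TargetArea:
    lo: tuple  # (x, y, z) minimum corner of the target area AR
    hi: tuple  # (x, y, z) maximum corner

    def contains(self, p) -> bool:
        # Object position overlaps the target area AR (step S34)
        return all(l <= c <= h for l, c, h in zip(self.lo, p, self.hi))

def focus_target(area: TargetArea, obj_pos, set_position):
    """One control cycle: focus on the object while it is inside AR
    (steps S36-S38), otherwise fall back to the preset set position."""
    if obj_pos is not None and area.contains(obj_pos):
        return obj_pos
    return set_position
```

Calling `focus_target` once per measurement cycle reproduces the behavior of FIG. 34: the focus follows the object only while it is inside AR and returns to the set position when the object leaves.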
- the imaging system 1 has a plurality of imaging devices 100.
- each imaging device 100 includes an imaging element 12, an object information acquisition unit 32 that acquires position information of an object existing in an imaging area AR0 of the imaging element 12, a target area acquisition unit 30 that sets a target area AR within the imaging area AR0, and a focus position control unit 34 that controls the focal position of the imaging device 100 so as to focus on the object when the object exists within the target area AR.
- the target area acquiring unit 30 of each imaging device 100 sets the target area AR such that at least some areas of the respective target areas AR overlap each other.
- in this way, the imaging system 1 sets the target areas AR so that at least parts of the target areas of the respective imaging devices 100 overlap each other, and when an object exists in each target area AR, the focal position of each imaging device 100 is controlled so as to focus on the object. Therefore, according to the present embodiment, the regions of interest of a plurality of imaging devices 100 can be matched, so that an object in the region of interest can be appropriately focused.
- the imaging system 1 includes at least a first imaging device 100a and a second imaging device 100b as the plurality of imaging devices 100.
- the area position information acquisition unit 33 of the first imaging device 100a acquires area position information indicating the position of the first target area ARa with respect to the reference object B, based on the position information of the reference object B acquired by the object information acquisition unit 32.
- the target area acquisition unit 30 of the second imaging device 100b acquires the area position information from the first imaging device 100a, and sets the second target area ARb based on the area position information. Therefore, the imaging system 1 according to the present embodiment can appropriately set the second target area ARb so as to overlap with the first target area ARa. Furthermore, by using the information of the reference object B, the second target area ARb can be appropriately set without sharing the position information between the imaging devices 100 .
- the area position information acquisition unit 33 of the first imaging device 100a acquires information indicating the position of the first target area ARa with respect to each of the three or more reference objects B as area position information.
- the target area acquisition unit 30 of the second imaging device 100b sets the second target area ARb based on the information of the reference object B as well. Therefore, according to this embodiment, the second target area ARb can be appropriately set.
- the eleventh embodiment differs from the tenth embodiment in that the imaging device 100 that adjusts the focal position to an object is selected based on the position from which the object enters. In the eleventh embodiment, the description of the parts common to the tenth embodiment will be omitted.
- FIG. 36 is a schematic diagram showing an example of focus position setting in the eleventh embodiment.
- the intrusion position information is information indicating the intrusion position of an object to be focused. Assuming that the area (space) where the target areas AR of the imaging devices 100 overlap is an overlapping area ARW, the intrusion position refers to the part of the boundary (periphery) of the overlapping area ARW from which the object has entered the overlapping area ARW. That is, for example, when the object Ab in FIG. 36 intrudes from the Y-direction periphery (boundary) of the overlapping area ARW, it can be said that the intrusion position is on the Y direction side. Since the intrusion position information indicates the intrusion position of the object whose focal position is to be adjusted, it can be said to indicate from which intrusion position an intruding object is to be focused.
- the intrusion position information is set for each imaging device 100.
- the intrusion position information of each imaging device 100 is preferably set such that the intrusion position of the object to be focused is different from each other.
- for example, in the intrusion position information acquired by the first imaging device 100a, the intrusion position of the object to be focused may be on the X direction side, and in the intrusion position information acquired by the second imaging device 100b, the intrusion position of the object to be focused may be on the Y direction side.
- however, the intrusion positions of the objects to be focused may overlap between the imaging devices 100; for example, in the intrusion position information acquired by the first imaging device 100a, the intrusion position of the object to be focused may be on the X direction side and the Y direction side, and in the intrusion position information acquired by the second imaging device 100b, the intrusion position of the object to be focused may be on the Y direction side.
- Each imaging device 100 may acquire intrusion position information by any method.
- intrusion position information may be preset.
- each imaging device 100 may read preset intrusion position information from the storage unit 22, or may acquire intrusion position information from another device via the communication unit 20.
- each imaging device 100 may automatically set the intrusion position information.
- the user may set the intrusion position information. In this case, for example, the user may input the intrusion position information to the input unit 16, and the target area obtaining unit 30 may obtain the intrusion position information input by the user.
- when an object enters the overlapping area ARW from the intrusion position specified in the intrusion position information, each imaging device 100 focuses on the object. That is, when an object intrudes into the overlapping area ARW, each imaging device 100 identifies the intrusion position of the object based on the position information of the object acquired by the object information acquisition unit 32, that is, determines from which part of the boundary of the overlapping area ARW the object has entered. Then, each imaging device 100 determines whether the identified intrusion position matches the intrusion position specified in the intrusion position information, and focuses on the object if they match. On the other hand, if the identified intrusion position does not match the intrusion position specified in the intrusion position information, the object is not focused.
- in the example of FIG. 36, in the intrusion position information acquired by the first imaging device 100a, the intrusion position of the object to be focused is on the opposite side of the X direction, and in the intrusion position information acquired by the second imaging device 100b, the intrusion position of the object to be focused is on the Y direction side.
- the entire areas of the first target area ARa and the second target area ARb overlap each other.
- the position Aa2 is a position overlapping the boundary (periphery) of the overlapping area ARW on the side opposite to the X direction.
- the position Aa3 is within the overlapping area ARW.
- since the object Aa enters the overlapping area ARW from the boundary on the opposite side of the X direction, the intrusion position is on the opposite side of the X direction. Therefore, the first imaging device 100a focuses on the object Aa from the timing when the object is positioned at the position Aa2, and the second imaging device 100b does not focus on the object Aa.
- the position Ab1 is outside the overlapping area ARW and located on the Y-direction side of the overlapping area ARW.
- the position Ab2 is a position that overlaps the Y-direction boundary (periphery) of the overlapping area ARW.
- the position Ab3 is within the overlapping area ARW.
- the second imaging device 100b focuses on the object Ab from the timing when the object is positioned at the position Ab2, and the first imaging device 100a does not focus on the object Ab.
- FIG. 37 is a flowchart for explaining the processing flow for setting the focus position in the eleventh embodiment.
- the control unit 24 acquires the information of the target area AR and the intrusion position information by the target area acquisition unit 30 (step S50), and acquires the position information of the object by the object information acquisition unit 32. (Step S52).
- the order in which steps S50 and S52 are performed may be arbitrary.
- the control unit 24 uses the focus position control unit 34 to determine whether the object is positioned within the overlapping area ARW based on the position information of the object (step S54).
- in step S54, if the object is located within the overlapping area ARW (step S54; Yes), the focus position control unit 34 determines whether the object has entered from the intrusion position (set intrusion position) indicated by the intrusion position information (step S56), and if the object has entered from the set intrusion position (step S56; Yes), focuses on the object (step S58). Thereafter, if the process is not to be ended (step S60; No), the process returns to step S52, and if the process is to be ended (step S60; Yes), this process is ended. If the object is not located within the overlapping area ARW (step S54; No), or if the object has not entered from the set intrusion position (step S56; No), the process proceeds to step S60. Note that when an object is outside the overlapping area ARW but located within the target area AR of its own imaging device 100, the focus position control unit 34 can set the focus position to the object regardless of the intrusion position of the object.
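The boundary-side test of steps S54 to S56 can be sketched for a 2-D rectangular overlapping area ARW as follows; the rectangle and the side labels ("-x", "+y", and so on) are assumptions made for illustration, not the patent's representation:

```python
def intrusion_side(prev, cur, box):
    """Which boundary of the overlap area ARW the object crossed when moving
    from prev (outside ARW) to cur (inside ARW); None if it did not enter."""
    (x0, y0), (x1, y1) = box
    inside = lambda p: x0 <= p[0] <= x1 and y0 <= p[1] <= y1
    if inside(prev) or not inside(cur):
        return None  # no boundary crossing into ARW between the two samples
    if prev[0] < x0: return "-x"
    if prev[0] > x1: return "+x"
    if prev[1] < y0: return "-y"
    if prev[1] > y1: return "+y"
    return None

def should_focus(prev, cur, box, my_sides):
    """Step S56: focus only if the object entered ARW from a side listed in
    this device's intrusion position information."""
    return intrusion_side(prev, cur, box) in my_sides
```

For an object like Ab in FIG. 36, the previous sample lies beyond the Y-direction boundary, so only a device whose intrusion position information contains that side would focus on it.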
- in this way, the focal position control unit 34 of each imaging device 100 acquires intrusion position information indicating the intrusion position of an object whose focal position is to be adjusted, and adjusts the focal position to an object when the object enters the overlapping area ARW from the position specified in the intrusion position information.
- therefore, the imaging device 100 that focuses on an object in the overlapping area ARW can be selected according to the intrusion position of the object.
- the first target area ARa and the second target area ARb may be set by any method as long as at least parts of the areas overlap each other.
- the twelfth embodiment differs from the tenth embodiment in that the imaging device 100 that focuses on an object located in the overlapping area ARW is assigned based on designation information that designates whether or not to adjust the focal position when an object is located in the overlapping area ARW.
- descriptions of portions that are common to the tenth embodiment will be omitted.
- the twelfth embodiment can also be applied to the eleventh embodiment.
- FIG. 38 is a schematic diagram showing an example of focus position setting in the twelfth embodiment.
- the focus position control section 34 of each imaging device 100 acquires designation information.
- Designation information is information that designates whether or not to adjust the focal position when an object is located in the overlapping area ARW.
- the designation information is set for each imaging device 100 .
- Each imaging device 100 may acquire the designation information by any method.
- designation information may be set in advance.
- each imaging device 100 may read preset designation information from the storage unit 22 or acquire designation information from another device via the communication unit 20 .
- each imaging device 100 may automatically set the designation information.
- the user may set the designation information. In this case, for example, the user may input designation information to the input unit 16, and the target region obtaining unit 30 may obtain the designation information input by the user.
- each imaging device 100 determines whether to focus on the object based on the designation information when the object is located in the overlapping area ARW. That is, for example, if the designation information designates that the focus position should be adjusted when an object is positioned in the overlapping area ARW, the imaging device 100 adjusts the focus position to the object positioned within the overlapping area ARW. On the other hand, if the designation information designates that the focus position should not be adjusted when the object is positioned in the overlapping area ARW, the imaging device 100 does not focus on the object positioned within the overlapping area ARW.
- the designation information of each imaging device 100 may be set so that only one imaging device 100 is designated to focus when an object is positioned in the overlapping area ARW. That is, taking FIG. 38 as an example, the designation information of the first imaging device 100a may designate that the focal position is adjusted when an object is positioned in the overlapping area ARW, and the designation information of the second imaging device 100b may designate that the focal position is not adjusted when an object is positioned in the overlapping area ARW. In this case, the first imaging device 100a focuses on the object Ac located in the overlapping area ARW, and the second imaging device 100b does not focus on the object Ac located in the overlapping area ARW.
- the designation information of each imaging device 100 may also be set so that a plurality of imaging devices 100 are designated to focus when an object is positioned in the overlapping area ARW.
- the specification information for both the first imaging device 100a and the second imaging device 100b may specify that the focal position should be adjusted when an object is located in the overlapping area ARW. In this case, both the first imaging device 100a and the second imaging device 100b focus on the object Ac located in the overlapping area ARW.
- the designation information may be set so as to divide the overlapping area ARW into a plurality of areas and assign the imaging device 100 for adjusting the focal position for each area.
- FIG. 39 is a schematic diagram showing an example of focus position setting in another example of the twelfth embodiment.
- the overlapping area ARW is divided into a first overlapping area ARWa and a second overlapping area ARWb.
- the designation information of the first imaging device 100a designates that the focal position is adjusted when the object is positioned in the first overlapping area ARWa, and that the focal position is not adjusted when the object is positioned in the second overlapping area ARWb.
- the designation information of the second imaging device 100b designates that the focal position is adjusted when the object is positioned in the second overlapping area ARWb, and that the focal position is not adjusted when the object is positioned in the first overlapping area ARWa.
- the first imaging device 100a focuses on the object Ad located in the first overlapping area ARWa and does not focus on the object Ae located in the second overlapping area ARWb.
- the second imaging device 100b focuses on the object Ae located in the second overlapping area ARWb and does not focus on the object Ad located in the first overlapping area ARWa.
- any method may be used to divide the overlapping area ARW. For example, of the overlapping area ARW, the portion located on the side of the area of the first target area ARa of the first imaging device 100a that does not overlap the overlapping area ARW may be defined as the first overlapping area ARWa. Then, of the overlapping area ARW, the portion located, with respect to the first overlapping area ARWa, on the side of the area of the second target area ARb of the second imaging device 100b that does not overlap the overlapping area ARW may be defined as the second overlapping area ARWb.
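As a hedged one-dimensional illustration of this kind of division (real target areas are volumes; the interval representation and the midpoint split are assumptions for illustration, not the patent's method), the overlap of two intervals can be halved so that each half adjoins the non-overlapping part of one target area:

```python
def split_overlap(ara, arb):
    """ara, arb: (start, end) intervals of the two target areas.
    Returns (ARWa, ARWb): the halves of their overlap, each adjoining the
    exclusive part of the corresponding target area; None if disjoint."""
    lo, hi = max(ara[0], arb[0]), min(ara[1], arb[1])
    if lo >= hi:
        return None  # the target areas do not overlap
    mid = (lo + hi) / 2
    # The side of ARW adjoining ARa's exclusive region becomes ARWa.
    if ara[0] < arb[0]:  # ARa extends beyond the overlap on the low side
        return (lo, mid), (mid, hi)
    return (mid, hi), (lo, mid)
```

For instance, `split_overlap((0, 10), (6, 16))` returns ARWa = (6, 8), the half adjoining ARa's exclusive part, and ARWb = (8, 10), the half adjoining ARb's.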
- FIG. 40 is a flowchart for explaining the processing flow for setting the focus position in the twelfth embodiment.
- the control unit 24 acquires the information of the target area AR and the designation information by the target area acquisition unit 30 (step S70), and acquires the position information of the object by the object information acquisition unit 32 (step S72).
- the order in which steps S70 and S72 are performed may be arbitrary.
- the control unit 24 uses the focus position control unit 34 to determine whether the object is positioned within the overlapping area ARW based on the position information of the object (step S74).
- the focus position control unit 34 determines whether to focus on the object based on the designation information (step S76). That is, when the designation information designates that the object within the overlapping area ARW is to be focused, the focus position control unit 34 focuses on the object. On the other hand, if the designation information designates that the object within the overlapping area ARW is not to be focused, the focus position control unit 34 does not focus on the object. After that, if the process is not to be ended (step S78; No), the process returns to step S72, and if the process is to be ended (step S78; Yes), this process is ended.
- if the object is not located within the overlapping area ARW (step S74; No), the process proceeds to step S78 without focusing on the object. However, if an object is located outside the overlapping area ARW but within the device's own target area AR, the object may be focused on regardless of the designation information.
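The decision made in steps S74–S76, plus the exception for objects inside the device's own target area AR, can be condensed into a single predicate. A minimal sketch, assuming boolean inputs that the real device would derive from its position measurements:

```python
def run_focus_step(designation_focus_in_overlap: bool,
                   in_overlap_area: bool,
                   in_own_target_area: bool) -> bool:
    """One pass of the FIG. 40 decision: does this device focus on the object?

    - In the overlapping area ARW, the designation information decides (S76).
    - Outside ARW, the object may still be focused on if it is within the
      device's own target area AR, regardless of the designation information.
    """
    if in_overlap_area:
        return designation_focus_in_overlap
    return in_own_target_area
```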
- as described above, in the twelfth embodiment, the focus position control unit 34 of each imaging device 100 acquires designation information that designates whether or not to adjust the focal position when an object is positioned in the overlapping area ARW.
- the focus position control unit 34 of each imaging device 100 determines whether or not to focus on the object based on the designation information when the object is located in the overlapping area ARW.
- accordingly, the imaging device 100 that focuses on an object located in the overlapping area ARW can be assigned based on the designation information, so the focusing responsibility for the overlapping area ARW can be divided among the imaging devices 100 as intended.
- the first target area ARa and the second target area ARb may be set by any method, as long as at least parts of the areas overlap each other.
- the thirteenth embodiment differs from the tenth embodiment in that the focal position is adjusted to an object that exists within the target area AR and satisfies a predetermined condition.
- descriptions of the parts that are common to the tenth embodiment are omitted.
- the thirteenth embodiment can also be applied to the eleventh and twelfth embodiments.
- the focal position control unit 34 adjusts the focal position to an object that exists within the target area AR and satisfies a predetermined condition.
- the focal position control unit 34 does not focus on an object that does not satisfy at least one of being present in the target area AR and satisfying a predetermined condition.
- the focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted satisfies a predetermined condition and continues to exist within the target area AR.
- the focus position control unit 34 removes the focal position from the object when the object no longer satisfies at least one of existing within the target area AR and satisfying the predetermined condition; that is, once the object leaves the target area AR or stops satisfying the predetermined condition, the focal position is removed from it.
- the focal position control unit 34 may determine whether the predetermined condition is satisfied by any method; for example, it may make the determination based on the position information of the object or an image of the object. The position information of the object here may refer to the measurement result of the object position measuring unit 14, and the image of the object may refer to image data of the object captured by the imaging element 12.
- the predetermined condition here may be any condition other than that the object exists within the target area AR.
- the predetermined condition may be at least one of that the object is performing a predetermined motion, that the object has a predetermined shape, and that the object faces a predetermined direction. Also, any two of these may be used as the predetermined conditions, or all of them may be used as the predetermined conditions.
- the focus position control unit 34 determines that the predetermined conditions are satisfied when all the conditions are satisfied.
- the focus position control unit 34 determines whether the object is moving in a predetermined manner based on the position information of the object that is continuously acquired in time series.
- the focal position control unit 34 adjusts the focal position with respect to an object existing within the target area AR and performing a predetermined motion.
- the focal position control unit 34 does not focus on an object that does not meet at least one of being within the target area AR and performing a predetermined movement.
- the focal position control unit 34 keeps the focal position on the object while the object on which the focal position is adjusted exists in the target area AR and continues a predetermined movement.
- the focus position control unit 34 removes the focus position from the object when the object no longer satisfies at least one of being present in the target area AR and performing a predetermined movement.
- the motion of the object here refers to the mode of movement of the object, and may refer to, for example, the direction and speed of movement of the object.
- for example, when the predetermined motion is defined as moving vertically downward at a speed of 10 m/h or more, the focal position control unit 34 focuses on an object within the target area AR that is moving vertically downward at a speed of 10 m/h or more.
- the motion of an object is not limited to indicating the moving direction and moving speed of an object, and may refer to any mode of movement.
- motion of an object may refer to at least one of the direction and speed of movement of the object.
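For the motion condition, a velocity estimate from two consecutive position samples is enough to check a direction-and-speed rule such as "moving vertically downward at 10 m/h or more". A sketch under the assumption that positions are (x, y, z) tuples with z as the vertical axis; the function names and the unit handling are illustrative, not from the patent:

```python
def object_velocity(p_prev, p_curr, dt):
    """Velocity vector between two time-series position samples taken dt apart."""
    return tuple((c - p) / dt for p, c in zip(p_prev, p_curr))

def matches_motion(p_prev, p_curr, dt, min_down_speed):
    """True if the object moves vertically downward (negative z) at
    min_down_speed or more, the kind of predetermined motion in the text.
    Use units consistently; the text's 10 m/h corresponds to 10/3600 m/s."""
    vz = (p_curr[2] - p_prev[2]) / dt
    return -vz >= min_down_speed
```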
- FIG. 41 is a schematic diagram illustrating an example in which the motion of an object is set as a predetermined condition.
- in the example of FIG. 41, the predetermined condition concerns the moving direction of the object, namely that the object moves vertically downward (in the direction opposite to the Z direction).
- the object A moves vertically downward from position A0a through positions A1a and A2a to position A3a and stops at position A3a.
- the position A0a is outside the target area AR, and the positions A1a, A2a, and A3a are inside the target area AR.
- the focus position control unit 34 does not focus on the object A at the timing when the object A exists at the position A0a, because the object A is outside the target area AR.
- the focal position control unit 34 focuses on the object A at the timing when the object A is present at the position A1a, that is, at the timing when the object A enters the target area AR while moving downward in the vertical direction.
- the focus position control unit 34 continues to focus on the object A at the timing when the object A is present at the position A2a; at the timing when the object A moves to the position A3a and stops, it removes the focal position from the object A and returns the focal position to the set position.
- the focus position control section 34 determines whether the object has a predetermined shape based on the image data showing the object.
- the focal position control unit 34 adjusts the focal position to an object that exists within the target area AR and has a predetermined shape.
- the focal position control unit 34 does not focus on an object that does not satisfy at least one of being within the target area AR and having a predetermined shape.
- the focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted has a predetermined shape and continues to exist within the target area AR.
- the focus position control unit 34 removes the focus position from the object when the object no longer satisfies at least one of being in the target area AR and having a predetermined shape.
- the shape of the object here may be, for example, at least one of the size of the object and the outline of the object.
- the focus position control unit 34 adjusts the focus position to an object of a predetermined size or more that exists within the target area AR.
- 3D shape information acquired by the object information acquiring unit 32 may be used to acquire the shape information of the object.
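A size-based shape condition can be evaluated from the accumulated 3D samples of the object. The bounding-box measure below is one plausible choice for illustration, not the patent's prescribed method:

```python
def object_size(points):
    """Approximate object size as the largest bounding-box edge of its
    3D point samples (e.g. accumulated TOF measurements)."""
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    return max(mx - mn for mn, mx in zip(mins, maxs))

def matches_shape(points, min_size):
    """True if the object is of the predetermined size or more."""
    return object_size(points) >= min_size
```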
- the focal position control unit 34 determines whether the object faces a predetermined direction based on the image data showing the object. It adjusts the focal position to an object that exists within the target area AR and faces the predetermined direction, and does not focus on an object that does not satisfy at least one of being in the target area AR and facing the predetermined direction. The focal position control unit 34 continues to focus on the object while the object on which the focal position is adjusted continues to exist within the target area AR while facing the predetermined direction.
- the focus position control unit 34 removes the focal position from the object when the object no longer satisfies at least one of being within the target area AR and facing the predetermined direction.
- 3D shape information acquired by the object information acquisition unit 32 may be used to acquire information on the orientation of the object.
- the predetermined condition may be set by any method, for example, it may be set in advance.
- the focal position control unit 34 may read out information indicating the predetermined condition (for example, a moving direction and a moving speed) from the storage unit 22, or may acquire the information indicating the predetermined condition from another device via the communication unit 20.
- the focus position control section 34 may automatically set the predetermined condition.
- alternatively, the user may set the predetermined condition. In this case, for example, the user inputs information specifying the predetermined condition (for example, a moving direction and a moving speed) to the input unit 16, and the focal position control unit 34 sets the predetermined condition based on the information specified by the user.
- the focal position control unit 34 may focus on an object that is present in the target area AR and is performing a predetermined movement.
- the focus position control unit 34 keeps the focal position on the object while the object continues the predetermined movement, and removes the focal position from the object when the object stops performing the predetermined movement.
- in this way, by making the predetermined motion a condition for focusing in addition to being within the target area AR, an object that is moving in a specific manner can be tracked and the focal position adjusted appropriately. For example, a fall within the target area AR can be detected.
- the focus position control unit 34 may focus on an object that exists in the target area AR and has a predetermined shape.
- in this way, by making it a condition for adjusting the focal position that the object has a predetermined shape, in addition to being within the target area AR, an object with a specific shape can be tracked and the focal position adjusted appropriately.
- the focal position control unit 34 may focus on an object that exists in the target area AR and faces a predetermined direction. In this way, by making it a condition for adjusting the focal position that the object is oriented in a predetermined direction, in addition to being within the target area AR, an object facing a specific direction can be tracked and the focal position adjusted appropriately.
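Since the text states that focus is held only while the object is inside the target area AR and every configured condition (motion, shape, orientation) holds, the combined check reduces to a conjunction. A minimal sketch; the list-of-booleans interface is an assumption for illustration:

```python
def should_hold_focus(in_target_area: bool, conditions) -> bool:
    """True while the object is in the target area AR and all configured
    predetermined conditions (motion, shape, orientation, ...) are satisfied;
    failing any single one releases the focal position."""
    return in_target_area and all(conditions)
```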
- the embodiment is not limited by the contents of these embodiments.
- the components described above include those that can be easily assumed by those skilled in the art, those that are substantially the same, and those within the so-called equivalent range.
- the components described above can be combined as appropriate, and it is also possible to combine the configurations of the respective embodiments.
- various omissions, replacements, or modifications of components can be made without departing from the gist of the above-described embodiments.
- in each embodiment, the operation of adjusting the focal position has been described as the characteristic feature; however, the operation of adjusting the focal position may be combined with another operation.
- the operation of adjusting the focal position may be combined with the operation of zooming.
- the operation of adjusting the focal position may be replaced with another operation.
- the operation of adjusting the focal position may be replaced with the operation of zooming.
- the control unit 24 of the imaging device in each embodiment may, for example, notify a predetermined destination via the communication unit 20 when a set condition is satisfied, such as an object entering or exiting the predetermined target area AR or an object moving in a predetermined direction.
- the set condition here may mean, for example, that the movement of the object within the target area AR is used as a trigger to bring the object into focus.
- the imaging device, imaging system, imaging method, and program of this embodiment can be used, for example, to capture images.
Abstract
Description
(Configuration of Imaging Device)
FIG. 1 is a schematic block diagram of the imaging device according to the first embodiment. The imaging device 100 according to the first embodiment is an imaging device that captures an image of an object within its imaging range. The imaging device 100 is an autofocus camera capable of automatically setting the focal position. The imaging device 100 may be a video camera that captures moving images by imaging every predetermined frame, or a camera that captures still images. The imaging device 100 may be used for any application; for example, it may be used as a monitoring camera installed at a predetermined position inside a facility or outdoors.
The object information acquisition unit 32 acquires position information of an object existing within the imaging area AR0. The object information acquisition unit 32 controls the object position measuring unit 14 to cause it to measure the position of the object relative to the imaging device 100, and acquires the measurement result as the position information of the object. By acquiring the position information at predetermined time intervals, the object information acquisition unit 32 acquires the position information of the object sequentially. Based on the position information of the object, the object information acquisition unit 32 can also acquire information indicating the shape of the object (for example, the 3D shape of the object); for example, it can accumulate multiple pieces of position information, such as TOF image information, to acquire the 3D shape of the object.
The target area acquisition unit 30 acquires information on the target area AR set within the imaging area of the imaging device 100. The target area AR is an area set for automatically adjusting the focal position. The information on the target area AR is information indicating the position of the target area AR, that is, position information of the target area AR. The target area AR is described below.
The focal position control unit 34 sets the focal position of the imaging device 100. The focal position control unit 34 controls the focal position by controlling the position of the optical element 10, that is, by moving the position of the optical element 10.
The imaging control unit 36 controls imaging by the imaging device 100 to capture an image. The imaging control unit 36 controls, for example, the imaging element 12 to cause it to acquire an image signal; it may cause the imaging element 12 to acquire the image signal automatically or in response to a user operation.
The image acquisition unit 38 acquires the image data acquired by the imaging element 12. For example, the image acquisition unit 38 controls the image processing circuit 13 to cause it to generate image data from the image signal generated by the imaging element 12, acquires that image data, and stores it in the storage unit 22.
Next, the processing flow for setting the focal position described above will be explained. FIG. 5 is a flowchart explaining the processing flow for setting the focal position. As shown in FIG. 5, the control unit 24 acquires the position information of the reference object B by the object information acquisition unit 32 (step S10), and sets the target area AR based on the position information of the reference object B by the target area acquisition unit 30 (step S12). The control unit 24 then acquires the position information of the object by the object information acquisition unit 32 (step S14). Steps S10, S12, and S14 may be performed in any order. The control unit 24 determines, by the focal position control unit 34, whether the object is located within the target area AR based on the position information of the object (step S16). If the object is not located within the target area AR (step S16; No), the process returns to step S14 and the acquisition of the position information of the object continues. On the other hand, if the object is located within the target area AR (step S16; Yes), the focal position control unit 34 focuses on that object (step S18). Thereafter, the acquisition of the position information of the object continues, and it is determined whether the object has moved out of the target area AR (step S20). If the object has not moved out of the target area AR (step S20; No), that is, if the object continues to exist within the target area AR, the process returns to step S18 and the focal position is kept on the object. If the object has moved out of the target area AR (step S20; Yes), the focal position control unit 34 removes the focal position from the object (step S22). Thereafter, if the process is not to be ended (step S24; No), the process returns to step S14, and if the process is to be ended (step S24; Yes), this process ends.
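The enter/hold/release behavior of the FIG. 5 loop (steps S14–S22) can be sketched by replaying a stream of position samples against an in-area test. The batch-style interface is an illustrative assumption; the actual device runs this as a continuous control loop:

```python
def track_focus(positions, in_target_area):
    """Replay the FIG. 5 loop: focus when the object is inside the target
    area AR, keep focusing while it stays, release when it leaves.
    Returns the focus state after each position sample."""
    states = []
    for pos in positions:
        states.append(in_target_area(pos))
    return states
```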
As described above, the imaging device 100 according to this embodiment includes the imaging element 12, the object information acquisition unit 32, the target area acquisition unit 30, and the focal position control unit 34. The object information acquisition unit 32 acquires position information of an object existing in the imaging area AR0 of the imaging element 12. The target area acquisition unit 30 sets the target area AR based on the position information of the reference object B acquired by the object information acquisition unit 32. When an object other than the reference object B exists within the target area AR, the focal position control unit 34 controls the focal position of the imaging device 100 so as to focus on that object.
Next, the second embodiment will be described. The second embodiment differs from the first embodiment in that the target area AR is set based on the position information of a moving reference object B. Descriptions of parts of the second embodiment common to the first embodiment are omitted.
Next, the third embodiment will be described. The third embodiment differs from the first embodiment in that the focal position is adjusted to an object that exists within the target area AR and satisfies a predetermined condition. Descriptions of parts of the third embodiment common to the first embodiment are omitted. The third embodiment is also applicable to the second embodiment.
(Configuration of Imaging Device)
FIG. 9 is a schematic block diagram of the imaging device according to the fourth embodiment. The imaging device 100 according to the fourth embodiment is an imaging device that captures an image of an object within its imaging range. The imaging device 100 is an autofocus camera capable of automatically setting the focal position. The imaging device 100 may be a video camera that captures moving images by imaging every predetermined frame, or a camera that captures still images. The imaging device 100 may be used for any application; for example, it may be used as a monitoring camera installed at a predetermined position inside a facility or outdoors.
The target area acquisition unit 30 acquires information on the target area AR set within the imaging area of the imaging device 100. The target area AR is an area set for automatically adjusting the focal position. The information on the target area AR refers to shape information and movement information of the target area AR, which will be described in detail later. The target area AR is described below.
The object information acquisition unit 32 acquires position information of an object existing within the imaging area AR0. The object information acquisition unit 32 controls the object position measuring unit 14 to cause it to measure the position of the object relative to the imaging device 100, and acquires the measurement result as the position information of the object. By acquiring the position information at predetermined time intervals, the object information acquisition unit 32 acquires the position information of the object sequentially. Based on the position information of the object, the object information acquisition unit 32 can also acquire information indicating the shape of the object (for example, the 3D shape of the object); for example, it can accumulate multiple pieces of position information, such as TOF image information, to acquire the 3D shape of the object.
The focal position control unit 34 sets the focal position of the imaging device 100. The focal position control unit 34 controls the focal position by controlling the position of the optical element 10, that is, by moving the position of the optical element 10.
The imaging control unit 36 controls imaging by the imaging device 100 to capture an image. The imaging control unit 36 controls, for example, the imaging element 12 to cause it to acquire an image signal; it may cause the imaging element 12 to acquire the image signal automatically or in response to a user operation.
The image acquisition unit 38 acquires the image data acquired by the imaging element 12. For example, the image acquisition unit 38 controls the image processing circuit 13 to cause it to generate image data from the image signal generated by the imaging element 12, acquires that image data, and stores it in the storage unit 22.
Next, the processing flow for setting the focal position described above will be explained. FIG. 12 is a flowchart explaining the processing flow for setting the focal position. As shown in FIG. 12, the control unit 24 sets the target area AR by the target area acquisition unit 30 so that the target area AR moves (step S10). The control unit 24 then acquires the position information of the object by the object information acquisition unit 32 (step S12). Steps S10 and S12 may be performed in any order. The control unit 24 determines, by the focal position control unit 34, whether the object is located within the target area AR based on the position information of the object (step S14). If the object is not located within the target area AR (step S14; No), the process returns to step S10, and the acquisition of the position information of the object continues while the target area AR is updated (that is, while the target area AR is moved). On the other hand, if the object is located within the target area AR (step S14; Yes), the focal position control unit 34 determines whether the object is moving (step S16). If the object within the target area AR is not moving (step S16; No), the focal position control unit 34 returns to step S10 without focusing on the object, and the acquisition of the position information of the object continues while the target area AR is updated. If the object within the target area AR is moving (step S16; Yes), the focal position control unit 34 focuses on the object (step S18). Thereafter, while the position information of the reference object is acquired and the target area AR is updated, the acquisition of the position information of the object continues, and it is determined whether the object has moved out of the target area AR (step S20). If the object has not moved out of the target area AR (step S20; No), that is, if the object continues to exist within the target area AR, the process returns to step S18 and the focal position is kept on the object. If the object has moved out of the target area AR (step S20; Yes), the focal position control unit 34 removes the focal position from the object (step S22). Thereafter, if the process is not to be ended (step S24; No), the process returns to step S10, and if the process is to be ended (step S24; Yes), this process ends.
As described above, the imaging device 100 according to this embodiment includes the imaging element 12, the object information acquisition unit 32, the target area acquisition unit 30, and the focal position control unit 34. The object information acquisition unit 32 acquires position information of an object existing in the imaging area AR0 of the imaging element 12. The target area acquisition unit 30 sets the target area AR so as to move within the imaging area AR0. When an object exists within the target area AR, the focal position control unit 34 controls the focal position of the imaging device 100 so as to focus on that object.
Next, the fifth embodiment will be described. The fifth embodiment differs from the fourth embodiment in that the target area AR is set based on the position information of a moving reference object B. Descriptions of parts of the fifth embodiment common to the fourth embodiment are omitted.
Next, the sixth embodiment will be described. The sixth embodiment differs from the fourth embodiment in that the focal position is adjusted to an object that exists within the target area AR and satisfies a predetermined condition. Descriptions of parts of the sixth embodiment common to the fourth embodiment are omitted. The sixth embodiment is also applicable to the fifth embodiment.
(Configuration of Imaging Device)
FIG. 17 is a schematic block diagram of the imaging device according to the seventh embodiment. The imaging device 100 according to the seventh embodiment is an imaging device that captures an image of an object within its imaging range. The imaging device 100 is an autofocus camera capable of automatically setting the focal position. The imaging device 100 may be a video camera that captures moving images by imaging every predetermined frame, or a camera that captures still images. The imaging device 100 may be used for any application; for example, it may be used as a monitoring camera installed at a predetermined position inside a facility or outdoors.
The self-position acquisition unit 28 acquires position information of the imaging device 100. The position information of the imaging device 100 is information indicating the position (coordinates) and attitude (orientation) of the imaging device 100. The self-position acquisition unit 28 controls the self-position measuring unit 15 to cause it to measure the position and attitude of the imaging device 100, and acquires the measurement result as the position information of the imaging device 100. The self-position acquisition unit 28 acquires the position information of the imaging device 100 sequentially by acquiring it at predetermined time intervals. Note that the position information of the imaging device 100 is not limited to both the position and the attitude of the imaging device 100, and may be information indicating at least one of the position and the attitude of the imaging device 100.
(Acquisition of Target Area)
The target area acquisition unit 30 acquires information on the target area AR set within the imaging area of the imaging device 100. The target area AR is an area set for automatically adjusting the focal position. The information on the target area AR is information indicating the position of the target area AR, that is, position information of the target area AR. The target area AR is described below.
Here, it is required to set the target area AR appropriately even when the imaging device 100 moves. To this end, when the self-position acquisition unit 28 determines that the imaging device 100 has moved, the target area acquisition unit 30 fixes the position of the target area AR and does not move the target area AR. This fixes the target area AR and prevents the area of interest from changing unintentionally due to the movement of the imaging device 100.
The object information acquisition unit 32 acquires position information of an object existing within the imaging area AR0. The object information acquisition unit 32 controls the object position measuring unit 14 to cause it to measure the position of the object relative to the imaging device 100, and acquires the measurement result as the position information of the object. By acquiring the position information at predetermined time intervals, the object information acquisition unit 32 acquires the position information of the object sequentially. Based on the position information of the object, the object information acquisition unit 32 can also acquire information indicating the shape of the object (for example, the 3D shape of the object); for example, it can accumulate multiple pieces of position information, such as TOF image information, to acquire the 3D shape of the object.
The focal position control unit 34 sets the focal position of the imaging device 100. The focal position control unit 34 controls the focal position by controlling the position of the optical element 10, that is, by moving the position of the optical element 10.
The imaging control unit 36 controls imaging by the imaging device 100 to capture an image. The imaging control unit 36 controls, for example, the imaging element 12 to cause it to acquire an image signal; it may cause the imaging element 12 to acquire the image signal automatically or in response to a user operation.
The image acquisition unit 38 acquires the image data acquired by the imaging element 12. For example, the image acquisition unit 38 controls the image processing circuit 13 to cause it to generate image data from the image signal generated by the imaging element 12, acquires that image data, and stores it in the storage unit 22.
Next, the processing flow for setting the focal position described above will be explained. FIG. 25 is a flowchart explaining the processing flow for setting the focal position. As shown in FIG. 25, the control unit 24 acquires the information of the target area AR by the target area acquisition unit 30 (step S20), and acquires the position information of the object by the object information acquisition unit 32 (step S22). Steps S20 and S22 may be performed in any order. The control unit 24 determines, by the focal position control unit 34, whether the object is located within the target area AR based on the position information of the object (step S24). If the object is not located within the target area AR (step S24; No), the process returns to step S22 and the acquisition of the position information of the object continues. On the other hand, if the object is located within the target area AR (step S24; Yes), the focal position control unit 34 focuses on that object (step S26). Thereafter, the acquisition of the position information of the object continues, and it is determined whether the object has moved out of the target area AR (step S28). If the object has not moved out of the target area AR (step S28; No), that is, if the object continues to exist within the target area AR, the process returns to step S26 and the focal position is kept on the object. If the object has moved out of the target area AR (step S28; Yes), the focal position control unit 34 removes the focal position from the object (step S30). Thereafter, if the process is not to be ended (step S32; No), the process returns to step S22, and if the process is to be ended (step S32; Yes), this process ends.
As described above, the imaging device 100 according to this embodiment includes the imaging element 12, the self-position acquisition unit 28, the object information acquisition unit 32, the target area acquisition unit 30, and the focal position control unit 34. The self-position acquisition unit 28 acquires the position information of the imaging device 100, the object information acquisition unit 32 acquires position information of an object existing in the imaging area AR0 of the imaging element 12, and the target area acquisition unit 30 sets the target area AR within the imaging area AR0. When an object exists within the target area AR, the focal position control unit 34 controls the focal position of the imaging device 100 so as to focus on that object. When the self-position acquisition unit 28 determines that the imaging device 100 has moved, the target area acquisition unit 30 fixes the position of the target area AR.
Next, the eighth embodiment will be described. The eighth embodiment differs from the seventh embodiment in that an alarm is issued when, with the position of the target area AR fixed, the imaging device 100A moves and the distance between the boundary of the imaging area AR0 and the target area AR becomes less than a predetermined distance. Descriptions of parts of the eighth embodiment common to the seventh embodiment are omitted.
Next, the alarm notification flow described above will be explained. FIG. 28 is a flowchart explaining the alarm notification flow. As shown in FIG. 28, the control unit 24 acquires the information of the target area AR by the target area acquisition unit 30 (step S40) and sets the target area AR. The control unit 24 then determines, by the self-position acquisition unit 28, whether the imaging device 100A has moved (step S42). If it is determined that the imaging device 100A has moved (step S42; Yes), the control unit 24 determines, by the notification control unit 40, whether the distance D between the target area AR and the boundary B is less than a predetermined distance (step S44). If the distance D between the target area AR and the boundary B is less than the predetermined distance, the notification control unit 40 outputs an alarm (step S46). Thereafter, if the process is not to be ended (step S48; No), the process returns to step S42, and if the process is to be ended (step S48; Yes), this process ends. On the other hand, if it is determined that the imaging device 100A has not moved (step S42; No), the process also proceeds to step S48. If it is determined that the imaging device 100A has moved but the distance D between the target area AR and the boundary B is not less than the predetermined distance (step S44; No), that is, if the distance D is equal to or greater than the predetermined distance, the process proceeds to step S48 without moving the target area AR.
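The distance check of step S44 in FIG. 28 can be sketched in one dimension: the target area AR stays fixed while the imaging area's boundary B moves with the device, and an alarm fires when the gap drops below the predetermined distance. The 1-D geometry is an illustrative simplification, not the patent's formulation:

```python
def should_alarm(target_area_edge: float, boundary_pos: float,
                 min_distance: float) -> bool:
    """FIG. 28, step S44: alarm when the distance D between the fixed target
    area AR and the boundary B of the (moved) imaging area is below the
    predetermined distance."""
    d = abs(boundary_pos - target_area_edge)
    return d < min_distance
```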
Next, the ninth embodiment will be described. The ninth embodiment differs from the seventh embodiment in that the focal position is adjusted to an object that exists within the target area AR and satisfies a predetermined condition. Descriptions of parts of the ninth embodiment common to the seventh embodiment are omitted. The ninth embodiment is also applicable to the eighth embodiment.
(Configuration of Imaging Device)
FIG. 30 is a schematic block diagram of the imaging device according to the tenth embodiment. The imaging device 100 according to the tenth embodiment is an imaging device that captures an image of an object within its imaging range. The imaging device 100 is an autofocus camera capable of automatically setting the focal position. The imaging device 100 may be a video camera that captures moving images by imaging every predetermined frame, or a camera that captures still images. The imaging device 100 may be used for any application; for example, it may be used as a monitoring camera installed at a predetermined position inside a facility or outdoors.
In this embodiment, a plurality of imaging devices 100 perform imaging, and the target areas AR of the respective imaging devices 100 are set so that parts of the target areas AR overlap each other. Hereinafter, an imaging system having a plurality of imaging devices 100 is referred to as an imaging system 1. In the following, an example in which the imaging system 1 includes a first imaging device 100a and a second imaging device 100b is described, but the number of imaging devices 100 in the imaging system 1 is not limited to two and may be any number of three or more.
Each imaging device 100 sets a target area AR. A method of setting the target area AR is described below. In the following, the target area AR of the first imaging device 100a is referred to as a first target area ARa, and the target area AR of the second imaging device 100b is referred to as a second target area ARb. When the first target area ARa and the second target area ARb are not distinguished, they are simply referred to as the target area AR.
The first imaging device 100a acquires, by its target area acquisition unit 30, information on the target area AR (first target area ARa) set within the imaging area AR0 of the first imaging device 100a. The target area AR is an area set for automatically adjusting the focal position. The information on the target area AR is information indicating the position of the target area AR, that is, position information of the target area AR.
The first imaging device 100a acquires region position information by the region position information acquisition unit 33. The region position information is information indicating the position (relative position) of the first target area ARa with respect to the reference object B. Specifically, the first imaging device 100a acquires, by the object information acquisition unit 32, position information of the reference object B existing within the imaging area AR0. The object information acquisition unit 32 controls the object position measuring unit 14 to cause it to measure the position of the reference object B relative to the first imaging device 100a, and acquires the measurement result as the position information of the reference object B.
The target area acquisition unit 30 of the second imaging device 100b acquires the region position information from the first imaging device 100a via the communication unit 20. The second imaging device 100b sets, by the target area acquisition unit 30, the second target area ARb based on the region position information acquired from the first imaging device 100a. This is described in detail below.
The flow of setting the target areas AR of the plurality of imaging devices 100 described above will be explained. FIG. 33 is a flowchart explaining the flow of setting the target areas. As shown in FIG. 33, the first imaging device 100a sets the first target area ARa by the target area acquisition unit 30 (step S10), and acquires, by the object information acquisition unit 32, the position information of the reference object B (information on the position of the reference object B relative to the first imaging device 100a) (step S12). Steps S10 and S12 may be performed in any order. Thereafter, the first imaging device 100a acquires the region position information based on the position information of the reference object B by the region position information acquisition unit 33 (step S14), and transmits the region position information to the second imaging device 100b (step S16).
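Steps S12–S16 of FIG. 33 amount to expressing the first target area relative to the reference objects B and letting the second device recover it from its own measurements of the same reference objects. A sketch with 3-D tuples; the averaging step on the receiving side is an assumption about how multiple reference objects might be combined, not the patent's prescribed method:

```python
def region_position_info(reference_points, area_origin):
    """Offset of the first target area's origin from each reference object B
    (the region position information the first imaging device 100a sends)."""
    return [tuple(a - r for r, a in zip(ref, area_origin))
            for ref in reference_points]

def reconstruct_area_origin(reference_points, offsets):
    """What the second imaging device 100b can do with the received offsets:
    recover the area origin from its own measurements of the reference
    objects (here averaged over all of them)."""
    n = len(reference_points)
    sums = [0.0, 0.0, 0.0]
    for ref, off in zip(reference_points, offsets):
        for i in range(3):
            sums[i] += ref[i] + off[i]
    return tuple(s / n for s in sums)
```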
Next, a method of setting the focal position will be described. Since the setting of the focal position is the same in each imaging device 100 (in this example, the first imaging device 100a and the second imaging device 100b), the devices are simply described as the imaging device 100.
The imaging control unit 36 controls imaging by the imaging device 100 to capture an image. The imaging control unit 36 controls, for example, the imaging element 12 to cause it to acquire an image signal; it may cause the imaging element 12 to acquire the image signal automatically or in response to a user operation.
The image acquisition unit 38 acquires the image data acquired by the imaging element 12. For example, the image acquisition unit 38 controls the image processing circuit 13 to cause it to generate image data from the image signal generated by the imaging element 12, acquires that image data, and stores it in the storage unit 22.
Next, the processing flow for setting the focal position described above will be explained. FIG. 35 is a flowchart explaining the processing flow for setting the focal position. As shown in FIG. 35, the control unit 24 acquires the information of the target area AR by the target area acquisition unit 30 (step S30), and acquires the position information of the object by the object information acquisition unit 32 (step S32). Steps S30 and S32 may be performed in any order. The control unit 24 determines, by the focal position control unit 34, whether the object is located within the target area AR based on the position information of the object (step S34). If the object is not located within the target area AR (step S34; No), the process returns to step S32 and the acquisition of the position information of the object continues. On the other hand, if the object is located within the target area AR (step S34; Yes), the focal position control unit 34 focuses on that object (step S36). Thereafter, the acquisition of the position information of the object continues, and it is determined whether the object has moved out of the target area AR (step S38). If the object has not moved out of the target area AR (step S38; No), that is, if the object continues to exist within the target area AR, the process returns to step S36 and the focal position is kept on the object. If the object has moved out of the target area AR (step S38; Yes), the focal position control unit 34 removes the focal position from the object (step S40). Thereafter, if the process is not to be ended (step S42; No), the process returns to step S32, and if the process is to be ended (step S42; Yes), this process ends.
Next, the eleventh embodiment will be described. The eleventh embodiment differs from the tenth embodiment in that, when an object is located in an overlapping area ARW where the target areas AR of the respective imaging devices 100 overlap each other, the imaging device 100 that focuses on the object is selected based on the position from which the object enters. Descriptions of parts of the eleventh embodiment common to the tenth embodiment are omitted.
Next, the twelfth embodiment will be described. The twelfth embodiment differs from the tenth embodiment in that the imaging device 100 that focuses on an object located in the overlapping area ARW is assigned based on designation information that designates whether or not to adjust the focal position when an object is located in the overlapping area ARW. Descriptions of parts of the twelfth embodiment common to the tenth embodiment are omitted. The twelfth embodiment is also applicable to the eleventh embodiment.
Next, the thirteenth embodiment will be described. The thirteenth embodiment differs from the tenth embodiment in that the focal position is adjusted to an object that exists within the target area AR and satisfies a predetermined condition. Descriptions of parts of the thirteenth embodiment common to the tenth embodiment are omitted. The thirteenth embodiment is also applicable to the eleventh and twelfth embodiments.
12 imaging element
14 object position measuring unit
30 target area acquisition unit
32 object information acquisition unit
34 focal position control unit
AR target area
AR0 imaging area
B reference object
Claims (20)
- An imaging device capable of imaging an object, the imaging device comprising:
an imaging element;
an object information acquisition unit that acquires position information of an object existing in an imaging area of the imaging element;
a target area acquisition unit that sets a target area based on position information of a reference object acquired by the object information acquisition unit; and
a focal position control unit that, when an object other than the reference object exists within the target area, controls a focal position of the imaging device so as to focus on that object.
- The imaging device according to claim 1, wherein the target area acquisition unit sets an area around the reference object as the target area.
- The imaging device according to claim 1, wherein the target area acquisition unit sets an area surrounded by a plurality of the reference objects as the target area.
- The imaging device according to any one of claims 1 to 3, wherein the target area acquisition unit sets the target area based on position information of the reference object that is stationary within the imaging area.
- The imaging device according to any one of claims 1 to 4, wherein the target area acquisition unit sets the target area based on position information of the reference object located between a first position at a first distance from the imaging device and a second position at a second distance, shorter than the first distance, from the imaging device.
- The imaging device according to claim 1, wherein the target area acquisition unit sets the target area so as to move within the imaging area, and the focal position control unit, when an object exists within the target area, controls the focal position of the imaging device so as to focus on that object.
- The imaging device according to claim 6, wherein the focal position control unit does not focus on an object that is not moving when the object is located within the target area, and focuses on a moving object when the object is located within the target area.
- The imaging device according to claim 6 or 7, wherein the target area acquisition unit sets the target area so as to move based on position information, acquired by the object information acquisition unit, of a moving reference object.
- The imaging device according to claim 8, wherein the target area acquisition unit sets the target area so as to move with the movement of the reference object so that the position of the target area with respect to the reference object is kept the same.
- The imaging device according to any one of claims 6 to 9, wherein the target area acquisition unit sets the target area so that the target area is located between a first position at a first distance from the imaging device and a second position at a second distance, shorter than the first distance, from the imaging device.
- The imaging device according to claim 1, further comprising a self-position acquisition unit that acquires position information of the imaging device, wherein the focal position control unit, when an object exists within the target area, controls the focal position of the imaging device so as to focus on that object, and the target area acquisition unit keeps the position of the target area fixed when the self-position acquisition unit determines that the imaging device has moved.
- The imaging device according to claim 11, wherein the target area acquisition unit, when set to a first mode in which the position of the target area is fixed, fixes the position of the target area when the self-position acquisition unit determines that the imaging device has moved, and, when set to a second mode in which the position of the target area is not fixed, changes the position of the target area when the self-position acquisition unit determines that the imaging device has moved.
- The imaging device according to claim 11 or 12, further comprising a notification control unit that outputs an alarm when the imaging area moves due to movement of the imaging device and the distance from the target area to a boundary position between the inside and the outside of the imaging area becomes less than a predetermined distance.
- The imaging device according to any one of claims 11 to 13, wherein the target area acquisition unit sets the target area so that the target area is located between a first position at a first distance from the imaging device and a second position at a second distance, shorter than the first distance, from the imaging device.
- An imaging system comprising a plurality of imaging devices according to claim 1, wherein the focal position control unit of each imaging device, when an object exists within the target area, controls the focal position of the imaging device so as to focus on that object, and the target area acquisition unit of each imaging device sets the target area so that at least parts of the respective target areas overlap each other.
- The imaging system according to claim 15, comprising at least a first imaging device and a second imaging device as the plurality of imaging devices, wherein the first imaging device has a region position information acquisition unit that acquires, based on position information of a reference object acquired by the object information acquisition unit, region position information indicating a position, with respect to the reference object, of a first target area that is the target area of the first imaging device, and the target area acquisition unit of the second imaging device acquires the region position information from the first imaging device and sets, based on the region position information, a second target area that is the target area of the second imaging device.
- The imaging system according to claim 16, wherein the region position information acquisition unit of the first imaging device acquires, as the region position information, information indicating the position of the first target area with respect to each of three or more reference objects.
- The imaging system according to claim 16 or 17, wherein the target area acquisition unit of the second imaging device sets the second target area also based on information on the reference object.
- An imaging method for imaging an object, the imaging method comprising:
acquiring position information of an object existing in an imaging area;
setting a target area based on position information of a reference object acquired in the step of acquiring the position information of the object; and
controlling, when an object other than the reference object exists within the target area, a focal position of an imaging device so as to focus on that object.
- A program causing a computer to execute an imaging method for imaging an object, the program causing the computer to execute:
acquiring position information of an object existing in an imaging area;
setting a target area based on position information of a reference object acquired in the step of acquiring the position information of the object; and
controlling, when an object other than the reference object exists within the target area, a focal position of an imaging device so as to focus on that object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280059605.0A CN117917090A (zh) | 2021-09-27 | 2022-07-29 | 拍摄装置、拍摄***、拍摄方法以及程序 |
Applications Claiming Priority (16)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-157250 | 2021-09-27 | ||
JP2021-157147 | 2021-09-27 | ||
JP2021-157146 | 2021-09-27 | ||
JP2021-156863 | 2021-09-27 | ||
JP2021-157148 | 2021-09-27 | ||
JP2021157244A JP2023048013A (ja) | 2021-09-27 | 2021-09-27 | 撮像装置、撮像方法及びプログラム |
JP2021156863A JP2023047764A (ja) | 2021-09-27 | 2021-09-27 | 撮像装置、撮像方法及びプログラム |
JP2021-157244 | 2021-09-27 | ||
JP2021157148A JP2023047944A (ja) | 2021-09-27 | 2021-09-27 | 撮像装置、撮像方法及びプログラム |
JP2021156799A JP2023047714A (ja) | 2021-09-27 | 2021-09-27 | 撮像装置、撮像方法及びプログラム |
JP2021156800A JP2023047715A (ja) | 2021-09-27 | 2021-09-27 | 撮像システム、撮像方法及びプログラム |
JP2021157146A JP2023047942A (ja) | 2021-09-27 | 2021-09-27 | 撮像装置、撮像方法及びプログラム |
JP2021-156800 | 2021-09-27 | ||
JP2021-156799 | 2021-09-27 | ||
JP2021157250A JP2023048019A (ja) | 2021-09-27 | 2021-09-27 | 撮像装置、撮像方法及びプログラム |
JP2021157147A JP2023047943A (ja) | 2021-09-27 | 2021-09-27 | 撮像装置、撮像方法及びプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023047804A1 true WO2023047804A1 (ja) | 2023-03-30 |
Family
ID=85719406
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/029384 WO2023047804A1 (ja) | 2021-09-27 | 2022-07-29 | 撮像装置、撮像システム、撮像方法及びプログラム |
PCT/JP2022/029298 WO2023047802A1 (ja) | 2021-09-27 | 2022-07-29 | 撮像装置及び撮像方法 |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/029298 WO2023047802A1 (ja) | 2021-09-27 | 2022-07-29 | 撮像装置及び撮像方法 |
Country Status (1)
Country | Link |
---|---|
WO (2) | WO2023047804A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005167517A (ja) * | 2003-12-01 | 2005-06-23 | Olympus Corp | 画像処理装置、画像処理装置のキャリブレーション方法及び画像処理プログラム |
JP2012165340A (ja) * | 2011-02-09 | 2012-08-30 | Olympus Imaging Corp | 携帯機器およびプログラム |
JP2016027704A (ja) * | 2014-07-04 | 2016-02-18 | パナソニックIpマネジメント株式会社 | 撮像装置 |
WO2017141746A1 (ja) | 2016-02-19 | 2017-08-24 | ソニー株式会社 | 撮像装置、撮像制御方法、およびプログラム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02118610A (ja) * | 1988-10-28 | 1990-05-02 | Asahi Optical Co Ltd | 自動焦点カメラのフォーカスリミッター |
JP5384172B2 (ja) * | 2009-04-03 | 2014-01-08 | 富士フイルム株式会社 | オートフォーカスシステム |
JP2018084571A (ja) * | 2016-11-11 | 2018-05-31 | 株式会社東芝 | 処理装置、撮像装置および自動制御システム |
JP6923160B2 (ja) * | 2017-11-28 | 2021-08-18 | 株式会社ザクティ | 追尾制御装置 |
JP6998454B2 (ja) * | 2018-03-29 | 2022-01-18 | 富士フイルム株式会社 | 撮像装置、撮像方法、プログラム及び記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
WO2023047802A1 (ja) | 2023-03-30 |
Legal Events

- 121: The EPO has been informed by WIPO that EP was designated in this application. (Ref document number: 22872560; Country: EP; Kind code: A1)
- WWE: WIPO information, entry into national phase. (Ref document number: 202280059605.0; Country: CN)
- WWE: WIPO information, entry into national phase. (Ref document number: 2022872560; Country: EP)
- ENP: Entry into the national phase. (Ref document number: 2022872560; Country: EP; Effective date: 20240326)
- NENP: Non-entry into the national phase. (Ref country code: DE)