GB2539495A - Improvements relating to time-of-flight cameras - Google Patents

Improvements relating to time-of-flight cameras

Info

Publication number
GB2539495A
GB2539495A
Authority
GB
United Kingdom
Prior art keywords
camera
radiation
light
fish
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1510791.5A
Other versions
GB201510791D0 (en)
GB2539495B (en)
Inventor
Nathan Pyne-Carter
Jeffrey Andrew Lines
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ACE AQUATEC Ltd
Original Assignee
ACE AQUATEC Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ACE AQUATEC Ltd filed Critical ACE AQUATEC Ltd
Priority to GB1510791.5A
Publication of GB201510791D0
Publication of GB2539495A
Application granted
Publication of GB2539495B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/32 Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/36 Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated with phase comparison between the received signal and the contemporaneously transmitted signal
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00 Culture of aquatic animals
    • A01K61/90 Sorting, grading, counting or marking live aquatic animals, e.g. sex determination
    • A01K61/95 Sorting, grading, counting or marking live aquatic animals, e.g. sex determination specially adapted for fish
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects
    • G06V20/653 Three-dimensional objects by matching three-dimensional models, e.g. conformal mapping of Riemann surfaces

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A Time-of-Flight (ToF) camera is used underwater for estimating the position, number, depth, shape, size or mass of freely movable objects, particularly underwater objects such as fish. The apparatus comprises a modulated electromagnetic (EM) radiation source configured to illuminate an object of interest and a 3D Time-of-Flight (ToF) camera comprising a matrix of pixels. The matrix of pixels is configured to gather reflected modulated EM radiation, and a controller is configured to determine depth information from phase information of the reflected EM radiation gathered by the matrix of pixels. The modulating EM radiation source comprises at least one wavelength component with a wavelength lower than that of visible red light. Also provided is a similar apparatus in which the modulated radiation source is configured to emit a sufficiently high intensity of radiation so that phase information relating to distance information is extractable from the reflected radiation received by the camera.

Description

Improvements Relating to Time-of-Flight Cameras
Field of Invention
The invention relates to system(s), apparatus and method(s) using Time-of-Flight (ToF) cameras underwater, e.g. for estimating presence and/or position, and/or number, and/or depth, and/or shape, and/or size and/or mass of objects, such as freely movable objects (e.g. fish) underwater.
Background to Invention
WO2014/098614 (BRINGSDAL) describes a system and method for calculating physical dimensions of freely moveable objects in water by projecting a known light pattern onto the objects. BRINGSDAL uses the structured light method with a 2D camera for generating 3D models of objects. BRINGSDAL uses infrared light, explaining that this is absorbed quickly in water, offering lower interference from external sources, and uses filters to collect selected wavelengths of light. Use of multiple 2D cameras, e.g. one of which accepts only infrared light and one of which may accept only green light, is described.
HASSO PLATTNER INSTITUT issued a human computer interaction Masters Project in Summer 2011 entitled "HCI Project: Underwater Motion Capture using Depth Cameras" (by Professor Dr Patrick BAUDISCH) requiring students to create a gestural interface that works underwater, for example, using the Xbox® 360 KINECT®. This relies on a triangulation technique.
Dr. Ing. Thorsten RINGBECK describes "A 3D Time-of-Flight Camera for Object Detection" in Optical 3D Measurement Techniques 09-12.07.2007 ETH Zurich, Plenary Session 1: Ranging Imaging 1, describing principles and application to tracking of luggage and car collision detection and recommending use of 2D and 3D images where high resolution is needed.
QUALISYS describe "Marine Applications - An objective real time tracking method for marine applications", using data from two or more cameras (stereoscopic) to calculate the 3D position of markers. The system has a waterproof housing and a cyan light strobe (www.qualisys.com).
WO2012/038415 (WINGAN) describes a fish counter for the registration of numbers and/or weight distribution of marine organisms.
NO330423 (RED and OLSON) describes a device for the determination of the volume or mass of an object suspended in a medium using distance image data and amplitude image data. Use of a 3D camera in combination with a grayscale 2D camera is described. The image data in RED and OLSON appears to be of relatively low resolution.
"Deformable Object Detection in Underwater ToF Videos" by Slobodan ILIC was presented at a Dagstuhl seminar 12431 on 26 October 2012. In the abstract, ILIC describes the goal of the research is to build an automated and relatively affordable tool for biomass estimation indicating they will rely on ToF camera images acquired underwater for a period of time. ILIC states this can be achieved by first detecting the fishes in every range image of the incoming video stream and then fitting a 3D model to these detections. ILIC introduces the technical challenges in data acquisition using a ToF camera underwater and the difficulties constructing reliable acquisition devices underwater. A practical underwater ToF camera is not described.
Chie-Leung TSUI describes "Using a Time-of-Flight method for Underwater 3D depth Measurements and Point Cloud Imaging" (Abstract, Oceans 2015 Taipei, 7-10 April 2014). TSUI describes use of existing cameras, including structured light depth cameras and time-of-flight cameras operating in the infrared range, using an external light source above the water for taking images underwater, and developing point cloud images from the measurements taken.
Technical White Paper SLOA190B from TEXAS INSTRUMENTS, dated January 2014, revised May 2014, entitled "Time-of-Flight Camera - an Introduction" by Larry LI, describes time-of-flight camera measurements. Technical White Paper SBAU219D from TEXAS INSTRUMENTS, dated December 2013, revised May 2014, entitled "Introduction to the Time-of-Flight (ToF) System Design", describes Time-of-Flight system design. These Technical White Papers describe pulsed measurements, which rely on the round trip time of the pulse indicating distance, and also continuous wave measurements (periodic measurements), which use the phase difference between transmitted and received signals as an indicator of the round trip time and hence distance. Single pixel ToF methods use pulsed measurements and provide distance information for a single spot. If this spot is scanned over a scene, a distance map of the scene can be built up. Multiple pixel ToF cameras typically use continuous wave measurements and a matrix of pixels to obtain a depth map from phase information derived from reflected illumination of an entire scene.
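As a minimal, hedged illustration of the continuous-wave principle summarised above (not taken from the patent or the cited white papers), the sketch below converts a measured phase difference into a distance estimate, assuming a known modulation frequency and an assumed propagation speed of light in water; the unambiguous range c/(2f) before the phase wraps is also shown. The names and numbers are illustrative assumptions only.

```python
import math

C_WATER = 2.25e8  # assumed propagation speed of light in water (m/s), roughly c/1.33

def cw_tof_distance(phase_rad: float, mod_freq_hz: float, speed: float = C_WATER) -> float:
    """Distance implied by a continuous-wave phase shift: d = speed * phase / (4 * pi * f)."""
    return speed * phase_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float, speed: float = C_WATER) -> float:
    """Maximum distance before the phase wraps (aliases): speed / (2 * f)."""
    return speed / (2.0 * mod_freq_hz)

if __name__ == "__main__":
    f_mod = 30e6  # 30 MHz, within the 20-100 MHz range mentioned later in the description
    print(unambiguous_range(f_mod))             # ~3.75 m for the assumed speed
    print(cw_tof_distance(math.pi / 2, f_mod))  # a 90 degree phase shift -> ~0.94 m
```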
PIATTI and RINAUDO describe "SR-4000 and CamCube 3.0 Time-of-Flight (ToF) Cameras: Tests and Comparison" in Remote Sens. 2012, 4, 1069-1089; doi: 10.3390/rs4041069.
TILLETT, McFARLANE and LINES describe "Estimating Dimensions of Free-Swimming Fish Using 3D Point Distribution Models" in Computer Vision and Image Understanding, Vol. 79, 123-141 (2000).
DARMONT (www.aphesa.com) describes the "Spectral Response of Silicon Image Sensors" in a Technical White Paper dated April 2009, explaining that the probability of interaction of a photon with silicon depends on how much higher than 1.1 eV (= 1125nm, the silicon band gap) the energy of that photon is. In this paper the responsiveness of silicon to the wavelength of incoming photons is shown to be optimal above 700nm, in the red/infra-red range.
2G ROBOTICS describe underwater single point laser scanning based on a triangulation method using a blue/green laser light offset by a fixed distance from a sensor. Laser scanning appears to provide sufficient intensity of light for a triangulation method but is slow, relying on the laser travelling across every point in a scene to develop a depth image, so is less suitable for capturing moving images.
At least one observer (in "3D Sensors, Lecture 3: Time of Flight Cameras - Continuous Wave Modulation" by Radu Horaud, INRIA Grenoble (downloaded 6 May 2015 from http://perception.inrialpes.fr/)) describes the need for a long integration time over several time periods to increase signal to noise ratio (SNR) and hence accuracy, which increases motion blur, rendering this less suitable for capturing moving images.
Time-of-Flight (ToF) cameras have become fairly common in consumer applications such as computer gaming; however, industrial Time-of-Flight cameras remain expensive and have comparatively low resolution. Time-of-Flight cameras have been used in robotic cow milking machines to identify teat position. The animals are slow moving and in any case are located within a milking stall (a frame) during such imaging.
In aquaculture, such as fish farming, harvest estimation and sale of stock is often carried out prior to harvesting. Being able to estimate, pre-harvest, the harvested weight of the farmed animals (e.g. fish) (with appropriate corrections for pre-harvesting reduction in food, blood loss and gut removal, etc.) is therefore an important step in the process. Existing biomass estimation systems typically rely on fish having to swim through a frame, which can introduce errors to the assessment of the fish within that population if some fish do not swim through the frame. Other existing systems use 2D cameras to image the fish; however, these systems suffer from inaccuracies, e.g. i) due to not knowing the distance to the fish (and hence the scale), ii) difficulties segmenting or discriminating the fish from the background cluttering the image, and iii) fish swimming at an angle to the 2D camera being misrepresented as short squat fish.
It is, therefore, desirable to provide an apparatus and method that uses a depth sensing camera to capture a depth sensed image of an object underwater, e.g. a freely moveable object such as a swimming fish, and then applies appropriate machine vision algorithms to estimate position, and/or number, and/or depth, and/or shape, and/or size, and/or mass of objects.
None of the above documents address several problems encountered in obtaining depth images of marine organisms underwater. The present invention in its various aspects aims to alleviate some of the problems and difficulties found in the existing approaches exemplified by the above documents.
The present invention, in one or more aspects, aims to provide an improved ToF camera system for imaging underwater.
The present invention, in one or more aspects, aims to provide an improved ToF camera system for imaging swiftly moving and/or swiftly deformable objects underwater.
The present invention in one or more aspects aims to provide method(s) and apparatus for imaging underwater that alleviate one or more of the problems described herein and/or offer one or more of the following improvements: improved discrimination with respect to background; improved discrimination with respect to multiple objects within an image; improved control over imaging; improved control over illumination of images; improved control over discrimination with respect to background and/or multiple objects within an image; improved object detection; and improved system(s), method(s) and apparatus for estimating and/or determining presence, and/or position, and/or number, and/or depth, and/or shape, and/or size, and/or mass, of freely moveable objects.
Statement of Invention
In a first aspect the invention comprises: an apparatus for determining the presence, and/or position, and/or number and/or depth, and/or shape, and/or size, and/or mass, of freely movable objects, comprising: - a modulated electromagnetic (EM) radiation source configured to illuminate an object of interest; a 3D Time-of-Flight (ToF) camera comprising a matrix of pixels, the matrix of pixels configured to gather reflected modulated EM radiation; a controller configured to determine depth information from phase information of reflected EM radiation gathered by the matrix of pixels; - wherein the modulating EM radiation source comprises at least one wavelength component; - and further wherein at least one wavelength component of the modulating EM radiation source has a wavelength (when measured in air) lower than that of visible red light.
In a further aspect, there is provided a camera system for use underwater comprising the apparatus. In a further aspect there is provided a method for determining the presence, and/or position, and/or number, and/or depth, and/or shape, and/or size, and/or mass, of freely movable objects comprising: providing a modulated electromagnetic (EM) radiation source configured to illuminate an object of interest; providing a 3D Time-of-Flight (ToF) camera comprising a matrix of pixels, the matrix of pixels configured to gather reflected modulated EM radiation; providing a controller configured to determine depth information from phase information of reflected EM radiation gathered by the matrix of pixels; wherein the modulating EM radiation source comprises at least one wavelength component; and further wherein at least one wavelength component of the modulating EM radiation source has a wavelength (in air) lower than that of visible red light.
In a further aspect there is provided a method for determining the presence, and/or position, and/or depth, and/or shape, and/or size, and/or mass, and/or number, of freely movable objects, comprising: providing a modulated electromagnetic (EM) radiation source configured to illuminate an object of interest; providing a 3D Time-of-Flight (ToF) camera comprising a matrix of pixels, the matrix of pixels configured to gather reflected modulated EM radiation; providing a controller configured to determine depth information from phase information of reflected EM radiation gathered by the matrix of pixels; wherein the modulating EM radiation source comprises at least one wavelength component; and further wherein at least one wavelength component of the modulating EM radiation source has a wavelength lower than that of visible red light, the method further comprising one or more of the following: a) converting a depth image to X, Y, Z co-ordinates; and/or, b) determining if an object in the image matches a pre-determined shape; and/or, c) determining if an expected object is present if the predetermined shape is present; and/or, d) determining depth about a periphery of an object in the image; and/or, e) identifying a principal axis, or other suitable dimension, of an object in the image; and/or, f) identifying an intersection of a principal axis or other suitable dimension with a contour of an object in the image; and/or, g) identifying peripheral features of an object in an image from a change in gradient of a contour of an expected object; and/or, h) identifying an opposite point to a previously identified peripheral feature; and/or, i) determining depth along a principal axis, or other suitable dimension, and if the depth is determined to vary along the principal axis, or the other suitable dimension, either rotating the data of an object in an image and/or correcting the data so that the actual length of the principal axis of the object, or other suitable dimension, can be determined; and/or, j) using one or more dimensions (such as length, width and height) of the pre-determined shape to estimate one or more dimensions and/or size and/or shape of an object. Optionally, the controller, or an integral or separate microprocessor, is arranged to carry out one or more steps of the method.
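By way of illustration only, the following minimal sketch (not part of the patent text) shows how steps a), e) and i) above might be realised in software: a depth image is converted to X, Y, Z co-ordinates using an assumed pinhole camera model, the object is segmented by an assumed simple depth threshold, and a principal axis is found by principal component analysis so that a length can be measured along it even when the object is angled towards or away from the camera. The intrinsics, threshold values and helper names are assumptions, not values from the patent.

```python
import numpy as np

def depth_to_xyz(depth, fx, fy, cx, cy):
    """Step a): convert a depth image (metres per pixel) to X, Y, Z co-ordinates
    using a simple pinhole model with assumed intrinsics fx, fy, cx, cy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)  # shape (h, w, 3)

def segment_object(depth, z_min, z_max):
    """Crude segmentation: keep pixels whose depth lies in an expected range of interest."""
    return (depth > z_min) & (depth < z_max)

def principal_axis_length(xyz, mask):
    """Steps e) and i): fit a principal axis to the segmented 3D points (PCA) and
    measure the object's extent along that axis, so an object angled towards or away
    from the camera is not foreshortened as it would be in a 2D image."""
    pts = xyz[mask]                      # (N, 3) points on the object
    centred = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    axis = vt[0]                         # direction of greatest variance
    proj = centred @ axis                # signed position of each point along the axis
    return proj.max() - proj.min(), axis

if __name__ == "__main__":
    # Synthetic example: a flat 240x320 "scene" at 2.0 m with a nearer object at ~1.0 m.
    depth = np.full((240, 320), 2.0)
    depth[100:140, 80:240] = 1.0
    xyz = depth_to_xyz(depth, fx=300.0, fy=300.0, cx=160.0, cy=120.0)  # assumed intrinsics
    mask = segment_object(depth, z_min=0.5, z_max=1.5)
    length, axis = principal_axis_length(xyz, mask)
    print(f"estimated length along principal axis: {length:.2f} m")
```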
Preferably, at least one wavelength component of the modulating EM radiation source has a wavelength (when measured in air) greater than that of visible blue light. Preferably, at least one wavelength component has a wavelength of visible green light. Preferably, visible red and/or visible blue light is not used. Preferably, at least one wavelength component has a wavelength of visible orange light and/or visible yellow light. Optionally, multiple wavelength components may be used, e.g. white light from white light LEDs, or multiple radiation emitting sources at least two of which have different wavelengths. Preferably, a source comprising one wavelength component is used.
Preferably, the modulated EM radiation source is configured to preferentially illuminate the object of interest within an expected region (a volume) of interest. Preferably the region of interest is configured and/or delimited, e.g. by one or more physical barrier(s) and/or by the direction of illumination from the radiation source, so that the objects of interest, when present in the region of interest, are more uniformly and consistently illuminated, and/or are more uniformly presented to the camera. This arrangement provides improvement(s) in the resolution of images obtainable for moving objects within a scene. Furthermore, control of illumination (by selection of wavelength(s) and/or by variation (e.g. a predetermined controlled increase) in intensity (e.g. in a standard EM radiation source (e.g. red/infra-red LEDs))), and/or control of the distances for the incident and reflected light paths, and/or of the absorption along those paths, improves the resolution of images of moving and/or deformable objects underwater.
Preferably, at least one wavelength component has a wavelength (in air) of less than 625nm or more preferably less than 620nm or more preferably less than 615nm or less than 610nm or less than 600nm. Preferably, at least one wavelength component has a wavelength (in air) of less than 595nm or less than 590nm. Preferably, at least one wavelength component has a wavelength (in air) of less than 580nm or less than 575nm or less than 570nm.
Preferably, at least one wavelength component has a wavelength (in air) of greater than 480nm, or greater than 490nm, or greater than 495nm, or greater than 500nm, or greater than 510nm, or greater than 520nm.
Preferably, at least one wavelength component has a wavelength (in air) of between 480nm and 620nm, or between 490nm and 620nm, or between 500nm and 620nm, or between 510nm and 620nm.
Preferably, at least one wavelength component has a wavelength (in air) of between 480nm and 615nm, or between 490nm and 615nm, or between 500nm and 615nm, or between 510nm and 615nm.
Preferably, at least one wavelength component has a wavelength (in air) of between 480nm and 610nm, or between 490nm and 610nm, or between 500 and 610 nm, or between 510nm and 610nm.
Preferably, at least one wavelength component has a wavelength (in air) of between 480nm and 600nm, or between 490nm and 600nm, or between 500 and 600 nm, or between 510nm and 600nm, or between 520nm and 600nm, or between 525nm and 600nm, or between 550nm and 600nm.
Preferably, at least one wavelength component has a wavelength (in air) of between 480nm and 580nm, or between 490nm and 580nm, or between 500 and 580 nm, or between 510nm and 580nm, or between 520nm and 580nm, or between 525nm and 580nm, or between 550nm and 580nm.
Preferably, at least one wavelength component has a wavelength of between 500nm and 600nm, or more preferably between 550 and 600nm.
Examples of radiation emitting elements, e.g. LEDs, and wavelengths (in air) which may be used include the following. Red LEDs typically have wavelengths of 700nm (deep red), 660nm (red), 645nm (bright red), 630nm (He-Ne red), 620nm (orange-red). Orange LEDs typically have wavelengths of 615nm (reddish orange), 610nm (orange), 605nm (amber). Yellow LEDs typically have wavelengths of 590nm ("sodium" yellow) and 585nm (yellow). Green LEDs typically have wavelengths of 575nm (yellow green), 570nm (lime green), 555nm (blueish lime green), 550nm (emerald green), 525nm (pure green). Blue-green LEDs typically have wavelengths of 505nm (blueish green), 500nm (greenish cyan), 495nm (sky blue). Blue LEDs typically have wavelengths of 475nm (azure blue), 470-460nm (bright blue), 450nm (pure blue).
From the examples of LEDs given above and their own knowledge, it would be understood by those skilled in the art that the wavelengths ranges given could be rewritten as colour ranges e.g. between 480nm and 620nm indicates a range of colours from blue-green to orange-red, and between 510nm and 620nm indicates a range of colours from green to orange, and between 580nm and 600nm indicates yellow and between 510 and 580nm indicates green.
Preferably at least one wavelength component has a wavelength of between 490 and 620 nm (e.g. blue-green to orange). Preferably at least one wavelength component has a wavelength of between 510 and 620 nm (e.g. green to orange). Preferably at least one wavelength component has a wavelength of between 510 and 600 nm (e.g. green to yellow). Preferably at least one wavelength component has a wavelength of between 510 and 580 nm (e.g. green).
Preferably, the modulated EM radiation comprises continuous wave modulated EM radiation. Examples of this include square wave and sinusoidal wave modulating EM radiation.
Preferably, the 3D ToF camera and modulated EM radiation source are co-located, e.g. physically adjacent to one another. This means that these are physically located next to one another so that incident radiation (from the source, falling on the object of interest) and reflected radiation (reflected from the object of interest and falling on the camera) travel more or less in the same direction, and their paths may be more or less the same length. Examples of this are described elsewhere herein. Co-location in this manner can assist in simplifying the electronics for controlling the modulating EM radiation source and/or the receiving electronics for the matrix of pixels, so as to sample the incoming continuous reflected EM radiation over appropriate integration windows, so that rising edges of incident pulses (and viewing windows) can be timed to be more exactly coincident more easily.
The EM radiation source may comprise a number of source sub-units, and the source, or source sub-unit(s) where provided, may comprise a plurality of EM radiation emitting elements such as LEDs and/or laser diodes or the like. The modulated EM radiation source may comprise two or more correlated sub-units (e.g. sub-unit sources 230A, 230B, 330A, 330B in Figures 15 and 16). Preferably, the EM radiation emitting elements, where provided, will have the same nominal wavelength and/or the same intensity and/or will be modulated at the same frequency. These may be odd or even in number. If even in number, this facilitates placement of equivalent numbers of radiation emitting elements either side of the camera.
Each EM radiation source, or source sub-unit, each of which may comprise multiple EM radiation emitting elements, can be said to provide a cone of illumination therefrom. In practice, each EM radiation emitting element (and/or sub-unit) will have its own cone of illumination, and these add together to provide the cone of illumination of the modulated EM radiation source (or source sub-unit) to which they belong. The cone of illumination of the one or more modulated EM radiation sources (or source sub-units) typically overlaps with the viewing cone (field of view) of the 3D camera, and this region of overlap can be used to define a volume of interest in which an expected object can be preferentially viewed. Preferably the cone of illumination of the one or more modulated EM radiation sources and the viewing cone of the 3D camera are generally or substantially co-incident (e.g. each cone having a centre line lying within 30°, or 20°, or 10°, or 5° of the centre line(s) of one another).
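As a small illustrative aid (not from the patent), the sketch below checks whether an illumination cone and the camera's viewing cone are "generally co-incident" in the sense described above, i.e. whether their centre lines lie within a chosen angular tolerance such as 30 degrees. The direction vectors and the tolerance value are assumptions.

```python
import math

def angle_between_deg(a, b):
    """Angle in degrees between two 3D direction vectors (the centre lines of the cones)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def generally_coincident(source_axis, camera_axis, tolerance_deg=30.0):
    """True if the centre lines lie within the given tolerance of one another."""
    return angle_between_deg(source_axis, camera_axis) <= tolerance_deg

# Example: a source angled slightly towards the region of interest relative to the camera axis.
print(generally_coincident((0.1, 0.0, 1.0), (0.0, 0.0, 1.0)))  # True (about 5.7 degrees apart)
```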
One or more absorption paths may be provided within the illumination cone(s) and/or viewing cone comprising a liquid absorption media such as water. Varying the length of the absorption path(s) by varying the distance of the modulated EM radiation source (or source sub-units), and/or varying the distance of the camera with respect to a predetermined volume of interest, assists in providing adequate illumination of objects in the volume of interest, and/or good discrimination with respect to background, and/or good all round illumination (e.g. from above and/or from below and/or from one and/or both sides).
Preferably, the apparatus is configured to provide a predetermined region of interest coinciding with the expected path of one or more objects of interest. Preferably, the apparatus is configured to provide a predetermined region of interest adjacent a wall of a fish enclosure. Preferably the fish enclosure wall is smoothly varying, e.g. curved, circular, or oval, in that region. A fish enclosure may have the apparatus of the invention centrally located within it, substantially equidistant from the walls of the fish enclosure. Alternatively, or in addition, the apparatus of the invention may be placed so that a region of interest, e.g. in front of a wall of the enclosure, lies at the focal length of the camera.
Preferably, the source, and/or source sub-units where provided, and/or the EM radiation emitting elements where provided, are directed towards the region of interest (e.g. individually and/or together angled towards the region of interest). One camera may be provided. Optionally, the source may comprise light of differing wavelengths, although one wavelength component is preferred. Preferably at least one wavelength component has a wavelength (when measured in air) less than that of visible red light. Optionally, the source, or one or more radiation emitting elements where provided, provides light of visible white and/or visible green and/or visible orange and/or visible yellow wavelengths (when measured in air).
Preferably the camera and EM radiation source are arranged with the camera in the centre and a plurality of radiation emitting elements (e.g. LEDs or laser diodes) located symmetrically around the camera, either in a continuous shape or in groups (e.g. each group forming a sub-unit of the source). Optionally, the plurality of radiation emitting elements are arranged concentrically around the camera, preferably within the same underwater housing. Radiation emitting elements (e.g. LEDs or laser diodes or the like) may be arranged continuously, e.g. in a circle, or square, or a triangular shape or the like, lying within substantially the same vertical plane, optionally with the camera at the centre. Optionally, at least one axis of symmetry in the arrangement of radiation emitting elements about the camera is provided.
Preferably, the EM radiation emanating from the modulated EM radiation source has a horizontal dimension greater than its vertical dimension. Preferably, the source, or source sub-units or arrangements of radiation emitting elements, have a horizontal dimension greater than their vertical dimension so as to facilitate this. Radiation emitting elements may be arranged (generally or substantially) horizontally either side of the camera. For example, two groups of eight radiation emitting elements (e.g. LEDs or laser diodes) may be arranged in rows, e.g. horizontally either side of the camera, in single horizontal lines of eight, or in two horizontal lines of four, or four horizontal rows of two, and so on; each group of rows may form a sub-unit of the radiation source. Multiple (generally or substantially) horizontal rows of radiation emitting elements, one above the other, provide a vertical extent to the radiation source. Thus the source may be designed to illuminate a region having a horizontal dimension greater than its vertical dimension. This assists in excluding objects illuminated by ambient light (above the region of interest) and limits the effect of ambient light illuminating the objects of interest within the region of interest, which lies somewhat below the surface (ambient light decreases strongly with increasing depth).
By arranging radiation emitting elements in one or more of the predetermined patterns described herein, illumination within the region of interest can be appropriately provided and controlled. Indeed the source and/or source sub-units and/or radiation emitting elements may be angled towards (e.g. pointing towards, or arranged curvedly towards, preferably about a vertical axis, optionally alternatively or in addition about a horizontal axis) a region of interest (e.g. substantially towards its centre). For example, by providing an arrangement of radiation emitting elements in one or more horizontal rows, preferably lying one above the other in a vertical direction (optionally pointing towards and/or lying within a curved arrangement focussed on the region of interest), a region of interest spanning a volume having a large horizontal dimension with a limited vertical dimension can be illuminated with more efficient energy use and less disturbance outwith this zone. By providing the camera and EM radiation source (e.g. radiation emitting elements) arranged in this way and/or co-located in this manner (so that the path length and direction of travel of incident and reflected rays are approximately the same), there is a potential for simpler electronics and/or calculations, as well as reduced energy use and less disturbance (to fish etc.) outside this region.
Preferably, the modulating EM radiation source is associated with the 3D ToF camera whereby the 3D ToF camera is configured to receive the modulated EM radiation of that modulated EM radiation source. This configuration is so that the ToF camera "looks" at the same modulating frequency, i.e. provides sampling windows corresponding to the frequency of the modulated EM radiation source.
Thus, the matrix of pixels within the 3D ToF camera is modulated so as to be sensitive to, and be able to determine phase information from, the radiation being emitted from the modulating EM radiation source. Typically, a square wave is used, for example with a frequency of modulation in a range from 20 to 100MHz.
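The sampling-window ("four bucket") approach commonly used with square-wave modulated ToF pixels can be illustrated as follows. This is a generic, hedged sketch rather than the patent's own circuitry: each pixel integrates the reflected signal in four windows offset by quarter periods of the modulation, and the phase shift is recovered from the differences between those samples; the bucket values below are assumed.

```python
import math

def phase_from_buckets(a0, a1, a2, a3):
    """One common form of the continuous-wave demodulation formula: the four values are a
    pixel's integration results in windows offset by 0, 90, 180 and 270 degrees of the
    modulation period. Sign conventions vary between sensor families."""
    return math.atan2(a3 - a1, a0 - a2) % (2.0 * math.pi)

# Illustrative, assumed bucket values (normalised integration results for one pixel):
phase = phase_from_buckets(0.85, 0.15, 0.15, 0.85)
print(math.degrees(phase))  # ~45 degrees; converted to distance as in the earlier sketch
```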
The camera may comprise the controller for determining depth from phase information, or a separate controller may be provided. The camera may comprise optics having one or more band pass filter(s) for excluding one or more unwanted wavelengths. The camera may comprise optics comprising one or more lenses and/or lens combinations for imaging and/or focusing received EM radiation on the matrix of pixels. The optics, e.g. in the form of lens and/or lens combination(s), filters, etc., may be configured to restrict entry of light from above and/or below a volume of interest. The region of interest may comprise a volume opposite an exit pupil of the camera extending in a horizontal direction and/or having limited vertical extent (in front of an exit pupil of the camera within the field of view of the camera). The camera may capture between 10 and 30 frames per second of the entire image scene, with corresponding exposure times of 100ms to 33.3ms. Preferably 20 to 25 frames per second are used, with exposure times of 50 to 40ms. Preferably, exposure times of a few tens of ms, e.g. 40 to 50ms or less, may be used.
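For orientation, a trivial worked check (not from the patent): the frame period is the reciprocal of the frame rate, so the exposure time available per frame is bounded by it, consistent with the figures quoted above.

```python
for fps in (10, 20, 25, 30):
    print(f"{fps} fps -> frame period {1000.0 / fps:.1f} ms")  # 100.0, 50.0, 40.0, 33.3 ms
```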
By providing at least one wavelength of light to which commercially available 3D ToF cameras are less sensitive, enhanced discrimination of illuminated objects with respect to background can be provided.
Examples of red LEDs which may be used include the 720 series available from OSRAM, UK. Other colours of LED are available from various manufacturers. Examples of sensing chip sets which may be used include the OPT8140 available from Texas Instruments, the D-imager available from Panasonic, the epc660 available from ESPROS, and the O3M151 available from PMD Technologies.
Although conventional silicon sensors are thought to be less sensitive outside the standard measurement regime of red/infrared, the present inventor(s) have found that, by providing one or more embodiments as described herein, sufficient light can be made to reach the object of interest and be reflected back to the camera to provide phase information. This is further enhanced by improving removal of unwanted ambient light due to daylight, by providing an absorption medium that reduces ambient light with increasing depth. Thus, in one or more embodiments, the apparatus is configured to operate at depths of 1m ±10% or more, or of 2m ±10%.
The region of interest, over which sufficient light must be provided and reflected back to provide phase information to the matrix of pixels, can be viewed as a region lying within a solid angle formed by at least one predetermined object of interest within a predetermined distance range. For example, for a fish held within an enclosure, the predetermined distance is the distance from the camera to the walls of the enclosure, the width of the region (adjacent the enclosure wall) may be one, or more likely two to three, or possibly three to four, average fish lengths, and the vertical height of the region at the wall is likely to be less than this, say around one to two, or more likely two to three, average fish lengths high.
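To make this sizing concrete, the following sketch (illustrative only; the fish length, distance and multipliers are assumptions consistent with the ranges mentioned in the description) estimates the width, height and approximate solid angle of such a region of interest as seen from the camera.

```python
def region_of_interest(avg_fish_length_m, distance_m, width_lengths=3.0, height_lengths=2.0):
    """Size a region of interest adjacent an enclosure wall: its width and height as
    multiples of the average fish length, and the approximate solid angle it subtends
    at the camera (small-angle approximation: area / distance squared)."""
    width = width_lengths * avg_fish_length_m
    height = height_lengths * avg_fish_length_m
    solid_angle_sr = (width * height) / (distance_m ** 2)
    return width, height, solid_angle_sr

# Assumed example: 0.5 m fish viewed 2 m from the camera.
w, h, omega = region_of_interest(avg_fish_length_m=0.5, distance_m=2.0)
print(f"region: {w:.1f} m wide x {h:.1f} m high, ~{omega:.2f} sr at the camera")
```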
The predetermined volume of interest may have a horizontal extent greater than its corresponding vertical extent (e.g. at a similar distance from the source). Thus, the predetermined volume of interest may have a greater angular extent within a horizontal plane than its angular extent within a vertical plane. Thus, the predetermined volume of interest may be wider than it is high and may form a deformed cone shape expanding or spreading horizontally in the z direction further away from the camera, but expanding or spreading much less so in the vertical direction.
Intensity of radiation within the region of interest is determined by the intensity of the light source, its distance from the region of interest and any absorption medium present in between. Preferably, the apparatus comprises a first absorption path along which incident modulated EM radiation travels between the modulated EM radiation source and the region of interest (where at least one predetermined object of interest is expected), and the first absorption path comprises at least one liquid absorption medium (e.g. water, salt water, silty or cloudy water, oil/water emulsion, other liquid(s) bearing suspended particles and/or contaminants).
Preferably, the apparatus comprises a second absorption path along which reflected EM radiation travels between the region of interest and the camera, and the second absorption path comprises at least one absorption medium (e.g. water, salt water, silty or cloudy water, oil/water emulsion, other liquid(s) bearing suspended particles and/or contaminants).
Preferably the first and second absorption paths are generally of the same length. Preferably, these are within +/-30cm, or more preferably +/-20cm, or more preferably +/-10cm, or more preferably +/-5cm, of each other in length. Preferably, the first and second absorption paths are generally co-incident. Optionally the length of the first absorption path is less than the second, or vice versa. Thus, the source may be placed closer to the region of interest than the camera (preferably without obscuring the field of view of the camera).
The invention provides in a further aspect, in one preferred embodiment, a modulated EM radiation source of high intensity (much higher than in conventional commercially available 3D ToF cameras). Preferably, the modulated EM radiation source comprises at least 8, or at least 12, or at least 16 LEDs (e.g. in the red/infra-red region), each with a total radiant flux of at least about 800-1000mW. Optionally, two generally co-incident absorption paths are provided comprising a liquid absorption medium. Preferably, these paths are of approximately the same length and therefore the absorption in the incident path is more or less the same as the absorption in the reflected path. Thus, in a further aspect there is provided an apparatus for determining the presence, and/or position, and/or depth, and/or shape, and/or size, and/or mass, and/or number, of freely movable objects, comprising: a modulated electromagnetic (EM) radiation source configured to illuminate an object of interest; a 3D Time-of-Flight (ToF) camera comprising a matrix of pixels, the matrix of pixels configured to gather reflected modulated EM radiation; a controller configured to determine depth information from phase information of reflected EM radiation gathered by the matrix of pixels; wherein the modulated EM radiation source is configured to emit a sufficiently high intensity of radiation so that phase information relating to distance information is extractable from reflected modulated EM radiation received by the camera. In a further aspect, a method comprising operating the apparatus is provided.
Optionally, at least one wavelength component of wavelength (in air) less than that of visible red light may be used. Preferably, at least one wavelength component of wavelength (in air) greater than that of visible blue light may be used. However, in one embodiment of this aspect of the invention, high intensity red or infrared light may be used. In one or more embodiments the modulated EM radiation source is configured to emit a sufficiently high intensity of radiation over a solid angle formed by at least part of at least one predetermined object of interest within a predetermined distance range, so that phase information relating to distance information is extractable from reflected modulated EM radiation received by the camera. Preferably, the predetermined object of interest is from 5cm to 100cm in length, or more preferably 5cm to 75cm in length, or more preferably 5cm to 50cm in length. Preferably, the predetermined distance is 0.5m to 6m, or more preferably 0.5m to 5m, or more preferably 1m to 5m, or more preferably 3m to 5m. Preferably, the modulated electromagnetic radiation source comprises multiple radiation emitting elements. Preferably, at least 8, or at least 12, or at least 16, radiation emitting elements are provided. Preferably, the total radiant flux of each radiation emitting element is 1000mW +/-100mW. Preferably, the modulated EM radiation source comprises at least one wavelength component, and further wherein at least one wavelength component of the modulating EM radiation source has a wavelength of red and/or near infra-red light. Preferably red and/or near infra-red radiation emitting elements are used.
Therefore, the reflected radiation falling on the matrix of pixels may be controlled by providing high intensity incident light into a predefined region of interest. High intensity incident light would typically, in the absence of suitable absorption paths and media as provided by the present invention, be likely to saturate the light sensing elements (pixels) within a matrix of pixels. By providing a first absorption path of incident light and a substantially or generally co-incident second absorption path of reflected light, in combination with the high intensity incident radiation, greater control as to the amount of light falling on the matrix of pixels is provided so that this may be kept within acceptable limits, reducing the risks of saturation. This results in better discrimination against the background. In certain embodiments where background reflected radiation is likely to be higher (e.g. where light having a wavelength of less than visible red light is used), the risk of poor discrimination and/or saturation is higher. Therefore, the use of overlapping incident and reflected absorption paths provides a further element of control. Alternatively or in addition the amount of light can be controlled by selective illumination of just a few of the radiation emitting elements and/or by varying the exposure time (e.g. by varying the frame rate).
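As a rough, hedged illustration of why a high-intensity source combined with matched incident and reflected absorption paths keeps the signal at the pixels within acceptable limits, the sketch below applies a simple Beer-Lambert attenuation over the two-way water path together with an inverse-square spreading term and a target reflectivity. The absorption coefficient, reflectivity, source power and geometry are all assumed values for illustration, not measurements from the patent.

```python
import math

def returned_power_w(source_power_w, distance_m, absorption_per_m, reflectivity, aperture_area_m2):
    """Very simplified link budget: Beer-Lambert attenuation over the out-and-back water
    path, Lambertian-style spreading of the reflected light, and the fraction collected
    by the camera aperture. Intended only to show the trend with distance and absorption."""
    two_way_attenuation = math.exp(-2.0 * absorption_per_m * distance_m)
    geometric_factor = aperture_area_m2 / (math.pi * distance_m ** 2)
    return source_power_w * reflectivity * two_way_attenuation * geometric_factor

# Assumed values: 16 emitters of ~1 W each, moderately clear water, target reflectivity 0.1,
# and a 1 cm^2 camera aperture, evaluated over a range of distances.
for d in (0.5, 1.0, 2.0, 4.0):
    p = returned_power_w(16 * 1.0, d, absorption_per_m=0.2, reflectivity=0.1, aperture_area_m2=1e-4)
    print(f"{d:4.1f} m -> {p:.2e} W at the camera")
```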
In one or more embodiments, the length of absorption path(s) can be controlled by locating the camera and/or radiation source (individually or together) at different points along the respective absorption paths. Preferably, these are co-located within a single housing and the respective incident and reflective absorption paths are generally co-incident. Alternatively, these may be located at different points along the paths. For example, the radiation source may be located closer to the expected position of objects of interest. Optionally, these may be located above and/or below a horizontally extending region of interest in which the expected objects of interest are expected to appear. Optionally, these may be directed into the horizontally extending region of interest.
Preferably, the underwater housing comprises a viewing panel. The viewing panel may comprise the exit aperture of the camera. The viewing panel and/or exit pupil of the camera may comprise a de-fogging mechanism. The de-fogging mechanism may comprise a heater. The viewing panel and/or exit pupil of the camera may comprise a transparent panel with an internal resistive heater. Typically the EM radiation source and camera are located in air in a waterproof housing, which can then be located underwater.
In one or more embodiments of the present invention, there is provided system(s), apparatus and method(s) comprising a 3D Time-of-Flight camera for gathering phase information from the reflected modulated electromagnetic radiation and a modulated EM radiation source for emitting modulated electromagnetic radiation of sufficient intensity to ensure sufficient photons are reflected from an object of interest within a "hit zone". The "hit zone" or region (volume) of interest may be delineated (in one sense) by a solid angle (e.g. in steradians) defined by the expected length of the object (or two or three such lengths) and the expected distance at which the object would be found when illuminated by the apparatus. An enclosure having a smoothly curved rear surface may delineate the rear of the "hit zone" and aid in delivering and/or directing freely movable objects into the region of interest in a more consistent way.
The apparatus of the present invention takes into account, in one or more embodiments and in various aspects, a number of factors; these include the transmissibility of water with respect to wavelength, the particular wavelength(s) of the radiation source, the reflectivity of water particles and other particles floating in the water, the reflectivity of the expected object of interest, the distance to the expected object of interest, the expected motion of the object of interest (speed, direction, attitude), and the response of the expected object of interest (particularly in terms of speed and expected direction of motion), as well as other potential difficulties such as fogging of the optics and/or of a viewing panel of an underwater housing when used underwater.
Brief Description of the Drawings
The invention will now be described, by way of example only, with reference to the following Figures in which like reference numerals refer to like features. Other embodiments of the invention will be apparent to those skilled in the art from the information described herein.
Figure 1 shows a schematic plan view of a Time-of-Flight camera system comprising a camera and associated light source.
Figure 2 shows a schematic elevation view of a fish enclosure (e.g. on a fish farm) having an apparatus according to one embodiment of the invention in the form of 3D Time-of-Flight camera system comprising a 3D ToF camera and modulated EM radiation source, both co-located in a waterproof housing underwater.
Figure 3 shows a schematic view of Time-of-Flight electronic control modules forming a controller for determining depth information from phase information, within a 3D camera according to one embodiment of the invention.
Figure 4 shows a schematic view of processing steps for capturing images of objects underwater according to an example embodiment of one aspect of the invention.
Figure 5 shows a graph of absorption of light of different colours (wavelengths) in water. Figure 6, and Table 1, show data of reported depth of objects in and out of water.
Figure 7 shows the key points of interest in a species of underwater animal that may be observed using the present invention; here a fish is shown. In this example, the fish is a salmon.
Figure 8 shows a schematic perspective view of a deformable underwater object (here a bending fish) when deformed illustrating the difference in perceived length (L1) between the extremities when the object is curved as opposed to the actual length of the object (L2).
Figure 9 (Table 2) shows depth measurements (z) for key positions on fish such as that seen in Figure 7 (in pixel -x, y -co-ordinates for dimension points 2, 3, 4, 6 and 7).
Figure 10 shows image data at various steps in a process according to an example embodiment of one aspect of the invention for determining the outline (peripheral contour) and principal axis of a freely moveable deformable object such as a fish.
Figure 11 shows image data wherein the distance "z" image, threshold regions and key fish points in an outline are identified for fish at distances of 0.5 metres, 1.0 metres, 1.5 metres, 2.0 metres and 2.5 metres, according to an example embodiment of one aspect of the invention. The periphery of smaller fish at similar distances was more difficult to identify compared to larger fish.
Figure 12 shows image data including "z" image, threshold regions and key fish points identified on the outline along with the principal axis for bending fish (towards and away from the camera) according to an example embodiment of one aspect of the invention.
Figure 13 is similar to that shown in Figure 12 with firstly a fish swimming towards the camera and secondly a fish with a principal axis perpendicular to the path to the camera according to an example embodiment of one aspect of the invention.
Figure 14 shows various process steps for determining position, and/or number, and/or depth and/or shape and/or size and/or mass of objects underwater, e.g. freely moveable objects such as fish, according to example embodiments of one aspect of the invention.
Figure 15 shows schematic plan and elevation views of apparatus according to one or more example embodiment(s) of one or more aspects of the invention comprising a 3D ToF camera and one or two modulated EM radiation sources (where two are provided, these can be viewed as sub-units of a single source) arranged substantially horizontally so as to illuminate the object(s) of interest from one or preferably both sides preferably evenly. Here, these are located substantially centrally within a substantially circular fish enclosure so as to be substantially equidistant from the walls of the enclosure. The camera and EM radiation sources are co-located next to one another within a waterproof housing.
Figure 16 shows schematic plan and elevation views of apparatus according to one or more example embodiment(s) of one or more aspects of the invention comprising a 3D ToF camera and one or two modulated EM radiation sources (where two are provided, these can be viewed as sub-units of a single source) arranged in a substantially vertical plane (above and/or below the camera) so as to illuminate the object from above and/or below. The camera and EM radiation sources are substantially centrally located within the substantially circular fish enclosure and co-located next to one another.
Detailed Description of the Invention
Definitions
Whilst there is some disagreement amongst experts about terminology, it is generally understood that there are two types of Time-of-Flight measurements: the first is the pulsed method, which measures Time-of-Flight directly from the arrival time of a single laser pulse and provides a point-by-point method of mapping objects at quite large distances. The second is the matrix method, which uses a matrix of pixels (e.g. pixels within sensors such as CMOS or CCD sensors) and continuous wave modulation of a light source. In this continuous wave modulation method, the phase difference between modulated incident and reflected signals is directly related to the distance, if the modulation frequency is known. Different shapes of signals are possible, such as sinusoidal and square-wave, but square-wave is typically preferred for use in digital systems. Reflected radiation is captured by a matrix of pixels. Within a matrix, each pixel measures the return time of reflected light, from which it determines the depth of the image seen by that particular pixel, so an image in x (horizontal), y (vertical) and z (depth) co-ordinates can be formed.
In this application, a "matrix or pixels" is a multiplicity of pixels (light sensing elements) set out in an array, typically in rows and columns but other arrangements can be envisaged. While typically a 2D matrix of pixels is expected to be used it is within the scope of this invention that a 3D matrix of pixels may be used.
By "determine" or 'determining", it is meant that an estimate (e.g. a rough estimate) and/or calculation (e.g. more exact calculation) of the quantity is made within expected tolerances for the type of measurement involved. Thus for position, this may be determination within expected tolerances of the 1D or 2D or 3D position of an object in x and/or y and/or z co-ordinates.
The "wavelength" of a wavelength component is its central or nominal wavelength. By "wavelength component of EM radiation", it is meant EM radiation having a central (also known as 'nominal') wavelength, and an associated wavelength distribution expected within tolerances for a given type of EM radiation source. For example, the wavelength distribution of laser diodes is expected to be less than for light emitting diodes (LEDs).
By "co-located" and "co-locating" camera and radiation source, it is meant that incident radiation (e.g. light from the radiation source) originates from approximately the same location at which reflected EM radiation is received. Thus, incident and reflected EM radiation travel approximately along the same path and preferably also approximately the same distance. It will be understood that EM radiation spreads as it travels away from the source. It also spreads as it travels from the object at which it is reflected back to a camera forming in effect incident and reflected expanding cones of light. It is to be understood that when the camera and radiation source are co-located, the centre lines of such cones of light do lay roughly in the same direction and in certain preferred arrangements are also roughly of the same length. Whilst it is not always possible to locate the radiation source and camera at exactly the same location, it is possible to locate these physically next to one another, e.g. symmetrically with respect to one another, or even concentrically (e.g. the camera may be at the centre of a symmetrical pattern of light emitting elements surrounding the camera). Thus, the light emitting elements such as laser diodes or LEDs may be arranged symmetrically about the camera and may be continuously arranged around the camera (e.g. in a square, rectangle, oval or circle, or may be arranged in patterns either side of the camera or above and/or above and below as will be explained elsewhere herein).
By (approximately) co-locating the light source and the light receiver, the incident rays and reflected rays (or at least those received by the camera (light receiver)) will lie at approximately the same angle and be at approximately the same distance with respect to the expected location of the object of interest (e.g. a fish) within a region of interest, reducing the variation in incident 'light' upon the object of interest (e.g. the fish).
The term "co-incident" is given its normal meaning of overlaying and by "generally co-incident path" it is meant more or less co-incident with some allowance made for deviation from the co-incidence of the path(s) along part or all of their length. Nevertheless, it is expected that "generally co-incident paths" lay in the same general direction. Where spreading or expanding white cones exist about a central light ray, the paths of the central light rays may be generally co-incident if the central light rays lie within less than about 30° of one another, and so overlap over at least part of the light cones. In this case the light cones lay in the same general direction and may cross or overlap over part or all of their length.
By "horizontal" it is meant laying in a plane approximately parallel with the average level of the surface of a body of liquid at that location. Thus, whilst typically horizontal is a term used in reference to the ground, the ground itself may not be horizontal and therefore in a body of water it is more suitable to refer to the overall average level of the surface of the body of liquid as being horizontal as this represents a more consistent indicator.
By "predetermined distance range", it is meant a range of predetermined distances e.g. over which an object of interest is expected to be found. For example, a fish enclosure (e.g. within a fish farm) may have a pre-determined distance range e.g. from source and/or camera to the walls of the enclosure i.e. the limit within which an expected object of interest can be observed, of 0.5 to 8m, or more preferably 1 to 6m or more preferably 1 to 5m or more preferably 0.5m to 2m or more preferably 0.5 to 1.5m. The distance is measured from an exit pupil of the camera to the object of interest. Typically, the exit pupil of the camera will be co-incident or nearly co-incident (within a few centimetres) with a viewing window of an underwater camera housing.
In this application, the term 'light' is used as one example of the more generic term 'electromagnetic' (EM) radiation. The term 'light' is not intended to be limiting, and should be understood as a shorthand for EM radiation, unless the context dictates otherwise, for example EM radiation from light emitting diodes, laser diodes or the like may be used in the present invention even if not visible.
Whilst visible light may be used, and the term light is used throughout this document, it will be recognised by those skilled in the art that any suitable EM radiation capable of being modulated at a suitable frequency (such as 20-100 MHz) with a suitable modulation wave form (e.g. sinusoidal or more preferably square wave) may be used. It will be understood that the selection of one or more wavelength components of EM radiation of one or more particular wavelengths (or selection of a range of wavelengths, e.g. of a limited spread, centred on a particular or nominal wavelength) forms example embodiments of certain aspects of the invention.
The term 'visible light' is intended to mean those wavelengths of EM radiation (light) that are visible to the human eye.
In one aspect of the invention, as will be described later, wavelength components of a particular range of wavelengths are suggested for a particular application. Where a wavelength or wavelength range of EM radiation is specified, it will be understood that devices emitting EM radiation (e.g. light) such as LEDs, laser diodes, lasers etc may be said to emit EM radiation 'at' a particular wavelength, but in practice, and this does vary with particular devices and types of devices, a range of wavelengths is emitted depending upon the nature of the device and material and manufacturing tolerances. For laser diodes and lasers this range may be very small. For LEDs this range may be larger, e.g. as a percentage of the actual wavelength or in nanometres, than for laser diodes and lasers. Where a wavelength of a device is specified, this refers to the nominal peak or main central wavelength of the device (which typically, but not necessarily, lies at the centre of the range of emitted wavelengths and is typically the wavelength at which the device is rated). Furthermore, where it is intended in certain embodiments that a particular wavelength or range of wavelengths of EM radiation is preferred, it is sufficient that the main or peak expected wavelength of the device (in air) falls within that preferred range, even if some of the actual range of emitted wavelengths from the device (because of the distribution of emitted wavelengths for that type of device) falls outside the preferred range of wavelengths.
For ease of reference, we refer to different ranges of wavelengths of light, particularly visible light, as red, orange, yellow, green, blue, etc. Whilst it is generally understood that visible light is red to blue, with near infrared and ultraviolet falling outside the visible range (for humans), the actual beginning and end of the visible spectrum, or division of one colour from another in terms of actual wavelengths (e.g. in nm) is not well defined by observers. That being said, observers tend to agree when observing colours in visible light. When specific wavelengths are mentioned these are intended to lie within the limits of the expected range of wavelengths of that colour group.
Typically, the ranges of wavelengths for colour groups are as follows: infrared is ascribed to wavelengths above 750nm, red is ascribed to wavelengths of 620 to 750nm, orange is ascribed to wavelengths of 590 to 620nm, yellow is ascribed to wavelengths of 580 to 600nm, green is ascribed to wavelengths of 510 to 580nm, blue-green is ascribed to wavelengths of 490 to 510nm, blue is ascribed to wavelengths of 450 to 490nm and violet is ascribed to wavelengths of 380 to 450nm. It is expected that observers would ascribe an error to these ranges of the order of ±5 to 10nm. It will be appreciated by those skilled in the art that, when referring to wavelengths, this is in relation to the wavelengths in a particular medium, usually vacuum or air. The actual wavelength of light changes as the light passes into seawater, as does the velocity, although the frequency (and so the colour) does not. EM radiation emitting sources are usually referred to by their wavelength (in air) and therefore for convenience wavelength (in air) is referred to within this application rather than frequency. Conversion to frequency using the speed of light in water could be carried out.
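Purely as a convenience for implementation, the colour bands listed above can be captured in a small lookup table, for example to check whether an emitter's rated wavelength (in air) falls within a preferred band. The band edges are those given in the preceding paragraph; the code itself is an illustrative sketch, not part of the patent.

```python
# Nominal colour bands (wavelengths in air, nm), as listed above.
# Infrared is taken as anything above 750 nm and is omitted from the table.
COLOUR_BANDS_NM = {
    "violet":     (380, 450),
    "blue":       (450, 490),
    "blue-green": (490, 510),
    "green":      (510, 580),
    "yellow":     (580, 600),
    "orange":     (590, 620),
    "red":        (620, 750),
}

def in_band(wavelength_nm: float, colour: str) -> bool:
    lo, hi = COLOUR_BANDS_NM[colour]
    return lo <= wavelength_nm <= hi

print(in_band(530, "green"))  # True: a 530 nm emitter is nominally green
```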
In Figure 1 a schematic view of an apparatus 10 according to one example embodiment of the invention is shown. Here apparatus 10 comprises a Time-of-Flight (ToF) camera 20 having a sensing chip 22 comprising an array of light sensing elements (pixels) in a 2D matrix formation (for example arranged in an array, e.g. of rows and columns) and an associated, correlated, separate modulated EM radiation source 30 (which, for simplicity, may be referred to as a light source 30 in the following description). Camera 20 comprises sensing chip 22, an optional filter 24 and preferably also a focusing lens component 26. Lens component 26 may comprise a single lens, or a combination of two or more lenses, for providing a suitable optical aperture for camera 20 and focussing onto sensing chip 22. Examples include a 16mm focal length lens LM16JC1MS available from Kowa, Germany.
Light source 30 is correlated with camera 20 so that timing information in relation to the difference between modulated outgoing light 60 and modulated reflected light 80 can be gathered. Thus, it will be appreciated that the present invention finds application in 3D ToF camera systems that use periodically modulated incident EM radiation (continuous wave modulation) and a matrix of light sensing elements (pixels) arranged in an array. Each light sensing element (pixel) gathers light which is used to determine phase difference between incident and reflected EM radiation for that pixel to provide a depth map of the viewed scene on a pixel by pixel basis.
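For context, many continuous wave ToF sensors estimate the per-pixel phase from four correlation samples taken at 0°, 90°, 180° and 270° of the modulation period (the so-called four-bucket method). The sketch below shows that common scheme; it is offered for illustration as an assumption, not necessarily the demodulation used by the specific hardware described here.

```python
import numpy as np

def four_bucket_phase(a0, a1, a2, a3):
    """Estimate per-pixel phase (radians) and amplitude from four correlation
    samples taken at 0, 90, 180 and 270 degrees of the modulation period.
    Inputs may be scalars or equally shaped numpy arrays (one value per pixel).
    """
    phase = np.arctan2(a3 - a1, a0 - a2)           # phase offset of the return
    amplitude = 0.5 * np.hypot(a3 - a1, a0 - a2)   # signal strength per pixel
    return np.mod(phase, 2 * np.pi), amplitude
```

The phase map returned by such a routine is what the distance conversion shown earlier would be applied to, pixel by pixel, to build the depth image.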
Light source 30 may preferably be physically integrated with camera 20 in a single camera housing (e.g. a unitary camera housing), but it may be housed separately in a co-located arrangement, e.g. within an underwater housing or neighbouring underwater housings, containing a camera and light source. In both arrangements these are said to be co-located together, as the camera and light source are physically located substantially next to one another and the emitted light and received light are emitted and received in more or less the same direction. Further, the light source 30 is correlated with the sensing electronics of 3D Time-of-Flight (ToF) camera 20 so that phase/time information can be extracted from the reflected signal 18. It is desirable that the light source (or light source sub-units) are co-located so as to be physically adjacent to one another. Light source 30 may comprise one or more EM radiation emitting elements such as light emitting diodes (LEDs) or laser diodes or similar. These provide EM radiation capable of being modulated at a frequency suitable for the measuring capabilities of the light sensing elements. In one example, one or more LEDs emitting green light may be used. Other colours such as orange and yellow may be used. For example, orange and/or yellow and/or green LEDs may be used. Although less preferred, blue-green LEDs may also be used. In at least one embodiment it is preferred that red and blue LEDs are not used.
Sensing chip 22 within camera 20 typically comprises a number of pixels. One commercial example of a sensing chip is the Texas Instruments OPT8410 having a QVGA resolution of 360 x 240 pixels.
Whilst certain embodiments of the invention may feature an external light source 30 separate from 3D Time-of-Flight camera 20 it is preferred that, for underwater applications, the light source and Time-of-Flight camera are co-located so that reflected rays 18 travel approximately in line with outgoing light rays 16. This simplifies somewhat the mathematics but, more importantly, reduces the variation in illumination of the fish, which may be more consistently illuminated with respect to the camera. This reduces potential variation in light received by the camera, enabling "tuning" of the apparatus, e.g. such as selection of a particular choice of wavelength(s), and/or selection of brightness, and/or selection of exposure time etc e.g. for that particular application and/or for that particular location, and/or for local water conditions. Thus, by co-locating the light source and camera optionally within a single camera housing and in any case preferably within the same waterproof housing, the relation of the fish with the camera and the light source remains more consistent than would otherwise be the case.
Figure 2 shows components of a machine vision system for use underwater, e.g. on fish farms, comprising a combined 3D Time-of-Flight (ToF) camera 120 having a waterproof rated housing 32. ToF camera 120 is connected via a waterproof cable 31 to a standard PC 34 capable of running machine vision algorithms. Waterproof cable 31 may comprise various wet-mate connectors along its length.
Camera 120 is located within a fish enclosure 14 which may be a net, cage, or indeed a tank forming a fish pool. Fish 12 are able to move freely within the fish enclosure 14.
Nevertheless, typically, fish 12 will frequently swim in a particular orientation with respect to the enclosure, often in a more or less horizontal direction around the periphery of the enclosure. Although fish can swim in any direction, when rising or falling there is often a substantial horizontal component to the motion of fish. The inventor(s) have appreciated that advantage can be taken of this by co-locating camera and light source so as to more consistently illuminate the fish in the region in front of a smoothly varying wall of a fish enclosure. Typically camera 120 will be located, preferably centrally, within the fish enclosure 14 with a field of view and associated light source and receiving optics intended to capture images opposite the camera looking in a generally horizontal direction.
Preferably, the camera 120 will have a field of view across, and somewhat above and below, a horizontal plane in front of an exit pupil of the camera.
Figure 3 shows a combined underwater 3D ToF camera 120 having here, in this example, an LED driver 40 forming with LEDs (not shown) a modulated EM radiation source emitting incident modulated EM radiation 16, and receiving optics 28 directing reflected modulated EM radiation 18 to ToF sensing chip 22. Receiving optics 28 may comprise something simple like a pin hole but more typically comprises a combination of optical elements such as a band pass filter 24 and/or a lens, and/or lens combination 26 (not shown). The lens and filter combination is selected to optimise delivery of the received reflected EM radiation and the phase information of an object image it carries, onto the pixels of sensing chip 22. 3D ToF camera 120 may comprise an analogue front end conversion module 36 and/or a Time-of-Flight controller module 38 for controlling LED driver 40 and correlating modulated outgoing EM radiation 16 with received reflected modulated EM radiation 18. A microprocessor 42 is arranged to analyse and correlate timing information between incident and received rays 16, 18 of EM radiation and deliver this to external processing means such as a separate computer 34. A power management module 44 is also provided. Thus, a controller configured to extract depth information from phase information is provided by one or both of the time of flight controller module 38 and the microprocessor 42. Alternatively, a single controller with appropriate processing capability to offer both functions may be provided.
Figure 4 illustrates a ToF camera 20, 120 having receiving optics 28 and providing a camera exit pupil of given dimensions, and the process steps in general required to deliver biomass information to a user. In process 50, firstly, in step 52, camera 20, 120 captures one or more images (containing x, y and z information). Typically, in step 54, an outline of an animal such as a fish is extracted from the data. This second step may include several sub-steps, e.g. selecting suitable images from an incoming image stream (e.g. in step 52 several images may be obtained per second, a typical frame rate being 20 frames per second). Optionally, blurred or unwanted images are discarded. Optionally, images are averaged together (on a pixel by pixel basis) to enhance the signal to noise ratio. Expected objects within the image may then be identified, e.g. by applying a suitable shape and assessing if this matches, to a predetermined extent, the shape(s) (peripheral outline) of one or more objects in the image. A discrimination step, involving the application of thresholds to depth data, may be used following capture of the image in step 52 to aid in identifying expected objects within the image(s). Once a potential object has been identified in the image data, the outline of the potential object is extracted from the data. In step 56, optionally, a comparison is made between the expected outline and the measured outline. This may be carried out by establishing one or more features of the expected outline, e.g. overall shape (e.g. ellipse) and/or tail fin, top fin, bottom fin, etc, or any one or more of these, and discarding any outlines that do not correspond with that expected. If the identified object is assessed to be the expected object, e.g. a fish, then analysis of the image is continued, e.g. to extract size and/or weight in step 58. If the extracted object does not meet the expected shape of the expected object then the process is repeated from step 52, with either a further section of the same image or a different image, to identify potential objects of interest in the image data. An applied shape, e.g. an ellipse, may be used to identify an object. Typically, real image data of the object may be used, optionally corrected, e.g. for bending or turning, to estimate shape and/or size and/or mass. Optionally, in one embodiment, the matched shape, e.g. an ellipse, can be used as a representative avatar of the object from which information can be extracted, e.g. shape and/or size and/or mass.
Therefore, size and/or shape information from the matched shape (e.g. an ellipse), and/or from the real image data itself, can be used, e.g. to estimate mass.
Available commercial sensing chip sets are rated for use with red and/or infrared light and, indeed, are provided with band pass filters within the receiving optics to exclude other sources of light. Silicon (used in commercially available 3D ToF cameras) is particularly responsive in the red/infrared region.
To improve performance of ToF cameras underwater, an assessment of the absorption of light of different colours (wavelengths) in water was undertaken and is shown in graph 60 in Figure 5. Off the shelf Time-of-Flight cameras use infrared light for illumination, and the purpose of this experiment was to determine which colours of light may work better in water. To test the absorption of different colours of light in water, a white light source was placed in water facing an illuminometer and the distance was varied and light levels recorded. This was repeated with different colour filters in front of the illuminometer. Results for % absorption as a function of wavelength are shown in Figure 5. The results have been normalised so that absorption at 175 mm is one. The results show that red light and, somewhat surprisingly, blue light were absorbed very quickly in water. The results also indicate that better light to use in terms of transmission through water may be green to orange or a combination of colours (such as a combination of different wavelengths e.g. white light) with at least one wavelength component of the light source having a wavelength (in air) less than that of red light.
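The raw readings from this experiment are not reproduced here, but one plausible way to reduce such distance-versus-intensity readings to a per-colour attenuation figure is to assume an exponential (Beer-Lambert) decay and fit it, as sketched below. The exponential model and the sample numbers are assumptions for illustration only, not the patent's data or analysis.

```python
import numpy as np

def attenuation_coefficient(distances_m, readings):
    """Fit I(d) = I0 * exp(-k * d) to illuminometer readings taken at several
    distances for one colour filter, and return the attenuation coefficient k
    (per metre). A larger k means the colour is absorbed more quickly in water.
    """
    d = np.asarray(distances_m, dtype=float)
    log_i = np.log(np.asarray(readings, dtype=float))
    slope, _log_i0 = np.polyfit(d, log_i, 1)   # straight-line fit in log space
    return -slope

# Hypothetical readings (not the patent's data): a red filter decaying faster
# than a green one over 0.175-1.0 m would yield a larger k for red.
print(attenuation_coefficient([0.175, 0.5, 1.0], [100.0, 60.0, 30.0]))
```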
In one or more aspects of the present invention, the inventor(s) have appreciated that increased light levels can be obtained more efficiently and effectively by providing a light source of high intensity incident radiation (e.g. >1600mW), and/or by appropriate co-locating of camera and light source, and/or by improving illumination of a predetermined volume of interest and/or improving control of absorption paths of incident and/or reflected rays.
In Figure 6 (Graph 70 and Table 1) an experiment was carried out to determine if measurements using standard infrared cameras could be taken underwater. A camera was housed within an underwater housing (an aluminium box with a sealed perspex window). The underwater housing contained both a red light source and a 3D ToF camera co-located within it. Here the light source and camera formed a single camera system. The viewing window of the housing (and light source and camera within it) were arranged to illuminate and view a horizontal volume of water.
An image of an artificial fish target at different distances out of water was taken and the test was then repeated in water. The viewing panel of the housing itself resulted in some degradation in the image.
Up to 200 mm the depths estimated underwater and out of water were similar. By 250 mm the underwater depth appeared quite incorrect. At lower distances, the measurements underwater appeared to be consistent, indicating that errors in light propagation in water may be rectified by calibration. However, the present inventors have appreciated that increasing light levels underwater, rather than saturating the camera pixels as might be expected, can provide improved results (e.g. illumination and recognition of object(s)). Indeed, increasing light levels in specular media such as water can result in much increased back scatter from particles suspended therein and reduced resolution of images (like illuminating snowflakes in car headlights, or moisture droplets in fog). Nevertheless, by providing the configurations provided in one or more embodiments of the present invention, this effect is thought to be much reduced, enabling images of better discrimination with respect to background and improved resolution to be obtained.
In a further aspect, the present inventors have also appreciated that extending the wavelengths of light used, e.g. to include preferably green, and/or optionally orange and/or yellow (and/or blue-green although this is less preferred) can provide improved results (illumination and recognition of object(s)).
Figure 7 shows an elevation view of a salmon 80 showing various points of interest: namely point 1 (the eye), point 2 (the snout), point 3 (point on the lower belly opposite the intersection between the front of the top fin and the outline (periphery) of the main body of the fish), point 4 (the intersection of the front of the top fin and the outline (periphery) of the main body of the fish), point 5 (the end of the body and beginning of the tail fin at the intersection of the outline (periphery) of the main body of the fish and a central point of the tail fin), point 6 (the furthermost extent of a central point of the tail fin) and point 7 (the intersection between the lower rear fin and the outline (periphery) of the main body of the fish). In order to gain a suitable estimate of the mass of the fish, the length L2 (along the principal axis between points 2 and 5) is identified. Preferably the height of the fish between points 3 and 4 should also be determined and, if possible, the distance 4-7 between points 4 and 7 should also be assessed. Preferably, actual dimensions derived from depth data are used (rather than perceived dimensions), e.g. an actual length L2 derived from depth data is used, rather than a perceived length L1 (see Figure 8). From these an estimate of the biomass may be made. In one embodiment, a height and/or length of an applied shape (e.g. an ellipse) matched to real image data may be used to represent fish measurements (preferably corrected for any bending) instead of actual image data, and size and/or mass may be derived from this matched shape data. For example, if the perceived length of an applied ellipse shape is LL1, this can be corrected to the actual length of the applied ellipse shape LL2 using the depth data to correct for any bending. The actual ellipse length LL2 can then be used to represent the length of a fish. Similarly the height HH1 (or actual corrected height HH2) of the ellipse can be used to represent the height of the fish, so that the length LL2 and height HH2 of the ellipse can be used to provide an estimate of the fish size and/or mass. Alternatively, real image data, e.g. the peripheral contour of a fish outline, may be used.
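As an illustration of how a corrected length can be turned into a mass estimate, one common approach (offered here as an assumption, not as the patent's prescribed method) is an allometric length-weight relation of the form W = a·L^b, with a and b calibrated for the species and site; the coefficients below are placeholders.

```python
def estimate_mass_kg(length_m: float, a: float = 11.5, b: float = 3.0) -> float:
    """Estimate fish mass from corrected body length using W = a * L**b.

    a and b are species- and site-specific calibration constants; the defaults
    here are illustrative placeholders only. A height measurement, e.g. the
    distance between points 3 and 4, can be folded into the calibration to
    tighten the estimate.
    """
    return a * length_m ** b

print(estimate_mass_kg(0.65))  # ~3.2 kg for a 65 cm fish with these placeholders
```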
In a 3D image a large fish at a greater distance can be distinguished from a small fish at a closer distance. Thus, even a single depth measurement to the fish can allow a more accurate estimate of the length of the fish: since the depth to the fish is known, the scale for the length L1 (or L2) can be determined.
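The scaling relied upon here can be written down with a simple pinhole camera model: an object spanning n pixels at depth z has a real extent of roughly n·z/f_px, where f_px is the focal length expressed in pixels. The model, the pixel pitch and the note on underwater calibration below are generic assumptions, not figures taken from the patent.

```python
def pixel_span_to_metres(span_px: float, depth_m: float, focal_px: float) -> float:
    """Convert an apparent size in pixels to metres using the pinhole model.

    focal_px is the lens focal length divided by the pixel pitch, e.g. a 16 mm
    lens on a sensor with an assumed 15 micron pixel pitch gives focal_px of
    roughly 1067. Underwater, refraction at the viewing window changes the
    effective focal length, so calibration in water is advisable.
    """
    return span_px * depth_m / focal_px

# A fish spanning 220 pixels at 1.5 m depth with focal_px ~= 1067:
print(pixel_span_to_metres(220, 1.5, 16e-3 / 15e-6))  # ~0.31 m
```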
Figure 8 shows a bending fish 90 and the difference between perceived length L1 between points 2 and 5, as would be seen in a 2D image using x and y co-ordinates, and actual length L2 of the fish between points 2 and 5 which can be determined in a 3D depth image (using x, y and z co-ordinates).
By using multiple depth measurements the actual length L2 may be determined from pixel (x, y, z coordinates) information along the perceived length L1.
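Assuming the pixels along the perceived length L1 have already been converted to x, y, z co-ordinates in metres, the actual length L2 can be obtained by summing the 3D segment lengths along those points, as in the minimal sketch below (the sample points are hypothetical).

```python
import numpy as np

def corrected_length(points_xyz) -> float:
    """Sum 3D segment lengths along sampled points (in metres) running from
    snout to tail, so that a fish bending towards or away from the camera
    contributes its true length L2 rather than the perceived length L1.
    """
    p = np.asarray(points_xyz, dtype=float)                     # shape (N, 3)
    return float(np.linalg.norm(np.diff(p, axis=0), axis=1).sum())

# A fish bent in depth: perceived (x-only) length is 0.50 m, actual ~0.54 m.
print(corrected_length([[0.00, 0.0, 1.50],
                        [0.25, 0.0, 1.40],
                        [0.50, 0.0, 1.50]]))
```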
Figure 9 (Table 2) shows fish measurements for key positions on fish such as that seen in Figure 7 (in pixel x, y co-ordinates for dimension points 2, 3, 4, 6 and 7). Figure 9 shows the results of depth measurements in air using a ToF camera to identify particular dimension points on small, medium and large fish. Thus, within each image (e.g. a frame or, if appropriate, a number of averaged frames) the outline of the fish and the particular dimension points 2, 3, 4, 6 and 7 on the outline of the fish have been identified and expressed in pixel co-ordinates. From these, the scale of the fish and one or more size measurements can be determined.
Figure 10 shows image data at various steps in a process for determining the outline (peripheral contour), and principal axis or other suitable dimension of a deformable object such as a fish.
In Figure 10 an example of how this may be achieved is shown. In step A, a "z" depth image of a fish template is shown. In step B, the "z" depth data has been used to eliminate data points not falling within the threshold of expected depth of the expected object. A zone of interest is defined, say between 0.5m and 1.5m and any data having depth outside this region is rejected. Then a region finding operator e.g. in a software module, is used to find areas of the image where adjacent pixels have the same value e.g. depth (say within 5 or 10mm of each other). This is repeated until a region is identified. Thus, step B assists in identifying the pixels associated with the outline of the expected object. In step C, a shape e.g. ellipse is fitted to the region, and remaining data is rejected. The selected shape, here an ellipse, is used to assist in identifying the presence of an image of an expected object within the image data. If corresponding depth data from an outline shape correlates well with the selected shape, here an ellipse, then a decision is made that a fish is present. Once a section of the image data is identified as being the expected object, here a fish, then the contour of the object (e.g. in pixel co-ordinates) may be rotated (about a horizontal and/or vertical axis) so that the principal axis of the object (or other suitable dimension) lies within a horizontal plane (and preferably also within a vertical plane perpendicular to the path of light from the object to the camera). In step E, key points and/or dimensions are identified which, for fish such as salmon, may be points 1 to 7. Alternatively or in addition, key points and/or dimensions can be identified in the fitted shape e.g. in the ellipse (computationally, this may be easier and may be sufficient) so as to provide an estimate of size and/or weight of the fish.
In step C, the selected shape, here an ellipse, may be matched to that of an expected object so that the distance between the periphery of the selected shape and the periphery of the selected object is arranged to be a minimum. The centre of the selected shape C may then be identified and the image of the expected object rotated about this centre in step D. In step E, points 2, 3, 4, 6 and 7 are identified e.g. by identifying variation in the gradient of the outline or periphery of the fish or by taking a line diametrically opposite such a point (e.g. point 3 diametrically opposite point 4 with respect to the principal axis between 2 and 6).
In step F, the depth data along the principal axis 2 to 6 may be identified from the image data.
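A rough sketch of the thresholding, region finding and ellipse fitting of steps B and C is given below, using OpenCV connected contours as a stand-in for the region finding operator described above. The library choice, threshold values and minimum region size are assumptions for illustration, not a reproduction of the patent's own software.

```python
import cv2
import numpy as np

def find_candidate_ellipse(depth_m: np.ndarray, z_min=0.5, z_max=1.5, min_area_px=500):
    """Threshold a depth image (metres, HxW float array) to a zone of interest,
    keep the largest connected region, and fit an ellipse to its contour.
    Returns the ellipse ((cx, cy), (major, minor), angle_deg) or None.
    """
    mask = ((depth_m >= z_min) & (depth_m <= z_max)).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < min_area_px or len(largest) < 5:
        return None                      # too small, or too few points to fit
    return cv2.fitEllipse(largest)       # centre, axes and orientation of fit
```

How well the fitted ellipse agrees with the extracted contour can then be used, as described above, to decide whether the region is accepted as a fish before key points are extracted.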
Figure 11 shows image data for fish at distances 0.5 metres, 1.0 metres, 1.5 metres, 2.0 metres and 2.5 metres wherein the distance "z" image, threshold regions and key fish points in an outline, are identified. It can be seen that at 2.5 metres identification of a fish has not been possible. Figure 12 shows image data including "z" data images, threshold regions identified, and images with key fish points identified on a peripheral outline of the fish for bending fish (towards and away from the camera).
In Figure 12 fish bending towards and away from the camera are shown with respective depth data. A corrected length L2 may be derived from the depth data, whereas only the apparent length L1 would be seen in the x, y data, and without at least one depth measurement, or some other way to judge scale, the actual length of L1 is uncertain. Similarly for fish swimming towards or away from the camera, as shown in Figure 13, a corrected length L2 may be derived using depth data associated with pixels along the perceived length L1 in x, y co-ordinates.
In Figure 14 a process 90 is described for obtaining a depth image and using depth information to estimate position and/or depth and/or shape and/or size and/or mass and/or number information of the selected objects. Firstly, in step 92, a depth image is obtained using a continuous wave modulated EM radiation source (optionally comprising multiple EM radiation emitting elements) and a camera comprising a matrix of EM radiation sensing elements (e.g. pixels in an array). One or more depth images may be combined (e.g. an average) to improve signal to noise ratio. This is especially useful if the illumination and rate of image capture is such that the objects have not (or have not significantly) moved or deformed. Typically, for fast moving objects such as fish only one or a few images may be so combined. Next, in step 94, the depth image is converted to x, y and z co-ordinates, x, y being in a plane (preferably a generally or substantially vertical plane) and z being perpendicular to the x, y plane (preferably in a generally or substantially horizontal plane). Optionally, in step 96, a threshold is applied to the z co-ordinates to eliminate (e.g. by making these zero or very large) those co-ordinates not falling within a particular range of depth, e.g. within a predetermined expected range of depths, or a range of depths determined from the image itself. So if a series of pixels has a range of depth 2.0 to 2.5m, and the rest are >3.0m, this may indicate that those pixels outside the 2.0 to 2.5m range can be discarded. Optionally, in an intermediate step (step 97, not shown), a region of adjacent pixels of similar depth may be identified using a region finding operator. Next, in step 98, a contour of an object is identified, typically, for example, in the x, y plane. Next, a suitable shape function is selected (e.g. for a fish, a suitable shape may be an ellipse) and this shape is applied to the x, y plane. Thus, as in step C of Figure 10, the outline of the fish is identified and an ellipse shape function is applied. Typically, by determining a minimal difference in size between the selected shape, such as an ellipse, and an observed outline, a decision can be made as to whether this is indeed the expected object (step 102). In step 101, optionally, the count of objects may be increased by one (and/or the number of such objects in an image may be determined, as a measure of number). In step 103, optionally, a principal axis (or other suitable dimension) is identified and the image data is rotated (e.g. about a vertical and/or horizontal axis about a centre point, such as 'C' in Figure 10), e.g. to a horizontal, before x, y and distance z data is extracted. Optionally, in step 104, key points on the expected object may be identified (e.g. key points at the intersection of the principal axis, or other suitable dimension, and the outline contour). Optionally, in steps 106 and 108, further key points, e.g. of contour features, are identified. In step 110, preferably, depth information is used to determine the actual length (rather than the perceived length) of a suitable axis or dimension.
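The rotation in step 103 can be illustrated by rotating the extracted contour about its centre by the negative of the fitted shape's orientation, so that the principal axis lies horizontal before dimensions are read off. The sketch below is a generic formulation under that assumption, not the patent's own implementation.

```python
import numpy as np

def rotate_to_horizontal(points_xy, angle_deg: float, centre_xy) -> np.ndarray:
    """Rotate contour points (N, 2) about centre_xy by -angle_deg so that a
    fitted principal axis (e.g. the ellipse orientation) lies horizontal.
    """
    theta = np.deg2rad(-angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    pts = np.asarray(points_xy, dtype=float)
    return (pts - centre_xy) @ rot.T + centre_xy

# Example: a contour tilted at 30 degrees is levelled about its centre (0, 0).
print(rotate_to_horizontal([[1.0, 0.577], [-1.0, -0.577]], 30.0, (0.0, 0.0)))
```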
When a ray of light is incident upon a surface, it will be reflected in a direction such that the angle of reflection is equal to the angle of incidence. Further, depending on the nature of the surface, some light may be absorbed, some may be scattered and some may be reflected at a somewhat different angle than the expected angle of reflection (e.g. due to imperfections in the reflecting surface).
Therefore a reflected ray will have an angular spread (in steradians).
In land based animal husbandry, the motion of the animals is typically limited to motion within a 3D volume having a defined "base" (namely the ground) upon which the animals can walk (crawl, move, etc). Thus, animal motion is confined by necessary contact with the ground. The location of land based animal(s) is therefore confined and relatively predictable, providing some simplification for visual inspection using cameras. Further, in much land based animal husbandry, animals tend to be relatively slow moving, slowly deformable with limited ranges of deformation possible, and their location may be easily controllable. However, there is a lot of ambient light, which can saturate conventional infrared 3D ToF cameras.
Underwater, animals (e.g. shellfish, fish, etc.) are unconfined (save for being within an enclosure) and can move freely in three dimensions. They are deformable (to facilitate swimming) and can move very quickly. Underwater there is little ambient light which beneficially aids discrimination when illuminating an object with conventional infrared 3D ToF cameras which use infrared.
The inventor(s) have appreciated that locating the source of incident EM radiation remotely (e.g. so that incident rays are at (or are approximately at) right angles to the expected path of reflected rays to a camera), can be problematic because this introduces another degree of freedom into image creation. This is thought to be because the fish may swim in any direction in relation to EM radiation source and, therefore, the orientation of the EM radiation source to the fish will be variable so the illumination of the fish will be variable. Furthermore, the inventor(s) have appreciated that even though conventional infrared 3D ToF cameras offer better discrimination against the background ambient light underwater (e.g. because red/infrared light from natural light is quickly absorbed) other selected wavelengths, and/or wavelength ranges, may offer a better solution. The inventor(s) have also appreciated that increasing the intensity of the 'light' source and/or controlling the location of the light source with respect to the matrix of pixels and/or controlling the absorption path(s) between 'light' source and object of interest and between object of interest and camera, can be beneficial in producing images of good resolution even when red/infra-red sources are used, with reduced risk of saturation.
Further, the inventor(s) have appreciated that, even though the response of conventional 3D ToF cameras using silicon are thought to offer the best response in the red/infrared ranges, other selected wavelength(s) and/or wavelength ranges may still be observed by the pixels (light sensing elements) of such cameras and can provide good images underwater, especially within controlled conditions (e.g. by controlling the location of the light source with respect to the matrix of pixels and/or by providing a predetermined region of interest (e.g. delimited in at least one dimension by a physical barrier) and/or by controlling the absorption path(s) between 'light' source and the region of interest and/or between the region of interest and the camera).
Further, the inventor(s) have also appreciated that offsetting the drop in responsiveness of the silicon at the selected wavelengths of interest, by increasing the intensity of the 'light' source and/or by controlling the location of the light source with respect to the matrix of pixels and/or by controlling the absorption path(s) between 'light' source and object of interest and between object of interest and camera, can be beneficial in producing images of good resolution.
Figure 15 shows schematic plan and cross-sectional elevation views of an underwater enclosure (e.g. here a fish enclosure) for aquatic animals. Whilst the invention(s) find particular application to fish enclosures for farmed fish such as salmon, the invention can be used, with suitable variations, in other types of underwater animal enclosures. Here a (generally or substantially) circular fish enclosure 14 has apparatus according to one embodiment of the invention generally, or substantially, centrally located within the fish enclosure (here suspended on one or more wire ropes 15 approximately equidistant from the walls of the enclosure). Other shapes of fish enclosure (other than circular) may be used, although it is preferred that the fish enclosure has a smoothly varying profile (in plan view), so that the fish, which tend to swim about the periphery of the enclosure, are smoothly guided towards a volume of interest 200 and the directions of approach to and from the volume of interest are somewhat limited. This improves consistency of illumination of the fish within the designated volume of interest 200. In Figure 15, the apparatus 120 comprises a 3D ToF camera 20 and two associated modulated EM radiation sources 230A, 230B, each arranged horizontally on opposing sides of the 3D ToF camera 20. Whilst one source (230A or 230B) to one side of camera 20 may be provided, it is preferred if two sources are provided, one on each side of the camera, and that these are arranged so that the incident EM radiation from each illuminates the volume of interest 200, overlapping one another in a region defining the volume of interest in front of an exit pupil 28 of camera 20. EM radiation from each source 230A, 230B is preferably of high intensity and may comprise multiple correlated radiation emitting elements such as LEDs or laser diodes, with a preferred radiant flux intensity of each LED being over 800mW, and preferably around 1000mW. Preferably at least four, or more preferably at least six, or more preferably still at least eight, or more preferably eight to twelve, radiation emitting elements of high radiant flux are provided on each side of camera 20. Examples of LEDs that may be used include the OSRAM SFH4715S, which has a centroid wavelength at 850nm and a total radiant flux of 1030mW.
It is expected that the amount of sunlight at sea level would be around 75 Watts per m2, and this can penetrate below the surface, introducing background illumination and so background noise to the ToF measurements. Therefore it is preferred if the measurements underwater are made at ≥1m (±10%), or more preferably ≥1.5m (±10%), or more preferably ≥2m (±10%), below the surface of the water. Ropes 15 are arranged accordingly.
Within each source (in this case each source sub-unit) 230A, 230B, radiation emitting elements may be provided in one or two or more horizontal rows so as to provide multiple overlapping cones of illumination extending horizontally and, to a more limited extent, vertically, which combine to form illumination cones 1 and 2 (216A and 216B) from sources 230A and 230B respectively. In this way fish travelling about the periphery of fish enclosure 14 are illuminated with modulated EM radiation consistently and generally evenly from both sides when these are in the volume of interest 200 in front of the exit pupil 28 of the camera 20. The exit pupil 28 has a field of view (also known as a camera viewing cone) 218 which lies in front of exit pupil 28 and expands further away from the exit pupil 28.
The volume of interest 200 is defined as the region of overlap between the illumination from the one or more radiation source sub-units 230A, 230B and the field of view (or camera viewing cone) 218 of camera 20. This region is bounded along its rear side (with respect to the camera) by the wall of the fish enclosure, and it is preferred that the wall of the fish enclosure is approximately flat, or only very slightly and/or gently curved, in this region, and that the entry and exit to this region are also confined by smoothly, continuously varying enclosure walls in this region. This arrangement channels the fish into the volume of interest 200 and the fish are more likely to approach this region (and be presented to the camera apparatus 120) in more or less the same way. The fish are still free to move, but the range of variation in the fish movement is likely to be more limited.
The camera 20 and one or more radiation sources 230A, 230B are co-located such that the field of view (or viewing cone) 218 of camera 20, and the illumination cones 216A, 216B of source sub-units 230A, 230B, are approximately co-incident and of similar lengths from the sub-unit or camera to the volume of interest. Typically, these differ in angular separation (the measured angular separation of lines from the centre of the source sub-unit(s) and camera to the same point on the wall of the enclosure) by 30°, or more preferably 20°, or more preferably 0°.
The present invention in one or more embodiments provides consistent, high intensity illumination of objects of interest within a predetermined volume of interest. The invention in one or more embodiments also provides two absorption paths (one for incident light and one for reflected light).
The location of the EM radiation source, or one or more source sub-units 230A, 230B, or 330A, 330B in Figure 16, may be varied (separately or together) with respect to the camera in a horizontal plane, so that these are closer to and/or further away from the volume of interest 200. This enables improved control of the appropriate levels of light to be directed to the volume of interest 200, so that a reduced likelihood of saturation of the pixels of camera 20 is achievable.
In Figure 16, a different arrangement is shown which may be used as an alternative, or in addition, to the embodiment of Figure 15. In this embodiment two modulated EM radiation source sub-units 330A, 330B are provided, one above and one below camera 20, illuminating a region of interest 200 opposite exit pupil 28 from above and below. Optionally, just one of the upper and lower source sub-units 330A, 330B may be used. Similar to those in Figure 15, multiple EM radiation emitting elements (e.g. LEDs or laser diodes) may be used within each source sub-unit 330A, 330B. Like source sub-units 230A, 230B, these may be of sufficiently high intensity and/or number to overcome absorption along the absorption path of the incident illumination cone 316A, 316B to the volume of interest, and along the absorption path in the reflected radiation viewing cone 318 back to the exit pupil 28 of 3D ToF camera 20, so that a depth image can be formed from the phase information received.
Similar to Figure 15, alternatively, or in addition, one or both light source sub-units 330A and 330B may be located further away from, or more preferably closer to, the volume of interest 200 bounded by the rear wall of the fish enclosure 14 (not shown). Preferably, light source sub-units 230A, 230B (in Figure 15) and/or 330A, 330B (in Figure 16) are at the same distance as camera 20 to the volume of interest 200. Optionally, the lower light source 330B is closer to the volume of interest 200 and/or has the same or a greater number of EM radiation emitting elements as the upper source 330A, or vice versa.
The wavelengths of light selected for use by the modulated EM radiation source preferably include at least one wavelength component with a wavelength (in air) less than that of visible red light and preferably greater than that of blue light. This may be provided by laser diodes, or more preferably LEDs. These form individual EM radiation emitting elements (and may be grouped into source sub-units) which are correlated so as to act as a single radiation source. Typically these are arranged about the camera and are directed towards a predetermined region of interest (which may be, e.g., 3-4 average fish lengths wide and 2-3 average fish lengths high, bounded by the perimeter of the fish enclosure, e.g. 3-5m away from the LEDs and camera). Preferably at least 16 LEDs are provided, each having at least 800mW and more preferably 1000mW of radiant flux. Preferably the LEDs are physically co-located with the 3D ToF camera within a waterproof housing. Preferably, the LEDs illuminate the region of interest from within the housing via the same viewing panel through which the camera observes the region of interest. Whilst it is expected that LEDs of the same colour will be used, LEDs of differing colours or LEDs offering multiple colour output (e.g. white LEDs) may be used. Preferably at least one wavelength component (of the EM radiation source in air, e.g. of at least one radiation emitting element) has a wavelength less than that of visible red light and preferably also greater than that of visible blue light, e.g. a wavelength of visible orange light and/or visible yellow light and/or green light, and/or visible blue-green light (although this is less preferred). Preferably at least one wavelength component (of the EM radiation source, e.g. of at least one radiation emitting element) has a wavelength of less than 620 nm and/or less than 600nm and/or less than 580 nm and/or greater than 490nm and/or a wavelength of between 490 and 620 nm (from blue-green to orange) and/or a wavelength of between 510 and 620 nm (from green to orange) and/or a wavelength of between 510 and 600 nm (from green to yellow) and/or a wavelength of between 510 and 580 nm (green).
Typically, the modulated EM radiation source is configured to preferentially illuminate an object of interest within an expected region (e.g. a volume) of interest e.g. by directing the EM radiation source, or more particularly the radiation emitting elements towards the region of interest, and preferably providing sufficient numbers and/or intensity of radiation emitting elements (LEDs) to overcome the expected absorption to and from the region of interest.
The region of interest is delimited by one or more smoothly varying physical barrier(s) such as the rear of the fish enclosure and/or illumination from the radiation source, or directed illumination from the radiation source.
A lens or lens combination may be used to further or more clearly define the field of view of the camera 20 and enable an image to be formed on the matrix of pixels within the camera 20. By illuminating and viewing from the same general direction, and delimiting the region by at least one physical barrier (preferably to the rear), this configuration assists in defining a region of interest that can be better and more evenly illuminated with light of sufficient intensity, more effectively and efficiently, to counter the effect of absorption along incident and reflected absorption paths, improving discrimination against the background and reducing the likelihood of light saturation within the matrix of pixels of the camera.