7 results for Dorrington, Adrian A., Conference item, 2010

  • Proof of concept of diffuse optical tomography using time-of-flight range imaging cameras

    Hassan, Ahmad; Künnemeyer, Rainer; Dorrington, Adrian A.; Payne, Andrew D. (2010)

    Conference item
    University of Waikato

    Diffuse optical tomography is an optical technique for creating 3-dimensional images of the inside of highly scattering material. Research groups around the world have been developing imaging systems using various source-detector arrangements to determine the optical properties of biological tissue, with a focus on medical applications. In this paper we investigate whether a range imaging camera can be used as a detector array, employing time-of-flight range imaging cameras instead of the conventional source-detector array used by others. The results provided in this paper show reconstructed images of absorption and reduced scattering of an object submerged in a tissue-simulating phantom. Using the XZ422 Demonstrator ranging camera and the NIRFAST software package, we reconstructed 2D images of a 6 mm metal rod submerged in the centre of a 5 cm deep tank filled with 1% Intralipid™. We have shown for the first time that range imaging cameras can replace the traditional detectors in diffuse optical tomography.

  • Volume measurement using 3D Range Imaging

    Shrivastava, Vipul; Cree, Michael J.; Dorrington, Adrian A. (2010)

    Conference item
    University of Waikato

    3D range imaging has widespread applications, one of which is determining the volumes of objects. In this paper, 3D range imaging is used to find the volumes of different objects using two algorithms based on a straightforward means of calculating volume. The algorithms implemented successfully calculate volume provided that the objects have uniform colour; objects with multi-coloured and glossy surfaces presented particular difficulties in determining volume.

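    The "straightforward means to calculate volume" that the abstract above alludes to can be sketched as follows. This is a hypothetical illustration under assumed units, not the paper's algorithm: treat each pixel of a calibrated range image as a small column of known footprint and sum the column heights over the object.

```python
import numpy as np

def volume_from_heightmap(heights, pixel_area):
    """heights: 2D array of object heights above a reference plane (m);
    pixel_area: real-world area covered by one pixel (m^2)."""
    heights = np.clip(heights, 0.0, None)  # ignore pixels below the plane
    return float(heights.sum() * pixel_area)

# A 10 cm tall block sampled on a 100x100 grid of 1 mm^2 pixels:
block = np.full((100, 100), 0.1)           # 0.1 m tall everywhere
print(volume_from_heightmap(block, 1e-6))  # ≈ 0.001 m^3 (one litre)
```

    As the abstract notes, this only works if the per-pixel range values are reliable, which is why glossy and multi-coloured surfaces cause difficulties.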
  • Analysis of ICP variants for the registration of partially overlapping time-of-flight range images

    Larkins, Robert L.; Cree, Michael J.; Dorrington, Adrian A. (2010)

    Conference item
    University of Waikato

    The iterative closest point (ICP) algorithm is one of the most commonly used methods for registering partially overlapping range images. Nevertheless, this algorithm was not originally designed for this task, and many variants have been proposed in an effort to improve its proficiency. The relatively new full-field amplitude-modulated time-of-flight range imaging cameras present further complications to registration in the form of measurement errors due to mixed and scattered light. This paper investigates the effectiveness of the most common ICP variants applied to range image data acquired from full-field range imaging cameras. The original ICP algorithm combined with boundary rejection performed the same as or better than the majority of variants tested; in fact, many of these variants proved to degrade the registration alignment.

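    For reference, the baseline point-to-point ICP that the variants above build on can be sketched as follows (a minimal illustration, not the paper's code): alternate between nearest-neighbour matching and a closed-form rigid alignment of the matched pairs via SVD (the Kabsch/Horn solution).

```python
import numpy as np

def icp(src, dst, iters=20):
    """Align point cloud src (N x 3) onto dst (M x 3); returns (R, t)."""
    src = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # nearest-neighbour correspondences (brute force, for clarity)
        d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # closed-form rigid transform minimising matched-pair distances
        mu_s, mu_d = src.mean(0), matched.mean(0)
        H = (src - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = mu_d - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

    Variants of the kind the paper evaluates change individual stages of this loop, e.g. rejecting boundary correspondences before the SVD step.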
  • Resolving depth measurement ambiguity with commercially available range imaging cameras

    McClure, Shane H.; Cree, Michael J.; Dorrington, Adrian A.; Payne, Andrew D. (2010)

    Conference item
    University of Waikato

    Time-of-flight range imaging is typically performed with the amplitude modulated continuous wave method. This involves illuminating a scene with amplitude modulated light. Reflected light from the scene is received by the sensor with the range to the scene encoded as a phase delay of the modulation envelope. Due to the cyclic nature of phase, an ambiguity in the measured range occurs every half wavelength in distance, thereby limiting the maximum usable range of the camera. This paper proposes a procedure to resolve depth ambiguity using software post-processing. First, the range data is processed to segment the scene into separate objects. The average intensity of each object can then be used to determine which pixels are beyond the non-ambiguous range. The results demonstrate that depth ambiguity can be resolved for various scenes using only the available depth and intensity information. The proposed method reduces sensitivity to objects with very high and very low reflectance, normally a key problem with basic threshold approaches. The approach is very flexible as it can be used with any range imaging camera. Furthermore, capture time is not extended, keeping the artifacts caused by moving objects to a minimum. This makes it suitable for applications such as robot vision where the camera may be moving during captures. The key limitation of the method is its inability to distinguish between two overlapping objects that are separated by a distance of exactly one non-ambiguous range. Overall the reliability of this method is higher than the basic threshold approach, but not as high as the multiple frequency method of resolving ambiguity.

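    The phase-to-distance relationship underlying this ambiguity can be sketched as follows (a minimal illustration with assumed example values, not code from the paper): the measured phase repeats every half modulation wavelength, and ambiguity resolution amounts to recovering an unknown integer number of wraps.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def unambiguous_range(f_mod):
    """Maximum distance before the phase wraps: d_max = c / (2 * f_mod)."""
    return C / (2.0 * f_mod)

def phase_to_distance(phase, f_mod, n_wraps=0):
    """Distance from the measured phase (radians); n_wraps is the unknown
    integer that ambiguity-resolution methods try to recover."""
    return (phase / (2.0 * math.pi) + n_wraps) * unambiguous_range(f_mod)

# At 30 MHz modulation the camera wraps roughly every 5 m:
print(round(unambiguous_range(30e6), 3))  # 4.997
```

    The paper's contribution is a way of choosing `n_wraps` per object from segmentation and average intensity, rather than from extra captures at a second modulation frequency.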
  • Multiple range imaging camera operation with minimal performance impact

    Whyte, Refael Z.; Payne, Andrew D.; Dorrington, Adrian A.; Cree, Michael J. (2010)

    Conference item
    University of Waikato

    Time-of-flight range imaging cameras operate by illuminating a scene with amplitude modulated light and measuring the phase shift of the modulation envelope between the emitted and reflected light. Object distance can then be calculated from this phase measurement. This approach does not work in multiple camera environments as the measured phase is corrupted by the illumination from other cameras. To minimize inaccuracies in multiple camera environments, replacing the traditional cyclic modulation with pseudo-noise amplitude modulation has previously been demonstrated. However, this technique effectively reduces the modulation frequency, decreasing the distance measurement precision (which is proportional to the modulation frequency). A new modulation scheme, using maximum length pseudo-random sequences binary phase encoded onto the existing cyclic amplitude modulation, is presented. The effective modulation frequency therefore remains unchanged, providing range measurements with high precision. The effectiveness of the new modulation scheme was verified using a custom time-of-flight camera based on the PMD19-K2 range imaging sensor. The new pseudo-noise modulation shows no significant performance decrease in a single camera environment; in a two camera environment, the precision is reduced only by the increased photon shot noise from the second illumination source.

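    The encoding idea can be roughly illustrated as follows (a hypothetical sketch, not the authors' implementation, with assumed frequencies): a maximum length sequence generated by a linear feedback shift register flips the carrier phase by 180 degrees per chip, so the carrier frequency itself is unchanged.

```python
import numpy as np

def mls(taps=(7, 6), n=7):
    """Maximum length sequence from a Fibonacci LFSR, as +/-1 chips."""
    state = [1] * n
    out = []
    for _ in range(2 ** n - 1):
        out.append(1 - 2 * state[-1])                # bit -> +/-1
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return np.array(out)

def pn_modulation(f_carrier, chip_rate, t):
    """Carrier with each chip of the m-sequence binary phase encoded
    (multiplying by -1 is a 180 degree phase flip)."""
    chips = mls()
    idx = (np.floor(t * chip_rate) % len(chips)).astype(int)
    return chips[idx] * np.cos(2 * np.pi * f_carrier * t)
```

    Because different cameras can use different (nearly uncorrelated) sequences, interfering illumination averages towards zero while each camera's own phase measurement is preserved.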
  • Calibration and control of a robot arm using a range imaging camera

    Kelly, Cameron B.D.; Dorrington, Adrian A.; Cree, Michael J.; Payne, Andrew D. (2010)

    Conference item
    University of Waikato

    Time of flight range imaging is an emerging technology that has numerous applications in machine vision. In this paper we cover the use of a commercial time of flight range imaging camera for calibrating a robotic arm. We do this by identifying retro-reflective targets attached to the arm and centroiding on calibrated spatial data, which allows precise measurement of three dimensional target locations. The robotic arm is an inexpensive model without positional feedback, so a series of movements is performed to calibrate the servo signals against the physical position of the arm. The calibration showed a good linear response between the control signal and the servo angles, and the calibration procedure also provided a transformation between the camera and arm coordinate systems. Inverse kinematic control was then used to position the arm. The range camera could also be used to identify objects in the scene; with an object's location known in the arm's coordinate system (transformed from the camera's coordinate system), the arm was able to move to and grasp the object.

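    The linear servo calibration described above can be sketched as a simple least-squares fit. The pulse widths and angles below are illustrative assumptions, not measurements from the paper; in the paper's setup the angles would come from the camera-tracked retro-reflective targets.

```python
import numpy as np

def fit_servo(signals, angles):
    """Least-squares fit of angle = a * signal + b."""
    a, b = np.polyfit(signals, angles, 1)
    return a, b

signals = np.array([1000.0, 1250.0, 1500.0, 1750.0, 2000.0])  # pulse widths (us)
angles = np.array([-90.0, -45.0, 0.0, 45.0, 90.0])            # measured angles (deg)
a, b = fit_servo(signals, angles)
print(a, b)  # ≈ 0.18 deg/us, ≈ -270 deg
```

    Inverting this map gives the control signal needed to command a desired joint angle, which is what the inverse kinematic control then consumes.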
  • Blind deconvolution of depth-of-field limited full-field lidar data by determination of focal parameters

    Godbaz, John Peter; Cree, Michael J.; Dorrington, Adrian A. (2010)

    Conference item
    University of Waikato

    We present a new two-stage method for parametric, spatially variant blind deconvolution of full-field Amplitude Modulated Continuous Wave lidar image pairs taken at different aperture settings and subject to limited depth of field. A Maximum Likelihood-based focal parameter determination algorithm uses range information to reblur the image taken with the smaller aperture to match the large-aperture image. This allows estimation of focal parameters without prior calibration of the optical setup and produces blur estimates with better spatial resolution and less noise than previous depth from defocus (DFD) blur measurement algorithms. We compare blur estimates from the focal parameter determination method to those from Pentland's DFD method, Subbarao's S-Transform method, and estimates from range data and the sampled point spread function. In a second stage, the estimated focal parameters are applied to the deconvolution of total integrated intensity lidar images, improving depth of field. We give an example of application to complex domain lidar images and discuss the trade-off between recovered amplitude texture and sharp range estimates.
