49 results for Cree, Michael J., Conference item

  • Colour image processing and texture analysis on images of porterhouse steak meat

    Streeter, Lee V.; Burling-Claridge, G. Robert; Cree, Michael J. (2005)

    Conference item
    University of Waikato

    This paper outlines two colour image processing and texture analysis techniques applied to meat images and assessment of error due to the use of JPEG compression at image capture. JPEG error analysis was performed by capturing TIFF and JPEG images, then calculating the RMS difference and applying a calibration between block boundary features and subjective visual JPEG scores. Both scores indicated high JPEG quality. Correction of JPEG blocking error was trialled and found to produce minimal improvement in the RMS difference. The texture analysis methods used were singular value decomposition over pixel blocks and complex cell analysis. The block singular values were classified as meat or non-meat by Fisher linear discriminant analysis with the colour image processing result used as ‘truth.’ Using receiver operator characteristic (ROC) analysis, an area under the ROC curve of 0.996 was obtained, demonstrating good correspondence between the colour image processing and the singular values. The complex cell analysis indicated a ‘texture angle’ expected from human inspection.

    View record details
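
As a point of reference for the JPEG error analysis described in the record above, the following is a minimal sketch (not the authors' code) of computing the RMS difference between a lossless reference image and its JPEG-compressed counterpart; the quality setting and the synthetic test image are illustrative assumptions.

```python
# Hedged sketch: round-trip a reference image through JPEG and report the RMS pixel difference.
import io
import numpy as np
from PIL import Image

def jpeg_rms_difference(reference: np.ndarray, quality: int = 90) -> float:
    """Compress `reference` to JPEG at the given quality, decode it again, and
    return the root-mean-square pixel difference."""
    buf = io.BytesIO()
    Image.fromarray(reference).save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    compressed = np.asarray(Image.open(buf), dtype=np.float64)
    return float(np.sqrt(np.mean((reference.astype(np.float64) - compressed) ** 2)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)  # stand-in for a TIFF capture
    print(f"RMS difference at quality 90: {jpeg_rms_difference(img, 90):.2f}")
```
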
  • Analysis of binning of normals for spherical harmonic cross-correlation

    Larkins, Robert L.; Cree, Michael J.; Dorrington, Adrian A. (2012)

    Conference item
    University of Waikato

    Spherical harmonic cross-correlation is a robust registration technique that uses the normals of two overlapping point clouds to bring them into coarse rotational alignment. This registration technique, however, has a high computational cost as spherical harmonics need to be calculated for every normal. By binning the normals, the computational efficiency is improved as the spherical harmonics can be pre-computed and cached at each bin location. In this paper we evaluate the efficiency and accuracy of the equiangle grid, icosahedron subdivision and the Fibonacci spiral, an approach we propose. It is found that the equiangle grid has the best efficiency as it can perform direct binning, followed by the Fibonacci spiral and then the icosahedron, all of which decrease the computational cost compared to no binning. The Fibonacci spiral produces the highest achieved accuracy of the three approaches while maintaining a low number of bins. The number of bins allowed by the equiangle grid and icosahedron is much more restricted than for the Fibonacci spiral. The performed analysis shows that the Fibonacci spiral can perform as well as the original cross-correlation algorithm without binning, while also providing a significant improvement in computational efficiency.

    View record details
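
The Fibonacci spiral binning evaluated in the record above can be sketched as follows: approximately uniform bin centres are generated on the unit sphere and each normal is assigned to the nearest centre. This is a hedged illustration, not the authors' implementation, and the bin count is an arbitrary assumption.

```python
# Hedged sketch: Fibonacci-spiral bin centres on the unit sphere and nearest-centre binning of normals.
import numpy as np

def fibonacci_sphere(n_bins: int) -> np.ndarray:
    """Return n_bins approximately uniform unit vectors on the sphere."""
    i = np.arange(n_bins)
    golden = (1.0 + 5.0 ** 0.5) / 2.0
    z = 1.0 - 2.0 * (i + 0.5) / n_bins          # evenly spaced heights in [-1, 1]
    theta = 2.0 * np.pi * i / golden            # golden-angle azimuth steps
    r = np.sqrt(1.0 - z * z)
    return np.column_stack((r * np.cos(theta), r * np.sin(theta), z))

def bin_normals(normals: np.ndarray, centres: np.ndarray) -> np.ndarray:
    """Assign each unit normal to the nearest bin centre (largest dot product)."""
    return np.argmax(normals @ centres.T, axis=1)

if __name__ == "__main__":
    centres = fibonacci_sphere(512)
    normals = np.random.default_rng(1).normal(size=(1000, 3))
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    counts = np.bincount(bin_normals(normals, centres), minlength=len(centres))
    print("occupied bins:", np.count_nonzero(counts))
```
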
  • Separating true range measurements from multi-path and scattering interference in commercial range cameras

    Dorrington, Adrian A.; Godbaz, John Peter; Cree, Michael J.; Payne, Andrew D.; Streeter, Lee V. (2011)

    Conference item
    University of Waikato

    Time-of-flight range cameras acquire a three-dimensional image of a scene simultaneously for all pixels from a single viewing location. Attempts to use range cameras for metrology applications have been hampered by the multi-path problem, which causes range distortions when stray light interferes with the range measurement in a given pixel. Correcting multi-path distortions by post-processing the three-dimensional measurement data has been investigated, but enjoys limited success because the interference is highly scene dependent. An alternative approach based on separating the strongest and weaker sources of light returned to each pixel, prior to range decoding, is more successful, but has only been demonstrated on custom built range cameras, and has not been suitable for general metrology applications. In this paper we demonstrate an algorithm applied to two unmodified off-the-shelf range cameras, the Mesa Imaging SR-4000 and the Canesta Inc. XZ-422 Demonstrator. Additional raw images are acquired and processed using an optimization approach, rather than relying on the processing provided by the manufacturer, to determine the individual component returns in each pixel. Substantial improvements in accuracy are observed, especially in the darker regions of the scene.

    View record details
  • Volume measurement using 3D Range Imaging

    Shrivastava, Vipul; Cree, Michael J.; Dorrington, Adrian A. (2010)

    Conference item
    University of Waikato

    3D range imaging has widespread applications, one of which is providing information about the volumes of objects. In this paper, 3D range imaging is used to determine the volumes of different objects using two algorithms based on a straightforward means of calculating volume. The algorithms implemented successfully calculate volume for objects of uniform colour. Objects with multi-coloured or glossy surfaces presented particular difficulties in determining volume.

    View record details
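
A minimal sketch of the kind of straightforward volume calculation referred to in the record above, under the assumption of a flat background plane and a fixed per-pixel footprint (the paper's exact geometry handling is not reproduced here):

```python
# Hedged sketch: estimate object volume from a range image by subtracting a
# reference plane and summing per-pixel height times the pixel footprint area.
import numpy as np

def volume_from_range(depth: np.ndarray, background_depth: float,
                      pixel_area_m2: float) -> float:
    """Estimate the volume (m^3) of material standing above a flat background."""
    height = np.clip(background_depth - depth, 0.0, None)  # metres above the plane
    return float(np.sum(height) * pixel_area_m2)

if __name__ == "__main__":
    depth = np.full((100, 100), 1.0)          # flat background 1 m from the camera
    depth[40:60, 40:60] = 0.9                 # a 10 cm tall block covering 20x20 pixels
    print(volume_from_range(depth, 1.0, 0.005 ** 2))  # assumed 5 mm pixel footprint
```
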
  • Illumination waveform optimization for time-of-flight range imaging cameras

    Payne, Andrew D.; Dorrington, Adrian A.; Cree, Michael J. (2011)

    Conference item
    University of Waikato

    Time-of-flight range imaging sensors acquire an image of a scene, where in addition to standard intensity information, the range (or distance) is also measured concurrently by each pixel. Range is measured using a correlation technique, where an amplitude modulated light source illuminates the scene and the reflected light is sampled by a gain modulated image sensor. Typically the illumination source and image sensor are amplitude modulated with square waves, leading to a range measurement linearity error caused by aliased harmonic components within the correlation waveform. A simple method to improve measurement linearity by reducing the duty cycle of the illumination waveform to suppress problematic aliased harmonic components is demonstrated. If the total optical power is kept constant, the measured correlation waveform amplitude also increases at these reduced illumination duty cycles. Measurement performance is evaluated over a range of illumination duty cycles, both for a standard range imaging camera configuration, and also using a more complicated phase encoding method that is designed to cancel aliased harmonics during the sampling process. The standard configuration benefits from improved measurement linearity for illumination duty cycles around 30%, while the measured amplitude, hence range precision, is increased for both methods as the duty cycle is reduced below 50% (while maintaining constant optical power).

    View record details
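
The record above attributes the linearity improvement to suppression of aliased harmonics at a reduced illumination duty cycle. As a hedged illustration (assuming constant peak amplitude rather than the paper's constant optical power), the n-th Fourier magnitude of a rectangular wave with duty cycle d scales as |sin(pi*n*d)|/n, so a duty cycle near one third suppresses every third harmonic:

```python
# Hedged sketch: relative harmonic magnitudes of a rectangular illumination waveform
# versus duty cycle (constant peak amplitude assumed).
import numpy as np

def harmonic_magnitudes(duty: float, n_max: int = 9) -> np.ndarray:
    """Relative magnitude of harmonics 1..n_max for a rectangular wave of the given duty cycle."""
    n = np.arange(1, n_max + 1)
    return np.abs(np.sin(np.pi * n * duty)) / n

if __name__ == "__main__":
    for d in (0.5, 1.0 / 3.0):
        mags = harmonic_magnitudes(d)
        print(f"duty {d:.2f}:",
              " ".join(f"h{n}={m:.3f}" for n, m in enumerate(mags, start=1)))
```
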
  • A power-saving modulation technique for time-of-flight range imaging sensors

    Conroy, Richard M.; Dorrington, Adrian A.; Payne, Andrew D.; Künnemeyer, Rainer; Cree, Michael J. (2011)

    Conference item
    University of Waikato

    Time-of-flight range imaging cameras measure distance and intensity simultaneously for every pixel in an image. With the continued advancement of the technology, a wide variety of new depth sensing applications are emerging; however, a number of these potential applications have stringent electrical power constraints that are difficult to meet with the current state-of-the-art systems. Sensor gain modulation contributes a significant proportion of the total image sensor power consumption, and as higher spatial resolution range image sensors operating at higher modulation frequencies (to achieve better measurement precision) are developed, this proportion is likely to increase. The authors have developed a new sensor modulation technique using resonant circuit concepts that is more power efficient than the standard mode of operation. With a proof of principle system, a 93–96% reduction in modulation drive power was demonstrated across a range of modulation frequencies from 1–11 MHz. Finally, an evaluation of the range imaging performance revealed an improvement in measurement linearity in the resonant configuration due primarily to the more sinusoidal shape of the resonant electrical waveforms, while the average precision values were comparable between the standard and resonant operating modes.

    View record details
  • Shape and deformation measurement using heterodyne range imaging technology

    Conroy, Richard M.; Dorrington, Adrian A.; Cree, Michael J.; Künnemeyer, Rainer; Gabbitas, Brian (2006-11)

    Conference item
    University of Waikato

    Range imaging is emerging as a promising alternative technology for applications that require non-contact visual inspection of object deformation and shape. Previously, we presented a solid-state full-field heterodyne range imaging device capable of capturing three-dimensional images with sub-millimetre range resolution. Using a heterodyne indirect time-of-flight configuration, this system simultaneously measures distance (and intensity) for each pixel in a camera’s field of view. In this paper we briefly describe our range imaging system, and its principle of operation. By performing measurements on several metal objects, we demonstrate the potential capabilities of this technology for surface profiling and deformation measurement. In addition to verifying system performance, the reported examples highlight some important system limitations. With these in mind we subsequently discuss the further developments required to enable the use of this device as a robust and practical tool in non-destructive testing and measurement applications.

    View record details
  • Analysis of ICP variants for the registration of partially overlapping time-of-flight range images

    Larkins, Robert L.; Cree, Michael J.; Dorrington, Adrian A. (2010)

    Conference item
    University of Waikato

    The iterative closest point (ICP) algorithm is one of the most commonly used methods for registering partially overlapping range images. Nevertheless, this algorithm was not originally designed for this task, and many variants have been proposed in an effort to improve its proficiency. The relatively new full-field amplitude-modulated time-of-flight range imaging cameras present further complications to registration in the form of measurement errors due to mixed and scattered light. This paper investigates the effectiveness of the most common ICP variants applied to range image data acquired from full-field range imaging cameras. The original ICP algorithm combined with boundary rejection performed the same as or better than the majority of variants tested. In fact, many of these variants proved to degrade the registration alignment.

    View record details
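
For readers unfamiliar with the baseline algorithm studied in the record above, here is a compact point-to-point ICP sketch using nearest-neighbour correspondences and an SVD-based rigid update; it omits the boundary rejection and the other variants the paper evaluates, and the test data are synthetic.

```python
# Hedged sketch: minimal point-to-point ICP (k-d tree correspondences, Kabsch rigid update).
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t mapping src onto dst."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - mu_s).T @ (dst - mu_d))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_d - R @ mu_s

def icp(source: np.ndarray, target: np.ndarray, iterations: int = 30) -> np.ndarray:
    """Iteratively align `source` to `target`; returns the transformed source points."""
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(src)              # closest-point correspondences
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
    return src

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    target = rng.normal(size=(500, 3))
    a = np.deg2rad(10.0)
    Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
    source = target @ Rz.T + np.array([0.1, -0.2, 0.05])
    aligned = icp(source, target)
    print("mean residual:", np.linalg.norm(aligned - target, axis=1).mean())
```
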
  • Monte Carlo evaluation of a Compton camera in breast and brain imaging

    Uche, Chibueze Zimuzo; Round, W. Howell; Cree, Michael J. (2013)

    Conference item
    University of Waikato

    Background: The Compton camera is increasingly becoming the subject of investigation for possible implementation in nuclear medical imaging. It is likely to have advantages over the Anger camera in medical imaging. However, very little has been done to characterize its performance for specific medical imaging techniques. There is therefore a need to fill in the gaps in knowledge relating to realistic evaluation of the viability of the camera for nuclear medical imaging. Objective: The present study has sought to explore the viability of a prototype Compton camera in breast and brain imaging. Methods: The GEANT4 simulation software was used to model the radiation transport and interactions with matter. Simulations were carried out of a Si/CZT Compton camera being used in breast and brain imaging. In order to study a challenging detection case, the volumes of two simulated breast tumours were chosen to be 0.65 mL, and embedded in the medial region of the breast. For the brain imaging, a multitracing approach was used, and imaging was done parallel to the orbitomeatal line of the brain. Results: The results suggest that the Compton camera would visualize small breast tumours of about 0.65 mL volume, placed at the medial region of an average compressed human breast. Although brain imaging using the Compton camera seems to be promising, analyses suggest, however, that beyond a distance difference of 2 cm between two brain tumours, there may be a need to rotate the camera around the human head for efficient brain imaging. Conclusions: It is envisioned that with further work, the Compton camera could replace the Anger camera in breast and brain imaging.

    View record details
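
As background to the record above (not stated in the abstract itself): a Compton camera constrains each photon's origin to a cone whose half-angle follows from the energies deposited in the scatterer and absorber via standard Compton kinematics. A small sketch of that relation, with illustrative energies:

```python
# Hedged sketch: Compton cone half-angle from the two energy deposits of one event.
import math

M_E_C2_KEV = 511.0   # electron rest energy (keV)

def compton_cone_angle(e1_kev: float, e2_kev: float) -> float:
    """Scattering (cone half-) angle in degrees, given the energy deposited in the
    scatterer (e1) and the energy absorbed in the second detector layer (e2)."""
    e0 = e1_kev + e2_kev                      # incident photon energy
    cos_theta = 1.0 - M_E_C2_KEV * (1.0 / e2_kev - 1.0 / e0)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

if __name__ == "__main__":
    # e.g. a 140.5 keV photon depositing 30 keV in the scatterer
    print(f"cone half-angle: {compton_cone_angle(30.0, 110.5):.1f} degrees")
```
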
  • Verification of multi-view point-cloud registration for spherical harmonic cross-correlation

    Larkins, Robert L.; Cree, Michael J.; Dorrington, Adrian A. (2012)

    Conference item
    University of Waikato

    Spherical harmonic cross-correlation is a robust registration algorithm that brings two point-clouds of the same scene into coarse rotational alignment. The found rotation however may not give the desired alignment, as misalignments can occur if there is not enough overlap between point-clouds, or if they contain a form of symmetry. We propose a verification method whose purpose is to determine whether registration has failed for a priori unknown registrations. The rotational transformations between multiple clouds must satisfy internal consistency, namely that rotational transformations are transitive. The rotation verification is performed using triplets of images, which are cross-referenced with each other to classify rotations individually. Testing is performed on a dataset of a priori known registrations. It is found that when the number of images or the percentage of correct rotations is increased, the number of correct rotation classifications improves. Even when tested with only four images and a correct rotation percentage of 17%, the rotation verification is still considered a viable method for classifying rotations. Spherical harmonic cross-correlation benefits from rotation verification, as it provides an additional approach for checking whether found rotations are correct.

    View record details
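
The transitivity check described in the record above can be sketched as follows: for a triplet of images A, B, C the pairwise rotations should compose consistently, and the angle of the residual rotation flags a failed registration. This threshold-free residual computation is a hedged illustration, not the paper's classifier.

```python
# Hedged sketch: triplet consistency of pairwise rotations, R_AC ~= R_BC @ R_AB.
import numpy as np

def rotation_angle(R: np.ndarray) -> float:
    """Rotation angle (radians) of a 3x3 rotation matrix."""
    return float(np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)))

def triplet_consistency(R_AB: np.ndarray, R_BC: np.ndarray, R_AC: np.ndarray) -> float:
    """Angular residual between the loop A->B->C and the direct rotation A->C."""
    return rotation_angle(R_AC.T @ (R_BC @ R_AB))

if __name__ == "__main__":
    def rot_z(a: float) -> np.ndarray:
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    R_AB, R_BC = rot_z(0.3), rot_z(0.5)
    print("consistent triplet (deg):",
          np.degrees(triplet_consistency(R_AB, R_BC, R_BC @ R_AB)))
    print("failed registration (deg):",
          np.degrees(triplet_consistency(R_AB, R_BC, rot_z(1.2))))
```
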
  • Closed-form inverses for the mixed pixel/multipath interference problem in AMCW lidar

    Godbaz, John Peter; Cree, Michael J.; Dorrington, Adrian A. (2012)

    Conference item
    University of Waikato

    We present two new closed-form methods for mixed pixel/multipath interference separation in AMCW lidar systems. The mixed pixel/multipath interference problem arises from the violation of a standard range-imaging assumption that each pixel integrates over only a single, discrete backscattering source. While a numerical inversion method has previously been proposed, no closed-form inverses have previously been posited. The first new method models reflectivity as a Cauchy distribution over range and uses four measurements at different modulation frequencies to determine the amplitude, phase and reflectivity distribution of up to two component returns within each pixel. The second new method uses attenuation ratios to determine the amplitude and phase of up to two component returns within each pixel. The methods are tested on both simulated and real data and shown to produce a significant improvement in overall error. While this paper focusses on the AMCW mixed pixel/multipath interference problem, the algorithms contained herein have applicability to the reconstruction of a sparse one dimensional signal from an extremely limited number of discrete samples of its Fourier transform.

    View record details
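
Only the forward model behind the inverses described in the record above is sketched here, and in simplified form: a two-return pixel measured at several relative modulation frequencies yields the sum of the component phasors. The closed-form inversion itself is not reproduced.

```python
# Hedged sketch: two-return mixed-pixel forward model for AMCW measurements,
# xi_k = a1*exp(i*k*phi1) + a2*exp(i*k*phi2) at relative modulation frequency k.
import numpy as np

def mixed_pixel(frequencies, a1: float, phi1: float, a2: float, phi2: float) -> np.ndarray:
    """Complex AMCW measurements of a two-return pixel at the given relative frequencies."""
    k = np.asarray(frequencies, dtype=float)
    return a1 * np.exp(1j * k * phi1) + a2 * np.exp(1j * k * phi2)

if __name__ == "__main__":
    freqs = [1, 2, 3, 4]                      # four modulation frequencies, as in the paper
    xi = mixed_pixel(freqs, a1=1.0, phi1=0.4, a2=0.3, phi2=2.1)
    # A single-return decoder applied to the fundamental reports a biased phase:
    print("apparent phase at k=1:", float(np.angle(xi[0])), "vs primary return phase 0.4")
```
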
  • Analysis of the SoftKinetic DepthSense for Range Imaging

    Cree, Michael J.; Streeter, Lee V.; Conroy, Richard M.; Dorrington, Adrian A. (2013)

    Conference item
    University of Waikato

    We analyse the SoftKinetic DepthSense 325 range imaging camera for precision and accuracy in ranging out to 3 m. Flat planar targets (one a grey board, the other made from retroreflective material) are imaged at a variety of distances. Straightforward image processing is used to identify the target and calculate the range and the root mean square variation in ranging to the target. It is found that inaccuracies in ranging of up to 2 cm occur with the grey board when imaging over 0 m to 1.5 m, and the precision in ranging degrades from just below 1 cm at 0 m to almost 10 cm at 1.5 m. Similar inaccuracies occur with the retroreflective target, but the precision is always under 1 cm even out to 3 m due to the strong signal return received from the target.

    View record details
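
A minimal sketch (assumed analysis, not the authors' processing) of how accuracy and precision figures of the kind quoted in the record above could be computed from repeated frames of a flat target region:

```python
# Hedged sketch: ranging accuracy (systematic offset) and precision (temporal RMS)
# over repeated frames of a flat target region of interest.
import numpy as np

def precision_and_accuracy(frames: np.ndarray, true_range_m: float):
    """frames: (n_frames, rows, cols) range values over the target ROI, in metres."""
    per_pixel_mean = frames.mean(axis=0)
    accuracy = float(per_pixel_mean.mean() - true_range_m)   # mean ranging error
    precision = float(frames.std(axis=0).mean())             # mean per-pixel temporal RMS
    return accuracy, precision

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    # Synthetic data: 2 cm systematic bias and 1 cm temporal noise at a 1.5 m target.
    frames = 1.5 + 0.02 + rng.normal(scale=0.01, size=(100, 20, 20))
    acc, prec = precision_and_accuracy(frames, true_range_m=1.5)
    print(f"accuracy error: {acc * 100:.1f} cm, precision: {prec * 100:.1f} cm")
```
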
  • Combination of Mean Shift of Colour Signature and Optical Flow for Tracking During Foreground and Background Occlusion

    Hedayati, M.; Cree, Michael J.; Scott, Jonathan B. (2015)

    Conference item
    University of Waikato

    This paper proposes a multiple hypothesis tracker for multiple object tracking with a moving camera. The proposed model makes use of the stability of sparse optical flow along with the invariance of colour under size and pose variation, by merging the colour properties of objects into optical flow tracking. To evaluate the algorithm, five different videos were selected from broadcast horse races, where each video presents different challenges found in the object tracking literature. A comparison of the proposed method with colour-based mean shift tracking demonstrates a significant improvement in the accuracy and stability of object tracking.

    View record details
  • Vectorised SIMD Implementations of Morphology Algorithms

    Cree, Michael J. (2015)

    Conference item
    University of Waikato

    We explore vectorised implementations, exploiting single instruction multiple data (SIMD) CPU instructions on commonly used architectures, of three efficient algorithms for morphological dilation and erosion. We discuss issues specific to SIMD implementation and describe how they guide algorithm choice. We compare our implementations to a commonly used open-source SIMD-accelerated machine vision library and find orders of magnitude speed-ups can be achieved for erosions using two-dimensional structuring elements.

    View record details
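
As context for the record above, the following is a NumPy sketch of the van Herk / Gil-Werman running-minimum erosion, a classic constant-time-per-pixel morphology algorithm of the kind such work vectorises. Whether it is among the three algorithms the paper implements is an assumption, and no SIMD intrinsics are shown here.

```python
# Hedged sketch: 1-D greyscale erosion via block prefix/suffix running minima
# (van Herk / Gil-Werman scheme), independent of structuring element length.
import numpy as np

def erode_1d(signal: np.ndarray, k: int) -> np.ndarray:
    """Erosion by a flat structuring element of length k anchored at its left end:
    out[i] = min(signal[i : i + k]), with values beyond the end treated as +inf."""
    n = len(signal)
    pad = (-n) % k
    s = np.concatenate([signal.astype(float), np.full(pad, np.inf)])
    blocks = s.reshape(-1, k)
    prefix = np.minimum.accumulate(blocks, axis=1).ravel()                     # left-to-right minima
    suffix = np.minimum.accumulate(blocks[:, ::-1], axis=1)[:, ::-1].ravel()   # right-to-left minima
    out = np.empty(n)
    for i in range(n):
        j = i + k - 1
        out[i] = suffix[i] if j >= len(prefix) else min(suffix[i], prefix[j])
    return out

if __name__ == "__main__":
    x = np.array([3, 1, 4, 1, 5, 9, 2, 6], dtype=float)
    print(erode_1d(x, 3))   # expected: [1. 1. 1. 1. 2. 2. 2. 6.]
```
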
  • Estimating heading direction from monocular video sequences using biologically-based sensor

    Cree, Michael J.; Perrone, John A.; Anthonys, Gehan; Garnett, Aden C.; Gouk, Henry (2016)

    Conference item
    University of Waikato

    The determination of one’s movement through the environment (visual odometry or self-motion estimation) from monocular sources such as video is an important research problem because of its relevance to robotics and autonomous vehicles. The traditional computer vision approach to this problem tracks visual features across frames in order to obtain 2-D image motion estimates from which the camera motion can be derived. We present an alternative scheme which uses the properties of motion sensitive cells in the primate brain to derive the image motion and the camera heading vector. We tested heading estimation using a camera mounted on a linear translation table with the line of sight of the camera set at a range of angles relative to straight ahead (0° to 50° in 10° steps). The camera velocity was also varied (0.2, 0.4, 0.8, 1.2, 1.6 and 2.0 m/s). Our biologically-based method produced accurate heading estimates over a wide range of test angles and camera speeds. Our approach has the advantage of being a one-shot estimator and not requiring iterative search techniques for finding the heading.

    View record details
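
For comparison only, here is a conventional least-squares focus-of-expansion estimate for a purely translating camera; it is explicitly not the biologically-based estimator of the record above, and the flow field is synthetic.

```python
# Hedged sketch: focus-of-expansion (FOE) estimate from a radial flow field.
# Under pure translation each flow vector points away from the FOE, so
# u*(y - ey) - v*(x - ex) = 0, which is linear in (ex, ey).
import numpy as np

def estimate_foe(points: np.ndarray, flow: np.ndarray) -> np.ndarray:
    """Least-squares FOE (ex, ey) from image points (x, y) and flow vectors (u, v)."""
    x, y = points[:, 0], points[:, 1]
    u, v = flow[:, 0], flow[:, 1]
    A = np.column_stack((v, -u))          # v*ex - u*ey = v*x - u*y
    b = v * x - u * y
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    pts = rng.uniform(-1.0, 1.0, size=(200, 2))
    true_foe = np.array([0.2, -0.1])
    flow = (pts - true_foe) * 0.05 + rng.normal(scale=1e-3, size=pts.shape)
    print("estimated FOE:", estimate_foe(pts, flow))
```
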
  • Learning Distance Metrics for Multi-Label Classification

    Gouk, Henry; Pfahringer, Bernhard; Cree, Michael J. (2016)

    Conference item
    University of Waikato

    Distance metric learning is a well studied problem in the field of machine learning, where it is typically used to improve the accuracy of instance based learning techniques. In this paper we propose a distance metric learning algorithm that is specialised for multi-label classification tasks, rather than the multiclass setting considered by most work in this area. The method trains an embedder that can transform instances into a feature space where squared Euclidean distance provides an estimate of the Jaccard distance between the corresponding label vectors. In addition to a linear Mahalanobis style metric, we also present a nonlinear extension that provides a substantial boost in performance. We show that this technique significantly improves upon current approaches for instance based multi-label classification, and also enables interesting data visualisations.

    View record details
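
A minimal sketch of the training target described in the record above, under an assumed formulation: a linear embedder is sought so that squared Euclidean distance in the embedded space matches the Jaccard distance between label vectors. Only the loss is shown; the paper's optimiser and its nonlinear extension are not reproduced.

```python
# Hedged sketch: the objective a linear embedder x -> Lx would minimise so that
# squared Euclidean distances approximate label-vector Jaccard distances.
import numpy as np

def jaccard_distance(y_a: np.ndarray, y_b: np.ndarray) -> float:
    """Jaccard distance between two binary label vectors."""
    union = np.logical_or(y_a, y_b).sum()
    if union == 0:
        return 0.0
    return 1.0 - np.logical_and(y_a, y_b).sum() / union

def embedding_loss(L: np.ndarray, X: np.ndarray, Y: np.ndarray) -> float:
    """Squared error between embedded pairwise distances and label Jaccard distances."""
    Z = X @ L.T
    loss = 0.0
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            d_embed = np.sum((Z[i] - Z[j]) ** 2)
            loss += (d_embed - jaccard_distance(Y[i], Y[j])) ** 2
    return loss

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    X = rng.normal(size=(20, 5))               # feature vectors
    Y = rng.integers(0, 2, size=(20, 4))       # binary multi-label targets
    L = rng.normal(scale=0.1, size=(3, 5))     # linear embedder to be learned
    print("initial loss:", embedding_loss(L, X, Y))
```
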
  • Extracting the MESA SR4000 Calibrations

    Charleston, Sean A.; Dorrington, Adrian A.; Streeter, Lee V.; Cree, Michael J. (2015)

    Conference item
    University of Waikato

    Time-of-flight range imaging cameras are capable of acquiring depth images of a scene. Some algorithms require these cameras to be run in “raw mode”, where any calibrations from the off-the-shelf manufacturers are lost. The calibration of the MESA SR4000 is herein investigated, with an attempt to reconstruct the full calibration. Possession of the factory calibration enables calibrated data to be acquired and manipulated even in “raw mode.” This work is motivated by the problem of motion correction, in which the calibration must be separated into component parts to be applied at different stages in the algorithm. There are also other applications, in which multiple frequencies are required, such as multipath interference correction. The other frequencies can be calibrated in a similar way, using the factory calibration as a base. A novel technique for capturing the calibration data is described; a retro-reflector is used on a moving platform, which acts as a point source at a distance, resulting in planar waves on the sensor. A number of calibrations are retrieved from the camera, and are then modelled and compared to the factory calibration. When comparing the factory calibration to both the “raw mode” data and the calibration described herein, a root mean squared error improvement of 51.3 mm was seen, with a standard deviation improvement of 34.9 mm.

    View record details
  • The Waikato range imager

    Cree, Michael J.; Dorrington, Adrian A.; Conroy, Richard M.; Payne, Andrew D.; Carnegie, Dale A. (2006)

    Conference item
    University of Waikato

    We are developing a high precision simultaneous full-field acquisition range imager. This device measures range with sub-millimetre precision simultaneously over a full-field view of the scene. Laser diodes are used to illuminate the scene with amplitude modulation at frequencies from 10 MHz up to 100 MHz. The received light is interrupted by a high speed shutter operating in a heterodyne configuration, thus producing a low-frequency signal which is sampled with a digital camera. By detecting the phase of the signal at each pixel the range to the scene is determined. We show 3D reconstructions of some viewed objects to demonstrate the capabilities of the ranger.

    View record details
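
The phase-to-range decoding mentioned in the record above can be sketched as follows; the sample count, signal model and modulation frequency are illustrative assumptions rather than the system's actual processing.

```python
# Hedged sketch: recover the modulation phase of the low-frequency beat signal at one
# pixel via the fundamental DFT bin, then map phase to distance, d = c*phi/(4*pi*f_mod).
import numpy as np

C = 299_792_458.0   # speed of light (m/s)

def decode_phase(samples: np.ndarray) -> float:
    """Phase delay (radians) of the beat signal, from N samples over one beat cycle."""
    n = len(samples)
    fundamental = np.sum(samples * np.exp(-2j * np.pi * np.arange(n) / n))
    return float(-np.angle(fundamental))

def phase_to_range(phase: float, f_mod: float) -> float:
    """Convert a modulation phase delay to distance in metres."""
    return C * phase / (4.0 * np.pi * f_mod)

if __name__ == "__main__":
    f_mod = 10e6                                  # 10 MHz modulation, as quoted above
    true_range = 1.5                              # metres
    phi = 4.0 * np.pi * f_mod * true_range / C    # phase delay imposed by the scene
    k = np.arange(4)                              # four samples over one beat cycle
    samples = 1.0 + 0.5 * np.cos(2.0 * np.pi * k / 4 - phi)
    print("decoded range (m):", phase_to_range(decode_phase(samples), f_mod))
```
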
  • Assessment of Self-Management Skills in a Project-Based Learning Paper

    Scott, Jonathan B.; Khoo, Elaine G.L.; Seshadri, Sinduja; Cree, Michael J. (2017)

    Conference item
    University of Waikato

    This initial study revealed the potential of having management-specific assessment and business-related demonstrating staff in undergraduate engineering project-based classes. This will offer students valuable insights in preparing for engineering industries that are increasingly incorporating interdisciplinary expertise and ideas to solve complex issues.

    View record details
  • Full field image ranger hardware

    Payne, Andrew D.; Carnegie, Dale A.; Dorrington, Adrian A.; Cree, Michael J. (2006)

    Conference item
    University of Waikato

    We describe the hardware designed to implement a full field heterodyning imaging system. Comprising three key components - a light source, high speed shutter and a signal generator - the system is expected to be capable of simultaneous range measurements to millimetre precision over the entire field of view. Current-modulated laser diodes provide the required illumination, with a bandwidth of 100 MHz and peak output power exceeding 600 mW. The high speed shutter action is performed by gating the cathode of an image intensifier, driven by a 50 Vpp waveform with 3.5 ns rise and fall times. A direct digital synthesiser, with multiple synchronised channels, provides high stability between its outputs, 160 MHz bandwidth and tuning of 0.1 Hz.

    View record details