5,264 results for 2012

  • Enhancing Educational Outcomes for Alaska Native Students through Networked Inquiry

    Fickel LH (2012)

    Conference Contributions - Other
    University of Canterbury Library

    View record details
  • When to Start, What to Start and Other Treatment Controversies in Pediatric HIV Infection

    Turkova, A; Webb, Rachel; Lyall, H (2012-12)

    Journal article
    The University of Auckland Library

    Over the last decade there have been dramatic changes in the management of pediatric HIV infection. Whilst observational studies and several randomized control trials (RCTs) have addressed some questions about when to start antiretroviral therapy (ART) in children and what antiretrovirals to start, many others remain unanswered. In infants, early initiation of ART greatly reduces mortality and disease progression. Treatment guidelines now recommend ART in all infants younger than 1 or 2 years of age depending on geographical setting. In children >1 year of age, US, European (Paediatric European Network for Treatment of AIDS; PENTA) and WHO guidelines differ and debate is ongoing. Recent data from an RCT in Thailand in children with moderate immune suppression indicate that it is safe to monitor asymptomatic children closely without initiating ART, although earlier treatment was associated with improved growth. Untreated HIV progression in children aged over 5 years is similar to that in adults, and traditionally adult treatment thresholds are applied. Recent adult observational and modeling studies showed a survival advantage and reduction of age-associated complications with early treatment. The current US guidelines have lowered CD4+ cell count thresholds for ART initiation for children aged >5 years to 500 cells/mm3. Co-infections influence the choice of drugs and the timing of starting ART. Drug interactions, overlapping toxicities and adherence problems secondary to increased pill burden are important issues. Rapid changes in the pharmacokinetics of antiretrovirals in the first years of life, limited pharmacokinetic data in children and genetic variation in metabolism of many antiretrovirals make correct dosing difficult. Adherence should always be addressed prior to starting ART or switching regimens. 
The initial ART regimen depends on previous exposure, including perinatal administration for prevention of mother to child transmission (PMTCT), adherence, co-infections, drug availability and licensing. A European cohort study in infants indicated that treatment with four drugs produced superior virologic suppression and immune recovery. Protease inhibitor (PI)-based ART has the advantage of a high barrier to viral resistance. A recent RCT conducted in several African countries showed PI-based ART to be advantageous in children aged <3 years compared with nevirapine-based ART irrespective of previous nevirapine exposure. Another trial in older children from resource rich settings showed both regimens were equally effective. Treatment interruption remains a controversial issue in children, but one study in Europe demonstrated no short-term detrimental effects. ART in children is a rapidly evolving area with many new antiretrovirals being developed and undergoing trials. The aim of ART has shifted from avoiding mortality and morbidity to achieving a normal life expectancy and quality of life, minimizing toxicities and preventing early cancers and age-related illnesses.

    View record details
  • Characterisation of soybean oil droplets formed by the breakup of a laminar capillary liquid jet

    Hoh, ST; Farid, Mohammed; Chen, John (2012)

    Conference item
    The University of Auckland Library

    This work is an effort to model the gas-liquid droplet transesterification reaction to produce biodiesel, specifically to determine the hydrodynamic properties of soybean oil droplets produced by the liquid jet instability method as a model for the droplets produced in the spray reactor of Behzadi and Farid (2009), thus enabling the calculation of mass transfer coefficients. Soybean oil, chosen for this work because it is a typical vegetable oil used for biodiesel manufacture, was heated in a pressure tank and driven through a 0.34 mm diameter orifice using compressed air to produce a capillary liquid jet. The velocity of the emerging liquid jet was calibrated against driving pressures (gauge) between 41.4 and 137.9 kPa. Oil temperatures used were 80, 90 and 100 °C. It was found that the liquid jet travels at speeds of 6.2 to 16.2 m/s depending on the driving pressure, with corresponding Reynolds numbers of 280 to 1670, and breaks up into a stream of droplets between 7.6 and 12.0 cm from the orifice opening. The manner in which the laminar jet breaks up into droplets, namely the breakup regime, droplet diameter and breakup length, was examined and observed through high-speed photography. In addition, the effect of applying a source of external vibration (20 to 40 Hz) to the liquid jet breakup was also examined.

    View record details
  • K-means clustering pre-analysis for fault diagnosis in an aluminium smelting process

    Aini Abd Majid, N; Young, Brent; Taylor, Mark; Chen, John (2012)

    Conference item
    The University of Auckland Library

    Developing a fault detection and diagnosis system for complex processes usually involves large volumes of highly correlated data. In the complex aluminium smelting process, there are difficulties in isolating historical data into different classes of faults for developing a fault diagnostic model. This paper presents a new application of a data mining tool, k-means clustering, to determine precisely how data correspond to different classes of faults in the aluminium smelting process. The results of applying the clustering technique to real data sets show that the boundary of each class of faults can be identified. This means the faulty data can be isolated accurately, enabling the development of a fault diagnostic model that can diagnose faults effectively.
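    The clustering step described in this abstract can be sketched in a few lines. The toy example below is not the authors' code: the "fault signature" features (bath temperature, cell voltage) and all data values are invented for illustration, and a plain k-means is used.

```python
# Minimal k-means sketch: group process measurements so each cluster
# approximates one fault class. Data and feature meanings are invented.
import random

def kmeans(points, k, iterations=50, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialise from the data itself
    for _ in range(iterations):
        # Assign each point to its nearest centroid (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # Recompute each centroid as the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = tuple(sum(x) / len(cl) for x in zip(*cl))
    return centroids, clusters

# Two invented "fault signatures": (bath temperature, cell voltage).
data = [(940.0, 4.1), (942.0, 4.2), (941.0, 4.0),   # fault class A
        (965.0, 4.8), (963.0, 4.9), (966.0, 4.7)]   # fault class B
centroids, clusters = kmeans(data, k=2)
```

With well-separated classes like these, the two cluster boundaries fall out directly, which is the pre-analysis role the paper describes.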

    View record details
  • Process integration in pulp and paper mills for energy and water reduction - A review

    Atkins, Martin John; Walmsley, Michael R.W.; Morrison, Andrew S.; Neale, James R. (2012)

    Journal article
    University of Waikato

    Process integration (including pinch analysis) is a holistic or systems approach to process design and optimisation, which considers the interactions and interdependences between individual unit operations or process elements. Large reductions in both energy and water use in pulp and paper mills have been demonstrated using process integration techniques. A review of the current process integration techniques for energy and water reduction, with a focus on application to the pulp and paper industry, is presented in this paper. The concurrent application of heat integration and water/mass integration analysis is discussed. Particular focus is given to published case studies. The integration of biorefineries into existing mills, and its implications for energy and water use, is also receiving much attention, and this development is likewise reviewed.

    View record details
  • OpenSolver - An open source add-in to solve linear and integer programmes in Excel

    Mason, Andrew (2012)

    Conference item
    The University of Auckland Library

    OpenSolver is an open source Excel add-in that allows spreadsheet users to solve their LP/IP models using the COIN-OR CBC solver. OpenSolver is largely compatible with the built-in Excel Solver, allowing most existing LP and IP models to be solved without change. However, OpenSolver has none of the size limitations found in Solver, and thus can solve larger models. Further, the CBC solver is often faster than the built-in Solver, and OpenSolver provides novel model construction and on-sheet visualisation capabilities. This paper describes OpenSolver's development and features. OpenSolver can be downloaded free of charge at http://www.opensolver.org.

    View record details
  • International comparison of long term care resident dependency across four countries (1998-2009): A descriptive study

    Boyd, Michal; Bowman, C; Broad, Joanna; Connolly, Martin (2012-12)

    Journal article
    The University of Auckland Library

    Aim: To describe an international comparison of dependency of long-term care residents. Methods: All Auckland aged care residents were surveyed in 1998 and 2008 using the 'Long-Term Care in Auckland' instrument. A large provider of residential aged care, Bupa-UK, performed a similar but separate functional survey in 2003, again in 2006 (including UK Residential Nursing Home Association facilities), and in 2009, which included Bupa facilities in Spain, New Zealand and Australia. The survey questionnaires were reconciled and functional impairment rates compared. Results: Of almost 90 000 residents, prevalence of dependent mobility ranged from 27 to 47%; chronic confusion, 46 to 75%; and double incontinence, 29 to 49%. Continence trends over time were mixed, chronic confusion increased, and challenging behaviour decreased. Conclusion: Overall functional dependency for residents is high and comparable internationally. Available trends over time indicate increasing resident dependency, signifying that the care required for this population is considerable and possibly increasing.

    View record details
  • Life and living in advanced age: A cohort study in New Zealand - Te puawaitanga o nga tapuwae kia ora tonu, LILACS NZ: Study protocol

    Hayman, Karen; Kerse, Ngaire; Dyall, Lorna; Kepa, M; Teh, Ruth; Wham, C; Wright-St Clair, V; Wiles, Janine; Keeling, S; Connolly, Martin; Wilkinson, TJ; Moyes, Simon; Broad, Joanna; Jatrana, S (2012)

    Journal article
    The University of Auckland Library

    Background The number of people of advanced age (85 years and older) is increasing and health systems may be challenged by increasing health-related needs. Recent overseas evidence suggests relatively high levels of wellbeing in this group, however little is known about people of advanced age, particularly the indigenous Māori, in Aotearoa, New Zealand. This paper outlines the methods of the study Life and Living in Advanced Age: A Cohort Study in New Zealand. The study aimed to establish predictors of successful advanced ageing and understand the relative importance of health, frailty, cultural, social & economic factors to successful ageing for Māori and non-Māori in New Zealand.

    View record details
  • A Sequential Steady-State Detection Method for Quantitative Discrete-Event Simulation

    Freeth, Adam (2012)

    Doctoral thesis
    University of Canterbury Library

    In quantitative discrete-event simulation, the initial transient phase can cause bias in the estimation of steady-state performance measures. Methods for detecting and truncating this phase make calculating accurate estimates from the truncated sample possible, but no methods proposed in the literature have proved to work universally in the sequential online analysis of output data during simulation. This report proposes a new automated truncation method based on the convergence of the cumulative mean to its steady-state value. The method uses forecasting techniques to determine this convergence, returning a truncation point when the cumulative mean time-series becomes sufficiently horizontal and flat. Values for the method’s parameters are found that adequately truncate initialisation bias for a range of simulation models. The new method is compared with the sequential MSER-5 method, and is shown to detect the onset of steady state more effectively and consistently for almost all simulation models tested. This rule thus appears to be a good candidate as a robust sequential truncation method and for implementation in sequential simulation research packages such as Akaroa2.
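    The core idea in this abstract, truncating output once the cumulative mean flattens, can be illustrated with a toy heuristic. The flatness test below (relative change of the cumulative mean over a sliding window) is an invented stand-in for the forecasting-based test the thesis describes, and the series is synthetic.

```python
# Sketch: find a truncation point where the cumulative mean of simulation
# output has become "sufficiently flat". Window and tolerance are invented.
def truncation_point(data, window=20, tol=0.01):
    running_sum = 0.0
    cum_means = []
    for i, x in enumerate(data, start=1):
        running_sum += x
        cum_means.append(running_sum / i)
    for i in range(window, len(cum_means)):
        old, new = cum_means[i - window], cum_means[i]
        # Flat when the cumulative mean moved by less than tol (relative).
        if abs(new - old) <= tol * max(abs(new), 1e-12):
            return i
    return None  # steady state not yet detected

# Invented example: a decaying transient on top of a steady-state level of 10.
series = [10 + 5 * (0.8 ** n) for n in range(200)]
cut = truncation_point(series)
```

Observations before `cut` would be discarded before estimating the steady-state mean; a sequential analysis package such as Akaroa2 would apply a test like this online as data arrive.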

    View record details
  • Creating and Evaluating Problem Templates for Problem Generation Within the Context of Stroke Cognitive Rehabilitation

    Ogden, Scott (2012)

    Doctoral thesis
    University of Canterbury Library

    Stroke Rehabilitation would be more effective if the patients conducted activities personalised to them, as opposed to a set of generic activities which may be irrelevant. This project has the intention of creating problem templates, which will be able to describe the conditions that must be met for a stroke patient to complete an activity, for a wide range of personalised activities. The project is part of the larger Stroke Rehabilitation System, which the ICTG lab is working on. This customisation of treatment is the motivation for the project. The concept of problem templates is used extensively in this project. Problem templates are “chunks of domain-specific knowledge, compiled mentally by experts, and used to solve commonly occurring problems in a particular domain” [1]. Because of their dynamic nature, it is possible to create very generalised problem templates, which can be applied to create very specific scenarios. This research aims to create many of these templates, with the intent of showing that it is possible to create a varied set of potential problem scenarios, using only a few problem templates for each task. After completing background research, a system for developing these problem templates was developed, and six such problem templates created. The six problem templates were for everyday tasks that stroke patients might perform, such as ‘make hot drink’, ‘make frozen meal’, and ‘make sandwich’. A quick survey rendered a number of specific examples to instantiate the problem templates, such as: ‘make coffee’, ‘make pizza’ and ‘make tuna salad sandwich’. From preliminary examination of these problem scenarios, it would appear that these problem templates can be used to generate problem scenarios, in well-defined situations. For the majority of the specific examples, the resultant dependency graphs of states were the correct solution, or required only minor changes. 
The author suggests that further problem templates be created, and that further studies be conducted, to maximise their effectiveness for the Stroke Rehabilitation Project in the long term.

    View record details
  • Fast Automated Estimation of Variance in Discrete Quantitative Stochastic Simulation

    Shaw, Nelson (2012)

    Doctoral thesis
    University of Canterbury Library

    Quantitative stochastic simulation is an important tool in assessing the performance of complex dynamic systems such as modern communication networks. Because of the proliferation of computers and devices that use and rely on networks such as the internet, assessing the performance of these networks is important to ensure future reliability and service. The current methodology for the analysis of output data from stochastic simulation is focused mainly on the estimation of means. Research on variance estimation focuses mainly on the estimation of the variance of the mean, as this is used to construct confidence intervals for the estimated mean values. To date, there has been little research on the estimation of variance of autocorrelated data, such as those collected during steady-state stochastic simulation. This research investigates different methodologies for estimation of variance of terminating and steady-state simulation. Results from the research are implemented in the simulation tool Akaroa2.
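    One standard technique for estimating the variance of a mean computed from autocorrelated simulation output, though not necessarily the method this thesis investigates, is batch means. A minimal sketch with invented data:

```python
# Batch means: split the output into contiguous batches; the sample variance
# of the batch means estimates the variance of a batch-sized mean, sidestepping
# the autocorrelation between individual observations.
def batch_means_variance(data, batch_size):
    k = len(data) // batch_size
    means = [sum(data[i * batch_size:(i + 1) * batch_size]) / batch_size
             for i in range(k)]
    grand = sum(means) / k
    # sample variance of the batch means (k - 1 degrees of freedom)
    return sum((m - grand) ** 2 for m in means) / (k - 1)

# Invented output stream: two batches with means 2 and 6.
est = batch_means_variance([1, 3, 1, 3, 5, 7, 5, 7], batch_size=4)
```

The batch size must be large enough that batch means are roughly independent; choosing it automatically is part of what makes sequential output analysis hard.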

    View record details
  • Efficient Algorithms for the 2-Center Problems

    Takaoka, Tadao (2012)

    Doctoral thesis
    University of Canterbury Library

    This paper achieves O(n^3 log log n / log n) time for the 2-center problems on a directed graph with non-negative edge costs under the conventional RAM model, where only arithmetic operations, branching operations, and random accessibility with O(log n) bits are allowed. Here n is the number of vertices. This is a slight improvement on the best known complexity of those problems, which is O(n^3). We further show that when the graph has unit edge costs, one of the 2-center problems can be solved in O(n^2.575) time.
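    For context, an O(n^3)-style baseline can be sketched directly: compute all-pairs shortest paths with Floyd-Warshall, then choose the pair of centres minimising the worst-case distance from the nearer centre. This is one natural formulation of the 2-center objective, not the paper's algorithm, and the small directed graph is invented.

```python
# Baseline 2-center sketch: APSP via Floyd-Warshall, then try every centre pair.
from itertools import combinations

INF = float("inf")

def floyd_warshall(n, edges):
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        d[u][v] = min(d[u][v], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

def two_center(n, edges):
    d = floyd_warshall(n, edges)
    best_r, best_c = INF, None
    for c1, c2 in combinations(range(n), 2):
        # radius = worst-case distance from the nearer of the two centres
        radius = max(min(d[c1][v], d[c2][v]) for v in range(n))
        if radius < best_r:
            best_r, best_c = radius, (c1, c2)
    return best_r, best_c

# Invented directed graph: two tight clusters {0,1} and {2,3}, long joining arcs.
edges = [(0, 1, 1), (1, 0, 1), (2, 3, 1), (3, 2, 1),
         (1, 2, 10), (3, 0, 10)]
radius, centers = two_center(4, edges)
```

The pair loop alone is O(n^3) after the APSP step, which is the baseline the paper's O(n^3 log log n / log n) bound slightly improves.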

    View record details
  • Are personal cues more effective than provided cues for remembering a set of tasks?

    Harrison, Tegan (2012)

    Doctoral thesis
    University of Canterbury Library

    A person’s memory is a vital component in everyday life, as it allows people to organise their lives in a systematic way, giving them the ability to plan future events and recall past events as if they were chapters in a book of their life. The importance of a person’s memory seems to be taken for granted, and research on the topic appears limited given its societal importance. Previous research indicates that cues help a person’s prospective and retrospective memory, as they reinforce the intention to execute a task. This research project focuses on a person’s ability to recall events which they are going to perform in the future, with the aid of personal or provided cues. The cues were recorded with the aid of visualization techniques. This study found that participants’ personal cues were more beneficial than provided cues in aiding remembrance of a prospective task.

    View record details
  • Improved Shortest Path Algorithms for Nearly Acyclic Directed Graphs

    Tian, Lin; Takaoka, Tadao (2012)

    Doctoral thesis
    University of Canterbury Library

    This paper presents new algorithms for computing single source shortest paths (SSSPs) in a nearly acyclic directed graph G. The first part introduces higher-order decomposition. This decomposition is an extension of the technique of strongly connected component (sc-component) decomposition. The second part presents a new method for measuring acyclicity based on modifications to two existing methods. In the new method, we decompose the graph into a 1-dominator set, which is a set of acyclic subgraphs where each subgraph is dominated by one trigger vertex. Meanwhile we compute sc-components of a degenerated graph derived from triggers. Using this preprocessing, a new SSSP algorithm has O(m + r log l) time complexity, where r is the size of the 1-dominator set, and l is the size of the largest sc-component. In the third part, we modify the concept of a 1-dominator set to that of a 1-2-dominator set. Each acyclic subgraph obtained by the 1-2-dominator decomposition is dominated by one or two trigger vertices cooperatively. Such subgraphs are potentially larger than those decomposed by the 1-dominator set. Thus fewer trigger vertices are needed to cover the graph.
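    These decompositions pay off because shortest paths inside an acyclic subgraph can be settled in linear time by relaxing edges in topological order, with no priority queue. A minimal single-source sketch on an invented DAG (not the paper's decomposition machinery):

```python
# SSSP on a DAG: topologically sort (Kahn's algorithm), then relax each edge
# exactly once in topological order. Runs in O(n + m).
from collections import deque

def dag_sssp(n, edges, source):
    adj = [[] for _ in range(n)]
    indeg = [0] * n
    for u, v, w in edges:
        adj[u].append((v, w))
        indeg[v] += 1
    # Kahn's topological sort
    order = []
    q = deque(i for i in range(n) if indeg[i] == 0)
    while q:
        u = q.popleft()
        order.append(u)
        for v, _ in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0
    for u in order:                      # relax edges in topological order
        if dist[u] == INF:
            continue
        for v, w in adj[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist

# Invented DAG: 0 -> 1 -> 2 -> 3, with a heavier shortcut 0 -> 2.
edges = [(0, 1, 2), (0, 2, 5), (1, 2, 1), (2, 3, 2)]
dist = dag_sssp(4, edges, 0)
```

Dijkstra-style queueing is only needed for the residual cyclic parts, which is why fewer, smaller sc-components (fewer trigger vertices) mean less work overall.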

    View record details
  • Improving Face Recognition with Genealogical and Contextual Data

    Rasmus, Ellie (2012)

    Doctoral thesis
    University of Canterbury Library

    Face recognition has long been an area of great interest within computer science, and as face recognition implementations become more sophisticated, the scope of real-world applications has widened. The field of genealogy has embraced the move towards digitisation, with increasingly large quantities of historical photographs being digitised in an effort to both preserve and share them with a wider audience. Genealogy software is prevalent, but while many programs support photograph management, only one uses face recognition to assist in the identification and tagging of individuals. Genealogy is in the unique position of possessing a rich source of context, in the form of a family tree, from which a face recognition engine can draw information. We aim to improve the accuracy of face recognition results within a family photograph album through the use of a filter that uses available information from a given family tree. We also use measures of co-occurrence, recurrence and relative physical distance of individuals within photos to accurately predict their identities. This proposed use of genealogical and contextual data has shown a 26% improvement in accuracy over the most advanced face recognition technology currently available when identifying 348 faces against a database of 523 faces. These faces are extracted from a challenging dataset of 173 family photographs, dating back as far as 1908.

    View record details
  • Low-level Image Segmentation for a Vine Imaging Robot

    Flowers, Simon (2012)

    Doctoral thesis
    University of Canterbury Library

    Image segmentation is an important preprocessing step in most computer vision based applications, as it can significantly reduce future computation in tasks such as object classification. By grouping pixels that are similar with regard to a measure such as colour or position, classification can be performed on a per-segment basis, rather than per-pixel. This research examines several segmentation techniques and evaluates their performance at segmenting the network structure of vine images. Methods described in the literature are selected for comparison based on their performance at segmenting similar structures. The methods examined are k-means clustering, mean-shift clustering, normalised cuts segmentation, quadtree segmentation and watershed segmentation. We evaluate each method against five distinct images, based on their accuracy and efficiency at separating scene components such as vines, posts, wires and background. Evaluation is performed using a boundary-based comparison method to compare segmented images against hand generated ground truths. The clustering methods k-means and mean-shift are found to have the best performance. We propose mean-shift as the most suitable algorithm, due to its ability to produce a dynamic number of segments. We provide reasoning behind the relative successes and shortcomings of each method.
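    The property that favours mean-shift here, that the number of segments emerges from the data rather than being fixed in advance, can be shown with a toy 1-D version that shifts each grey level to the mean of its neighbours until it settles on a mode. The pixel values and bandwidth below are invented for illustration.

```python
# Toy 1-D mean-shift: each sample climbs to the mean of its neighbours within
# a bandwidth; samples that settle on the same mode share a segment label.
def mean_shift_1d(samples, bandwidth=10.0, iterations=30):
    modes = []
    for x in samples:
        for _ in range(iterations):
            neighbours = [s for s in samples if abs(s - x) <= bandwidth]
            x = sum(neighbours) / len(neighbours)
        modes.append(round(x, 1))
    return modes

# Invented grey levels: dark "vine" pixels vs. bright "background" pixels.
pixels = [30, 32, 35, 28, 200, 205, 198, 202]
modes = mean_shift_1d(pixels)
labels = sorted(set(modes))  # the segment count emerges from the data
```

Two modes emerge here without specifying k, whereas k-means on the same data would need the segment count supplied up front.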

    View record details
  • Improving File Navigation with Spatially Consistent Revisitation Visualisation

    Leung, Joshua (2012)

    Doctoral thesis
    University of Canterbury Library

    People are storing an increasing amount of their data digitally in the form of files. However, many of the current navigation-based interfaces are unable to support efficient retrieval of this information. We develop a file system crawler, and use this to conduct a user study to characterise the structural features and temporal usage patterns of file systems. Based on these findings and the extensive prior literature, we develop a spatially consistent representation of entire file systems which is augmented with colour-coded tags allowing efficient access to temporally relevant target folders within the file system (SCOFT). We conduct a user study to evaluate the effectiveness of this technique and the supporting techniques developed. Our findings show that SCOFT allows users to revisit files three times faster than when using a standard file browser.

    View record details
  • Improving Users' Command Selection Performance

    Harrison, Joel (2012)

    Doctoral thesis
    University of Canterbury Library

    Hotkeys have been shown to improve user command selection performance by providing a flat selection hierarchy and fast activation through the keyboard. However, hotkeys are often not learned by users even though command selection performance improvements are possible. Part of the hotkey learning problem is the current hotkey feedback methods, which require the user to acquire the command with the mouse before the hotkey feedback is displayed. In this paper we present ExposeHK, a new interface technique which allows the user to browse, perform and learn hotkeys from within the hotkey modality. ExposeHK encourages users to learn hotkeys through practice, thereby providing a seamless transition from novice to expert performance. Our evaluations demonstrate that with ExposeHK users are able to achieve a significantly higher level of command selection performance. Furthermore, our final evaluation demonstrates that when ExposeHK is available in an application, users’ hotkey usage is extremely high.

    View record details
  • Towards Concurrent Hoare Logic

    Candy, Robin (2012)

    Doctoral thesis
    University of Canterbury Library

    How can we rigorously prove that an algorithm does what we think it does? Logically verifying programs is very important to industry. Floyd-Hoare Logic (or Hoare Logic for short) is a set of rules that describe a type of valid reasoning for sequential program verification. Many different attempts have been made to extend Hoare Logic for concurrent program verification. We combine ideas from a few of these extensions to formalise a verification framework for specific classes of parallel programs. A new proof rule to deal with the semantics of mesh algorithms is proposed within the verification framework. We use the framework and mesh proof rule to verify the correctness of Sung Bae’s parallel algorithm for the maximum subarray problem.

    View record details