58 results for Conference paper, 1990

  • Clients’ motivations, perceptions, expectations and satisfaction levels: The New Zealand mountain guiding industry

    Carr, Anna M (1997-11)

    Conference paper
    University of Otago

    Mountain guiding has been offered as an activity for tourists to New Zealand for over a century. In the late nineteenth century, European guides accompanying clients introduced techniques to New Zealanders working at the first Hermitage Hotel at Mt Cook, who then chose mountain guiding as their profession. Guides today continue a tradition based on experience, skills and knowledge that enables them to operate as successfully as the mountains will allow. The New Zealand Mountain Guides Association (NZMGA) has a qualification framework, certification and safety standards that are internationally recognised by the International Union of Mountain Guides (UIAGM). Companies offer year-round activities such as heli-skiing, avalanche courses, glacier walks, trekking, mountaineering and rock climbing courses, ice climbing and high guiding. The latter ranges from high-altitude tramping, e.g. the Copland Pass, to ascents of major peaks in New Zealand or overseas in Europe, Nepal, South America, Alaska and Antarctica. Issues faced by the NZMGA include competition from overseas companies, concession procedures, maintaining traditional markets and seeking new ones, access to Mt Cook/Aoraki under Treaty claims, increased aircraft noise affecting product quality, and potential conflict with other user groups. Over the 1997/98 summer climbing season the writer will conduct research focusing on the clients of NZMGA guides.

  • Guided mountaineering clients in New Zealand’s Southern Alps

    Carr, Anna M (1997-12-02)

    Conference paper
    University of Otago

    Carr, A.M. (1997). "Guided mountaineering in New Zealand's Southern Alps", in J. Higham and G.W. Kearsley (eds.), Proceedings of Trails, Tourism and Regional Development, Centre for Tourism and IGU, University of Otago at Cromwell, New Zealand, 2-5 December 1997, pp. 23-32.

  • Taking it slowly with managed care - Invited address in a workshop on Managed Care for Mental Health: International Experiences and the New Zealand Direction, Schizophrenia Fellowship National Conference, Christchurch, September 5-7, 1997

    Bridgman, Geoff (1998)

    Conference paper
    Unitec

    Managed care arose from a need to contain the escalating health costs of the insurance- and litigation-based US health system, which were rising at rates of more than 10% a year through the early 1990s. It is described as the application of market forces to health. It is an insurance-based system in which health management organisations (HMOs) provide cover for illness through a range of preferred providers, who discount their services partly on the basis of restricting the options for care relating to particular illness groups. The heart of the managed care system is the utilization review, in which the cost-effectiveness of the options for care is analysed and the more wasteful options are eliminated. Utilization review studies have found that as much as a quarter to a third of all medical services performed are of little or no benefit to patients. Utilization reviews have also shifted the emphasis of care towards preventative approaches. While managed care initially resulted in increases to the cost of health care, it began to be very effective in 1994 (only a 6.5% increase in national costs), with managed care group health care costs falling by 1.1% and remaining flat in 1995. A recent newspaper report describes the "inexplicable" buoyancy of the US economy, with one commentator saying that the reduction in health care insurance costs was a major contributor. A majority of US citizens have their health insurance paid by their employer, and about half the US population (135 million people) is enrolled in a managed care system. The US government expects to save $250 billion through the implementation of managed care.

  • ION architecture for robot learning

    Qualtrough, Paul (1995)

    Conference paper
    The University of Auckland Library

    It is claimed that one of the main reasons why the development of intelligent robots has been slower than expected is that machine learning has been seen as an "add-on" feature, one to be placed in the higher and later-developed levels of robot architectures. A case is made for incorporating machine learning at the earliest possible stage, and for relying on it as the primary method of developing robot controllers. An architecture is proposed to support this approach.

  • Adaptive fuzzy control

    Li, Han-Xiong (1996)

    Conference paper
    The University of Auckland Library

    In this paper, an adaptive fuzzy logic controller (FLC) is designed using a simple reference model. The design approach is based on our new methodology of "design the rule base qualitatively and the data base quantitatively". If a linear rule base is used, a model of the FLC can be obtained: it is a nonlinear function in which only three scaling gains need to be designed and tuned, so conventional control theory can be applied. This model reference adaptive fuzzy control (MRAFC) requires fewer restrictions on the reference model, but often achieves more robust performance than its classical counterpart.
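
    The abstract gives no equations, but the structure it points to is the classic one in which a fuzzy PD-type controller built from a linear rule base collapses to a saturated nonlinear map governed by two input scaling gains and one output gain, so conventional tuning arguments apply to just those three numbers. A minimal Python sketch of that reading follows; the gain names (KE, KCE, KU), the saturation shape and the toy plant are illustrative assumptions, not taken from the paper.

      # Illustrative PD-type fuzzy controller reduced to three scaling gains.
      def saturate(x, limit=1.0):
          """Clip a normalised signal to the universe of discourse [-limit, limit]."""
          return max(-limit, min(limit, x))

      def fuzzy_pd_control(error, d_error, KE=1.0, KCE=0.1, KU=10.0):
          """With a linear rule base the FLC collapses to a saturated function of
          the scaled error and error change; only KE, KCE and KU remain to tune."""
          e_n = saturate(KE * error)        # scaled, saturated error
          de_n = saturate(KCE * d_error)    # scaled, saturated error change
          return KU * saturate(0.5 * (e_n + de_n))

      # Toy closed loop: drive the first-order plant x' = -x + u towards 1.0.
      x, setpoint, dt, prev_e = 0.0, 1.0, 0.01, 1.0
      for _ in range(1000):
          e = setpoint - x
          u = fuzzy_pd_control(e, (e - prev_e) / dt)
          x += dt * (-x + u)
          prev_e = e
      print(f"final state: {x:.3f}")  # ~0.83: PD action alone leaves an offset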

  • Efficient word-graph parsing and search with a stochastic context-free grammar

    Waters, C.J.; MacDonald, B.A. (1997)

    Conference paper
    The University of Auckland Library

  • Recovery effect in cellular radio systems

    Carter, L.J.; Maclean, T.S.M. (1990)

    Conference paper
    The University of Auckland Library

    A novel expression for the attenuation of a radio wave propagating over a mixed land-sea path successfully predicts the recovery of field strength over the sea path. An initial series of measurements has been made in the Auckland area to determine whether the recovery effect is a significant factor at cellular radio frequencies. The results presented are limited by the fact that they were taken in a real environment rather than under controlled laboratory conditions, so it is difficult to eliminate unwanted variables, particularly the effects of clutter. Nevertheless, the results consistently show that signal enhancement occurs over a sea-water path at cellular radio frequencies.

  • Implications of propagation modeling on the design of a DS-CDMA in-building mobile communication system

    Butterworth, K.S.; Sowerby, K.W.; Williamson, A.G. (1997)

    Conference paper
    The University of Auckland Library

    This paper investigates the implications of propagation modeling for the design of a DS-CDMA in-building mobile communication system. Two modeling approaches are considered: a floor-averaged propagation model, and a localised-area model that considers individual propagation paths for a range of potential mobile user locations. Results (measured at 1.8 GHz) show that overall system performance estimates are heavily dependent on the model used to describe the building's propagation characteristics, and suggest that the floor-averaged approach leads to a rather pessimistic prediction of system performance compared with the localised-area approach. An unnecessarily conservative design would therefore be likely if the floor-averaged model were used as part of a system planning process.
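
    For readers unfamiliar with the modelling trade-off, the sketch below shows the kind of in-building path-loss calculation involved: a textbook log-distance model with a floor attenuation factor, evaluated either per location (the localised-area view) or averaged over locations (the floor-averaged view). The exponent, floor attenuation and distances are illustrative textbook-style numbers, not the paper's measured 1.8 GHz values.

      # Hedged sketch: log-distance path loss with a floor attenuation factor.
      import math

      def path_loss_db(d_m, floors, n=3.0, pl_d0=38.0, faf_per_floor=13.0):
          """PL(d) = PL(1 m) + 10 n log10(d) + floors * FAF   [dB]"""
          return pl_d0 + 10 * n * math.log10(max(d_m, 1.0)) + floors * faf_per_floor

      # A floor-averaged model assigns every user the same (mean) loss; a
      # localised-area model evaluates each candidate user location separately.
      locations = [(5, 0), (12, 0), (25, 1), (40, 2)]   # (distance m, floors crossed)
      losses = [path_loss_db(d, f) for d, f in locations]
      print("per-location losses:", [f"{pl:.1f} dB" for pl in losses])
      print(f"floor-averaged loss: {sum(losses) / len(losses):.1f} dB")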

  • Feasibility of spectrum sharing between DS-CDMA mobile radio systems and microwave point-to-point links

    Marshall, P.J.; Sowerby, K.W.; Shafi, M. (1996)

    Conference paper
    The University of Auckland Library

    Radio spectrum allocated to many second- and third-generation mobile radio systems in the 1-3 GHz frequency bands (e.g. USA PCS, DCS1800 and FPLMTS) is currently used in many countries for fixed point-to-point microwave links. General techniques are presented for investigating the feasibility of spectrum sharing between an indoor DS-CDMA mobile radio system with vertical frequency reuse and a fixed point-to-point microwave link. Using a range of system parameters, the limitations of spectrum sharing are estimated. The results indicate that, for the systems considered, spectrum sharing will be difficult to implement without sufficient geographical isolation between the two systems. Feasibility depends largely on the propagation characteristics between the two systems and on the mutual and self-interference received in the fixed and mobile systems; general techniques for characterising this interference are outlined.
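
    As a flavour of the feasibility calculation, the sketch below runs a crude link-budget check of how much geographical isolation makes mobile-system interference negligible at a fixed microwave receiver, using free-space path loss only. The powers, threshold and distances are invented for illustration; the paper's techniques also account for realistic propagation and for self-interference.

      # Hedged sketch: geographical isolation needed under free-space loss.
      import math

      def free_space_loss_db(d_km, f_mhz):
          """Free-space path loss: 32.45 + 20 log10(d_km) + 20 log10(f_mhz) [dB]."""
          return 32.45 + 20 * math.log10(d_km) + 20 * math.log10(f_mhz)

      tx_dbm = 30.0            # assumed aggregate mobile-system EIRP toward the link
      threshold_dbm = -110.0   # assumed tolerable interference at the receiver
      f_mhz = 2000.0           # shared band near 2 GHz

      for d_km in (1, 10, 100, 300):
          rx = tx_dbm - free_space_loss_db(d_km, f_mhz)
          print(f"{d_km:4d} km: {rx:7.1f} dBm", "OK" if rx <= threshold_dbm else "too close")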

  • Self synchronising T-codes to replace Huffman codes

    Higgie, Gavin R. (1993)

    Conference paper
    The University of Auckland Library

    This paper describes recent work on the T-Codes, a new class of variable-length codes with superlative self-synchronizing properties. The T-Code construction algorithm is outlined, and it is shown that in situations where codeword synchronization is important, the T-Codes can be used instead of Huffman codes, giving excellent self-synchronizing properties without sacrificing coding efficiency.
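
    The construction step the abstract refers to can be sketched as simple T-augmentation: remove one codeword p from the current set and prepend p to every codeword, starting from the alphabet itself. A minimal sketch, with an arbitrary illustrative choice of T-prefixes:

      # Hedged sketch of simple T-augmentation; the prefixes chosen are arbitrary.
      def t_augment(code, p):
          """Simple T-augmentation: C' = (C minus {p}) union {p + s for s in C}."""
          assert p in code
          return (code - {p}) | {p + s for s in code}

      code = {"0", "1"}             # start from the binary alphabet
      for prefix in ("1", "10"):    # illustrative T-prefixes
          code = t_augment(code, prefix)
      print(sorted(code))           # ['0', '100', '1010', '1011', '11']

    Each augmented set stays complete and prefix-free, and a decoder that loses codeword alignment tends to resynchronise after a short span, which is what lets T-Codes stand in for Huffman codes.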

  • Public-sector vs. private-sector R&D in India: a comparative analysis of two R&D teams

    Sankaran, Jayaram K.; Suchitra, Mouly V. (1996)

    Conference paper
    The University of Auckland Library

    The subject of this paper is a comparative analysis of two Indian R&D teams with similar objectives and activities. The team we studied first (team A) was located in a public-sector electrical power research institute. The second team (team B) was the R&D unit of a private-sector company which manufactures and sells electrical equipment such as motors, generators and transformers. Using qualitative methodology, we developed a process model of the ineffectiveness of team A. This model served as an interpretive framework with which to study team B and compare it with team A.

  • IMECO: A reconfigurable FPGA-based image enhancement co-processor framework

    Salcic, Z.; Sivaswamy, J. (1997)

    Conference paper
    The University of Auckland Library

    This paper presents a way to improve the computational speed of image contrast enhancement using low-cost FPGA-based hardware, primarily targeted at X-ray images. The enhancement method considered consists of filtering with a high-boost filter (HBF), followed by histogram modification using histogram equalisation (HE). An image enhancement co-processor (IMECO) concept is proposed that enables efficient hardware implementation of enhancement procedures, and hardware/software co-design to achieve high-performance, low-cost solutions. The co-processor runs on an FPGA prototyping ISA-bus board. It consists of two hardware functional units that implement HBF and HE; these can be downloaded onto the board sequentially or reside on the board at the same time, and they represent an embryo of virtual hardware units forming a library of image enhancement algorithms. In trials with chest X-ray images, the performance improvement over software-only implementations was more than two orders of magnitude, providing real-time or near real-time enhancement.
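
    The two functional units named in the abstract are standard image operations; a software sketch of the pipeline (a 3x3 high-boost filter followed by histogram equalisation) is given below for reference. The kernel size, boost factor and 8-bit random stand-in image are illustrative assumptions.

      # Hedged sketch of the HBF + HE pipeline on an 8-bit image.
      import numpy as np

      def high_boost(img, A=1.2):
          """HBF: amplify the original and subtract a 3x3 local mean."""
          padded = np.pad(img.astype(float), 1, mode="edge")
          h, w = img.shape
          # 3x3 mean via shifted sums (what a small FPGA window operator computes)
          mean = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
          return np.clip(A * img - mean, 0, 255).astype(np.uint8)

      def hist_equalise(img):
          """HE: map grey levels through the normalised cumulative histogram."""
          hist = np.bincount(img.ravel(), minlength=256)
          cdf = hist.cumsum() / img.size
          return (cdf[img] * 255).astype(np.uint8)

      rng = np.random.default_rng(0)
      xray = rng.integers(90, 140, size=(64, 64), dtype=np.uint8)  # low-contrast stand-in
      enhanced = hist_equalise(high_boost(xray))
      print(xray.min(), xray.max(), "->", enhanced.min(), enhanced.max())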

  • Multicast primitive for mobile hosts

    Ye, Xinfeng; Keane, John A. (1996)

    Conference paper
    The University of Auckland Library

    Due to network latency and the mobility of hosts, many existing group communication protocols are limited to a static environment. This paper presents a multicast primitive for delivering multicast messages to mobile hosts. The primitive has the total ordering property, which guarantees the ordering of message delivery. The protocol also guarantees that messages are delivered to the mobile hosts exactly once. Sequence numbers and message buffers are used to cope with message duplication and message loss.
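
    The abstract names the receiver-side machinery without detail; the sketch below shows one conventional arrangement of it: a global sequence number for total order, plus a hold-back buffer and delivery counter for exactly-once, in-order delivery. The data structures are illustrative, not the paper's protocol.

      # Hedged sketch: dedup + in-order delivery from sequence numbers and buffers.
      class MobileReceiver:
          def __init__(self):
              self.next_seq = 0    # sequence number expected next
              self.buffer = {}     # hold-back buffer: seq -> message
              self.delivered = []  # messages handed to the application, in order

          def receive(self, seq, msg):
              if seq < self.next_seq or seq in self.buffer:
                  return  # duplicate (e.g. re-sent after a handoff): drop it
              self.buffer[seq] = msg
              while self.next_seq in self.buffer:  # deliver any contiguous run
                  self.delivered.append(self.buffer.pop(self.next_seq))
                  self.next_seq += 1

      r = MobileReceiver()
      for seq, msg in [(1, "b"), (0, "a"), (1, "b"), (3, "d"), (2, "c")]:
          r.receive(seq, msg)
      print(r.delivered)  # ['a', 'b', 'c', 'd'] - total order, each exactly once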

  • Fuzzy exposure model

    Rajkumar, T.; Guesgen, Hans W. (1996)

    Conference paper
    The University of Auckland Library

    This paper presents a fuzzy exposure model which deals with the uncertainties involved in analysing prolonged (chronic) chemical exposure of humans in risk assessment. The imprecise input information for the exposure model is expressed as fuzzy sets using linguistic variables such as high, low and constant. The risk assessor can extend these fuzzy sets according to data availability. The result of the calculations is a fuzzy number that indicates the life average daily exposure (LADE) of human beings. A case study is presented to illustrate the methodology.
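
    As a sketch of the fuzzy arithmetic involved, the code below pushes triangular fuzzy inputs through a crisp exposure formula by interval arithmetic on alpha-cuts. The formula LADE = (C x IR x ED) / (BW x AT), the membership functions and all numbers are assumptions for illustration; the paper's fuzzy sets come from its case study.

      # Hedged sketch: fuzzy exposure arithmetic on alpha-cuts. All quantities
      # are positive, so interval endpoints combine monotonically.
      def alpha_cut(tri, alpha):
          """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
          a, m, b = tri
          return (a + alpha * (m - a), b - alpha * (b - m))

      def fuzzy_lade(C, IR, ED, BW, AT, levels=(0.0, 0.5, 1.0)):
          """Propagate fuzzy inputs through LADE = (C*IR*ED)/(BW*AT) per cut."""
          out = {}
          for alpha in levels:
              (c0, c1), (i0, i1) = alpha_cut(C, alpha), alpha_cut(IR, alpha)
              (e0, e1), (b0, b1) = alpha_cut(ED, alpha), alpha_cut(BW, alpha)
              t0, t1 = alpha_cut(AT, alpha)
              out[alpha] = ((c0 * i0 * e0) / (b1 * t1),  # smallest quotient
                            (c1 * i1 * e1) / (b0 * t0))  # largest quotient
          return out

      # Illustrative triangular numbers for a "low, roughly constant" exposure
      lade = fuzzy_lade(C=(0.01, 0.02, 0.04),  # concentration, mg/m^3
                        IR=(12, 15, 18),       # inhalation rate, m^3/day
                        ED=(20, 30, 40),       # exposure duration, years
                        BW=(55, 70, 85),       # body weight, kg
                        AT=(65, 70, 75))       # averaging time, years
      for alpha, (lo, hi) in lade.items():
          print(f"alpha={alpha:.1f}: [{lo:.2e}, {hi:.2e}] mg/(kg*day)")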

  • Detecting termination in static and dynamic systems

    Ye, Xinfeng; Keane, John A. (1996)

    Conference paper
    The University of Auckland Library

    Distributed termination detection concerns detecting the termination of a distributed computation spread across a set of processors. Most solutions to the problem are not intended for dynamic systems, where processes can be created and destroyed during the computation. In this paper, a termination detection algorithm that can be applied to both static and dynamic systems is proposed. The scheme can be applied to any connection topology, and the number of control messages is lower than in some previous approaches.
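
    The paper's own algorithm is not reproduced in the abstract; for flavour, the sketch below shows the classic credit-recovery ("weight-throwing") idea, one well-known way to detect termination when processes are created dynamically: every process holds a share of an initial unit of credit, spawning splits the share, and termination is announced only when the controller has recovered the full unit. This is not claimed to be the paper's scheme.

      # Hedged sketch: credit-recovery termination detection with dynamic spawning.
      from fractions import Fraction

      class Controller:
          def __init__(self):
              self.credit = Fraction(0)  # credit returned so far

          def returned(self, c):
              self.credit += c
              return self.credit == 1    # all credit back => computation terminated

      class Process:
          def __init__(self, credit):
              self.credit = credit

          def spawn(self):               # dynamic creation: split the credit
              half = self.credit / 2
              self.credit -= half
              return Process(half)

          def finish(self, controller):  # becoming idle: return credit upstream
              c, self.credit = self.credit, Fraction(0)
              return controller.returned(c)

      ctl = Controller()
      root = Process(Fraction(1))        # computation starts with credit 1
      child = root.spawn()
      grandchild = child.spawn()
      print(root.finish(ctl), child.finish(ctl), grandchild.finish(ctl))
      # -> False False True: announced only once every process, including the
      #    dynamically spawned ones, has handed its credit back.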

  • Analysis of chemical exposure through inhalation using hybrid neural network

    Rajkumar, T.; Guesgen, Hans W. (1997)

    Conference paper
    The University of Auckland Library

    In this analysis, human health risk through inhalation due to exposure to benzene from vehicular emissions in New Zealand is assessed as an example application of a hybrid neural network. The exposure factors affecting inhalation are the inhaled contaminant and the age, body weight, health status and activity patterns of the people exposed. Four major variables affect the inhaled contaminant: gas emissions from motor vehicles on the road, wind speed, temperature and atmospheric stability. Uncertainty applies to all the variables involved in exposure analysis, and neural networks and fuzzy theory are combined to handle it. The architecture of the hybrid neural network used to estimate exposure to carcinogens through inhalation is explained in detail in this paper.

  • Iterative blind deconvolution of extended objects

    Biggs, David S.C.; Andrews, Mark (1997)

    Conference paper
    The University of Auckland Library

    This paper describes a technique for the blind deconvolution of extended objects such as Hubble Space Telescope (HST), scanning electron microscope and 3D fluorescence microscope images. The blind deconvolution mechanism is based on the Richardson-Lucy (1972, 1974) algorithm and alternates between deconvolution of the image and of the point spread function (PSF). This form of iterative blind deconvolution differs from that typically employed in that multiple PSF iterations are performed after each image iteration. The initial estimate for the PSF is the autocorrelation of the blurred image, and the edges of the image are windowed to minimise wrap-around artifacts. Acceleration techniques are employed to speed restoration, and results from real HST, electron microscope and 3D fluorescence images are presented.
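
    The alternation the abstract describes can be sketched directly from the Richardson-Lucy multiplicative update. In the code below, circular FFT convolution, the flat initial image and the iteration counts are simplifying assumptions; the windowing and acceleration techniques mentioned in the abstract are omitted.

      # Hedged sketch: alternating Richardson-Lucy blind deconvolution.
      import numpy as np

      def conv(a, b):
          """Circular 2-D convolution via FFT."""
          return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

      def corr(a, b):
          """Circular 2-D correlation via FFT (conjugate = flipped kernel)."""
          return np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))

      def rl_step(estimate, other, observed):
          """One RL multiplicative update of `estimate`, holding `other` fixed."""
          ratio = observed / np.maximum(conv(estimate, other), 1e-12)
          return estimate * corr(ratio, other)

      def blind_rl(observed, n_outer=20, psf_inner=5):
          psf = np.maximum(corr(observed, observed), 0)     # autocorrelation init
          psf /= psf.sum()
          image = np.full_like(observed, observed.mean())   # flat first estimate
          for _ in range(n_outer):
              for _ in range(psf_inner):                    # several PSF updates...
                  psf = rl_step(psf, image, observed)
                  psf /= psf.sum()                          # keep the PSF normalised
              image = rl_step(image, psf, observed)         # ...then one image update
          return image, psf

      truth = np.zeros((32, 32)); truth[8:12, 20:23] = 1.0  # toy "extended object"
      kernel = np.zeros((32, 32)); kernel[:3, :3] = 1 / 9.0 # unknown 3x3 blur
      restored, est_psf = blind_rl(np.maximum(conv(truth, kernel), 0))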

  • Breaking abstractions and unstructuring data structures

    Collberg, Christian; Thomborson, Clark; Low, Douglas (1998)

    Conference paper
    The University of Auckland Library

    To ensure platform independence, mobile programs are distributed in forms that are isomorphic to the original source code. Such codes are easy to decompile, and hence they increase the risk of malicious reverse engineering attacks. Code obfuscation is one of several techniques which have been proposed to alleviate this situation. An obfuscator is a tool which, through the application of code transformations, converts a program into an equivalent one that is more difficult to reverse engineer. In a previous paper (Collberg et al., 1998) we described the design of a control flow obfuscator for Java. In this paper we extend the design with transformations that obfuscate data structures and abstractions. In particular, we show how to obfuscate classes, arrays, procedural abstractions and built-in data types like strings, integers and booleans.
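
    As a taste of the data transformations described, the sketch below splits a boolean into two integers whose relationship, rather than either value alone, carries the truth value. The encoding table is an invented example in the spirit of variable splitting, not code from the paper.

      # Hedged sketch: splitting a boolean across two integer variables.
      import random

      # value = TABLE[p][q]: True iff p != q, so four states encode two booleans
      TABLE = [[False, True], [True, False]]

      def split(value):
          """Encode a boolean as a random (p, q) pair that decodes to it."""
          p = random.randint(0, 1)
          q = (p ^ 1) if value else p  # chosen so that TABLE[p][q] == value
          return p, q

      def merge(p, q):
          return TABLE[p][q]

      p, q = split(True)
      print(p, q, "->", merge(p, q))   # several different encodings map to True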

  • Towards more practical reinforcement learning

    Qualtrough, Paul (1997)

    Conference paper
    The University of Auckland Library

    The fields of machine learning, mobile robotics and machine vision have grown steadily closer in recent years, to the extent that learning has been suggested as the best means of producing sophisticated controllers for mobile robots. Such an approach may have merit, but only if the structures and mechanisms provided for learning are tuned to the special needs of robots. These needs are outlined, and reinforcement learning is promoted as the best starting point for fulfilling them. To make good on the promise of learning at the level required of mobile robots, significant enhancements to current formulations of reinforcement learning are required. The issues involved in making these improvements are discussed, and a simple enhanced model of reinforcement learning is suggested as a first step in this direction.
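
    The baseline the paper proposes to enhance is the standard tabular reinforcement-learning formulation; a minimal Q-learning example on a toy corridor task is sketched below. The task, parameters and epsilon-greedy policy are illustrative, and the paper's argument is precisely that this kind of formulation needs robot-specific extension.

      # Hedged sketch: tabular Q-learning on a 5-state corridor.
      import random

      N_STATES, GOAL = 5, 4                      # corridor: states 0..4, goal at 4
      Q = [[0.0, 0.0] for _ in range(N_STATES)]  # actions: 0 = left, 1 = right

      def step(s, a):
          s2 = max(0, min(N_STATES - 1, s + (1 if a == 1 else -1)))
          return s2, (1.0 if s2 == GOAL else 0.0)

      alpha, gamma, eps = 0.5, 0.9, 0.3
      for _ in range(500):
          s = 0
          while s != GOAL:
              a = random.randrange(2) if random.random() < eps else Q[s].index(max(Q[s]))
              s2, r = step(s, a)
              # Q-learning update: bootstrap from the best next-state value
              Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
              s = s2
      print([q.index(max(q)) for q in Q[:GOAL]])  # learned policy: go right everywhere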

  • Mining association rules with composite items

    Ye, Xinfeng; Keane, John A. (1997)

    Conference paper
    The University of Auckland Library

    Association rules can be used to express relationships between items of data. The process of mining association rules is to analyse the data in a database to discover "interesting" rules. Existing algorithms for mining association rules require that a record in the database contain all the data items in a rule, a requirement that makes it difficult to discover certain useful rules in some applications. To address this, this paper describes an algorithm for mining association rules with composite items. The algorithm has the potential to discover rules which cannot be discovered by existing algorithms.
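
    The abstract leaves "composite item" undefined; one plausible reading, used in the sketch below, treats a composite item as a disjunction of base items, so a rule such as "hot drink -> milk" can be found even when neither "tea -> milk" nor "coffee -> milk" reaches the support threshold on its own. Both the reading and the toy data are assumptions, not the paper's algorithm.

      # Hedged sketch: support counting with composite items as disjunctions.
      transactions = [{"tea"}, {"coffee", "milk"}, {"tea", "milk"}, {"juice"}]

      def supports(record, itemset):
          """Each element of `itemset` is a frozenset of alternatives (a
          composite item); the record must contain at least one from each."""
          return all(record & composite for composite in itemset)

      def support(itemset):
          return sum(supports(t, itemset) for t in transactions) / len(transactions)

      hot_drink = frozenset({"tea", "coffee"})          # composite item
      print(support({hot_drink}))                       # 0.75
      print(support({hot_drink, frozenset({"milk"})}))  # 0.5: hot drink AND milk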
