13,203 results for UC Research Repository

  • Dancing to a different tune: adaptive evolution fine-tunes protein dynamics

    Donovan, Katherine Aleisha (2015)

    Doctoral thesis
    University of Canterbury Library

    The molecular mechanisms that underpin adaptive evolution are not well understood. This is largely because few studies relate evolved alleles (genotype) to their physiological changes (phenotype), which move a population to better fit its environment (adaptation). The work described in this thesis provides a case study exploring the molecular changes underlying adaptive evolution in a key allosteric enzyme. It builds upon a long-term evolution experiment by Richard Lenski, in which twelve replicate populations of Escherichia coli have adapted in parallel to better fit their low-glucose environment. I focused on the allosteric enzyme pyruvate kinase type 1 (PK1), since this has been shown to adapt to this environment. First, I used X-ray crystallography to determine a higher resolution structure (2.2 Å) of the wild-type PK1 enzyme than previously available, for comparison with the evolved enzymes. I resolved the ambiguous space-group problem that affects these crystals, and demonstrated that the kinetic function of the recombinant enzyme is the same as previously reported. In addition, I propose a new model for allosteric activation: a combination of structural and dynamic analyses determined that the allosteric signal is transferred by a series of dynamic changes between the allosteric site, upon fructose-1,6-bisphosphate binding, and the active site for increased substrate binding. The functional analyses demonstrated that all eight evolved PK1 enzymes have a reduced activity compared to the wild-type PK1 at physiological substrate concentrations. Not only did the evolved PK1 enzymes show a parallel decrease in activity, but they all showed changes to substrate binding affinity, and seven of the eight showed an altered allosteric activation mechanism. These results suggest that natural selection has selected for enzymes with a reduced activity by altering the functional mechanism of the evolved enzymes. However, in-crystal and in-solution structural characterisation determined that all of the evolved PK1 enzymes have maintained the same structural fold as the wild-type PK1. Although the fold is the same, substrate binding promiscuity suggested a change in the flexibility of the enzyme, allowing substrates of different sizes and shapes to bind. Computational and experimental dynamics studies determined that natural selection has selected for reduced activity by altering the dynamics in all of the evolved PK1 enzymes, and that it has used altered dynamics to change the allostery of the enzymes. Therefore, this study provides the first example of adaptive evolution fine-tuning protein dynamics to alter allostery.

    This thesis thus describes the molecular mechanisms underlying one aspect of the adaptation of Escherichia coli to the low-glucose environment in Lenski’s long-term evolution experiment. The adaptive mutations in Escherichia coli’s pyruvate kinase type 1 serve to increase the availability of phosphoenolpyruvate for glucose uptake. From a molecular perspective, natural selection has selected for adaptive amino acid substitutions that produce an enzyme with reduced catalytic activity at low phosphoenolpyruvate concentrations, thus decreasing phosphoenolpyruvate consumption. In addition, the adaptive mutations have altered the enzymes’ affinity for the allosteric activator (fructose-1,6-bisphosphate), fine-tuning them to match the concentration of fructose-1,6-bisphosphate in the cell at the point of glucose re-introduction.

    Overall, this work describes the intricate relationship between genetic changes and the resulting phenotype, and demonstrates the parallel nature of adaptation for this particular case study: parallel changes are mapped from organismal fitness to enzyme function and enzyme structure. The dynamic changes, however, are not parallel, making the prediction of specific changes in adaptive evolution difficult.

  • Ceiling systems design and installation lessons from the Canterbury Earthquakes

    Dhakal, R.P.; MacRae, G. (2013)


    University of Canterbury Library

    Damage to ceiling systems resulted in a substantial financial loss to building owners in the Canterbury earthquakes. In some buildings, collapse of ceilings could easily have resulted in severe injury to occupants. This paper summarizes the types of ceiling damage observed in the Canterbury earthquakes, and draws useful lessons from the observed performance of different types of ceiling systems. Existing ceiling manufacturing and installation practices/regulations in New Zealand are critically scrutinized to identify deficiencies, and measures are suggested to improve practice so that damage to ceilings and the resulting losses are minimized in future earthquakes.

  • Continuous Glucose Monitors and Tight Glycaemic Control in Intensive Care: An In-Silico Proof of Concept Analysis

    Signal, M.; Pretty, C.G.; LeCompte, A.J.; Shaw, G.M.; Chase, J.G. (2010)


    University of Canterbury Library

    Tight glycaemic control (TGC) in critical care has shown distinct benefits, but has also proven difficult to obtain. The risk of severe hypoglycaemia (< 2.2 mmol/L) raises significant concerns for safety. Continuous Glucose Monitors (CGMs) offer frequent, though potentially noisy, automated measurement, and thus the possibility of using them for early detection of, and intervention in, hypoglycaemic events. This in-silico study investigates the potential of CGM devices to maintain control, prevent hypoglycaemia and reduce clinical effort. Retrospective clinical data from the SPRINT TGC study covering 26 patients were used with clinically validated metabolic system models and 3 different stochastic noise models (two Gaussian and one first-order autoregressive). The noisy, virtual CGM blood glucose (BG) values were filtered and used to drive the SPRINT TGC protocol. A simple threshold alarm was used to trigger glucose interventions to avert potential hypoglycaemia. Monte Carlo analysis was used to obtain robust results from the stochastic noise models. Using SPRINT with simulated CGM noise, the BG time in the 4.4-6.1 mmol/L band was reduced no more than 3% from the 45.2% obtained with glucometer sensors. The number of patients experiencing severe hypoglycaemia was reduced by 0-30%. Duration of hypoglycaemic events was reduced by 19-65%. Finally, nurse workload was reduced by approximately 20 minutes per patient, per day. The results of this proof of concept study justify a pilot clinical study for verification in a clinical setting.
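
    The core mechanism, superimposing stochastic sensor noise on a blood glucose trace and triggering an intervention when the noisy signal crosses a threshold, can be sketched in a few lines. The Python below is a minimal illustration only; the noise parameters, threshold and trace are assumptions, not the study's clinically validated models.

        # Minimal sketch: AR(1) CGM noise over a falling BG trace, with a
        # Monte Carlo estimate of how often a simple threshold alarm fires.
        # All parameter values are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(0)

        def ar1_noise(n, rho=0.8, sigma=0.4):
            """First-order autoregressive measurement noise (mmol/L)."""
            e = np.zeros(n)
            for k in range(1, n):
                e[k] = rho * e[k - 1] + rng.normal(0.0, sigma)
            return e

        def alarm_rate(true_bg, threshold=4.0, trials=1000):
            """Fraction of noisy virtual CGM traces that trigger the alarm."""
            hits = sum(np.any(true_bg + ar1_noise(true_bg.size) < threshold)
                       for _ in range(trials))
            return hits / trials

        bg = np.linspace(6.0, 3.8, 25)   # BG sampled every 5 min over 2 h
        print(alarm_rate(bg))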

  • Rocket Roll Dynamics and Disturbance – Minimal modelling and system identification

    Hann, C.E.; Snowdon, M.; Rao, A.; Tang, R.; Korevaar, A.; Skinner, G.; Keall, A.; Chen, X.Q.; Chase, J.G. (2010)


    University of Canterbury Library

    The roll dynamics of a 5 kg, 1.3 m high sounding rocket are analyzed in a vertical wind tunnel. Significant turbulence in the tunnel makes the system identification of the effective inertia, damping and asymmetry with respect to roll challenging. A novel method is developed which decouples the disturbance from the rocket frame’s intrinsic roll dynamics and allows accurate prediction of roll rate and angle. The parameter identification method is integral-based, and treats wind disturbances as equivalent to a movement in the actuator fins. The method is robust, requires minimal computation, and gave a realistic disturbance distribution reflecting the randomness of the turbulent wind flow. The mean absolute roll rate of the rocket frame observed in experiments was 16.4 degrees/s, and the model predicted the roll rate with a median error of 0.51 degrees/s and a 90th percentile of 1.25 degrees/s. The roll angle (measured by an encoder) was tracked by the model with a median absolute error of 0.25 degrees and a 90th percentile of 0.50 degrees. These results prove the concept of this minimal modelling approach, which will be extended to pitch and yaw dynamics in the future.
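
    The integral-based identification idea can be illustrated with a first-order roll model, d(omega)/dt = -a*omega + b, where a lumps damping over inertia and b the asymmetry torque over inertia. Integrating once gives omega(t) - omega(0) = -a * int(omega dt) + b*t, which is linear in (a, b), so the noisy rate signal never needs to be differentiated. The sketch below fits synthetic data under assumed values; it is not the paper's full model, which additionally treats wind disturbance as equivalent fin movement.

        # Minimal sketch of integral-based least-squares identification.
        import numpy as np
        from scipy.integrate import cumulative_trapezoid

        t = np.linspace(0.0, 10.0, 500)
        a_true, b_true = 0.6, 2.0
        omega = (b_true / a_true) * (1.0 - np.exp(-a_true * t))     # true response
        omega += np.random.default_rng(1).normal(0.0, 0.2, t.size)  # sensor noise

        int_omega = cumulative_trapezoid(omega, t, initial=0.0)
        A = np.column_stack([-int_omega, t])          # regressors for (a, b)
        y = omega - omega[0]
        (a_hat, b_hat), *_ = np.linalg.lstsq(A, y, rcond=None)
        print(a_hat, b_hat)                           # close to 0.6, 2.0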

  • Modeled Insulin Sensitivity and Interstitial Insulin Action from a Pilot Study of Dynamic Insulin Sensitivity Tests

    Lin, J.; Jamaludin, U.; Docherty, P.D.; Razak, N.N.; Le Compte, A.J.; Pretty, C.G.; Hann, C.E.; Shaw, G.M.; Chase, J.G. (2010)


    University of Canterbury Library

    An accurate test for insulin resistance can delay or prevent the development of Type 2 diabetes and its complications. The current gold standard test, CLAMP, is too labor intensive to be used in general practice. A recently developed dynamic insulin sensitivity test, DIST, uses a glucose-insulin-C-peptide model to calculate model-based insulin sensitivity, SI. Preliminary results show good correlation to CLAMP. However, both CLAMP and DIST ignore saturation in insulin-mediated glucose removal. This study uses the data from 17 patients who underwent multiple DISTs to investigate interstitial insulin action and its influence on modeled insulin sensitivity. The critical parameters influencing interstitial insulin action are saturation in insulin receptor binding, αG, and the plasma-interstitial diffusion rate, nI. Very low values of αG and very low values of nI produced the most intra-patient variability in SI. Repeatability in SI is enhanced with modeled insulin receptor saturation. A future parameter study on subjects with varying degrees of insulin resistance may provide a better understanding of the different contributing factors of insulin resistance.
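
    How the two critical parameters enter such a model can be sketched with a reduced two-compartment form: interstitial insulin Q fills from plasma at the diffusion rate nI, and its glucose-lowering action saturates as αG grows. This is a simplified illustration with assumed parameter values, not the study's full glucose-insulin-C-peptide model.

        # Minimal sketch of saturable interstitial insulin action.
        from scipy.integrate import solve_ivp

        nI, alphaG, SI, pG = 0.02, 0.05, 4e-4, 0.004  # assumed values

        def model(t, x, I_plasma=30.0):
            G, Q = x                                  # glucose, interstitial insulin
            dQ = nI * (I_plasma - Q)                  # plasma-interstitial diffusion
            dG = -pG * G - SI * G * Q / (1.0 + alphaG * Q)  # saturating action
            return [dG, dQ]

        sol = solve_ivp(model, (0.0, 180.0), [8.0, 10.0])
        print(sol.y[0, -1])                           # glucose after 180 min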

  • Glargine as a Basal Insulin Supplement in Recovering Critically Ill Patients - An In Silico Study

    Razak, N.N.; Lin, J.; Chase, J.G.; Shaw, G.M.; Pretty, C.G.; Le Compte, A.J.; Suhaimi, F.M.; Jamaludin, U. (2010)


    University of Canterbury Library

    Tight glycaemic control is now benefiting medical and surgical intensive care patients by reducing complications associated with hyperglycaemia. Once patients leave the intensive care environment, less acute wards do not continue to provide the same level of glycaemic control, mainly because these wards do not have the high levels of nursing resources required to do so. Developments in protocols that are less labour intensive are therefore necessary. This study examines the use of insulin glargine as a basal supplement in recovering critically ill patients. These patients represent a group who may benefit from such basal support therapy. In silico study results showed the potential to reduce nursing effort with the use of glargine. However, a protocol using only glargine for glucose control was not shown to be effective in the simulated patients. This may be an indication that a protocol using only glargine is more suitable after discharge from critical care.

  • Impact of variation in patient response on model-based control of glycaemia in critically ill patients

    LeCompte, A.J.; Chase, J.G.; Shaw, G.M.; Lin, J.; Lynn, A.; Pretty, C.G. (2010)


    University of Canterbury Library

    Critically ill patients commonly experience stress-induced hyperglycaemia, and several studies have shown tight glycaemic control (TGC) can reduce patient mortality. However, tight control is often difficult to achieve due to conflicting drug therapies and evolving patient condition. Thus, a number of studies have failed to achieve TGC, possibly due to the use of fixed insulin dosing protocols over adaptive patient-specific methods. Model-based targeted glucose control can adapt insulin and dextrose interventions to match identified patient sensitivity. This study explores the impact on control of assuming patient response to insulin is constant versus time-varying. Simulated trials of glucose control were performed on adult and neonatal virtual patient cohorts. Results indicate assumptions of constant insulin sensitivity can lead to significantly increased rates of hypoglycaemia, a commonly cited issue preventing increased adoption of tight glycaemic control in critical care. It is clear that adaptive, patient-specific approaches are better able to manage inter- and intra-patient variability than typical, fixed protocols.

  • Evaluation of the Performances and Costs of a Spectrum of DIST Protocols

    Docherty, P.D.; Chase, J.G.; Lotz, T.F.; Hann, C.E.; Te Morenga, L.; McAuley, K.A.; Shaw, G.M.; Berkeley, J.E.; Mann, J.I. (2010)


    University of Canterbury Library

    The strategic design of most insulin sensitivity (SI) tests maximises either accuracy or economy, but not both. Hence, accurate, large-scale screening is not feasible. The DIST was developed to better optimise both important metrics. The highly flexible DIST protocol samples insulin, glucose and C-peptide during a comparatively short test. Varying the sampling periods and assays, and utilising alternative computational methods, enables a wide range of tests with different accuracy and economy tradeoffs. The result is a hierarchy of tests to facilitate low-cost screening. Eight variations of the DIST are evaluated against the fully-sampled test by correlating the SI and endogenous insulin production (Uen(t)) metrics. Five variations include sample and assay reductions and three utilise DISTq parameter estimations. The DISTq identification methods only require glucose assays and thus enable real-time analysis. Three DISTq methods were tested: the fully-sampled, the Short, and the 30-minute two-sample protocol. 218 DIST tests were completed on 84 participants to provide the data for this study. Methods that assayed insulin replicated the findings of the full DIST particularly well (R=0.89~0.92), while those that assayed C-peptide managed to best replicate endogenous insulin metrics (R=0.72~1.0). The three DISTq protocols correlated to the fully-sampled DIST at R=0.83, 0.77 and 0.71, respectively. As expected, test resolution increased with rising protocol cost and intensity. The ability of significantly less expensive tests to replicate the values of the fully-sampled DIST was relatively high (R=0.92 with four glucose and two insulin assays, and 0.71 with only two glucose assays). Thus, an SI screening programme could achieve high resolution at a low cost by using a lower resolution DIST test. When an individual’s result is close to a diagnostic threshold, stored test samples could be re-assayed for more species to allow a higher resolution analysis without the need for a second invasive clinical test. Hence, a single test can lead to several outcomes with this hierarchy approach, enabling large-scale screening with high resolution only where required, with minimal and feasible economic cost and only a single invasive clinical procedure.

  • A Fast and Accurate Diagnostic Test for Severe Sepsis Using Kernel Classifiers

    Parente, J.D.; Lee, D.S.; Lin, J.; Chase, J.G.; Shaw, G.M. (2010)


    University of Canterbury Library

    Severe sepsis occurs frequently in the intensive care unit (ICU) and is a leading cause of admission, mortality, and cost. Treatment guidelines recommend early intervention; however, gold standard blood culture test results may take up to 48 hours to return. Insulin sensitivity (SI) is known to decrease with worsening condition and inflammatory response, and could thus be used to aid clinical treatment decisions. Some glycemic control protocols are able to accurately identify SI in real-time. A biomarker for severe sepsis was developed from retrospective SI and concurrent temperature, heart rate, respiratory rate, blood pressure, and SIRS score from 36 adult patients with sepsis. Patients were identified as having sepsis based on a clinically validated sepsis score (ss) of 2 or higher (ss = 0–4 for increasing severity). Kernel density estimates were used for the development of joint probability density profiles for ss ≥ 2 and ss < 2 data hours (213 and 5858 respectively of 6071 total hours) and for classification. From the receiver operating characteristic (ROC) curve, the optimal probability cutoff values for classification were determined for in-sample and out-of-sample estimates. A biomarker including concurrent insulin sensitivity and clinical data for the diagnosis of severe sepsis (ss ≥ 2) achieves 69–94% sensitivity, 75–94% specificity, 0.78–0.99 AUC, 3–17 LHR+, 0.06–0.4 LHR-, 9–38% PPV, 99–100% NPV, and a diagnostic odds ratio of 7–260 for optimal probability cutoff values of 0.32 and 0.27 for in-sample and out-of-sample data, respectively. The overall result lies between these minimum and maximum error bounds. Thus, the clinical biomarker shows good to high accuracy and may provide useful information as a real-time diagnostic test for severe sepsis.
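
    The classification step can be sketched as follows: fit a kernel density to each class, score each hour by a probability-like ratio of the two densities, and choose the probability cutoff from the ROC curve. The two synthetic features below merely stand in for the study's biomarker, which combines insulin sensitivity with several vital signs and the SIRS score; the data and numbers are assumptions.

        # Minimal sketch of kernel-density classification with a ROC cutoff.
        import numpy as np
        from scipy.stats import gaussian_kde
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(2)
        septic = np.column_stack([rng.normal(2e-4, 5e-5, 200),
                                  rng.normal(110, 12, 200)])
        control = np.column_stack([rng.normal(5e-4, 1e-4, 2000),
                                   rng.normal(85, 10, 2000)])

        kde_s = gaussian_kde(septic.T)        # density for ss >= 2 hours
        kde_c = gaussian_kde(control.T)       # density for ss < 2 hours

        X = np.vstack([septic, control])
        y = np.r_[np.ones(len(septic)), np.zeros(len(control))]
        p_s, p_c = kde_s(X.T), kde_c(X.T)
        score = p_s / (p_s + p_c)             # probability-like score

        fpr, tpr, cuts = roc_curve(y, score)
        print(cuts[np.argmax(tpr - fpr)])     # Youden-optimal cutoff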

  • It's time to review expectations and support for Middle-level leaders' learning

    Fluckiger, B.; Lovett, S.; Dempster, N. (2015)


    University of Canterbury Library

  • Hong Kong’s Developing Double Tax Agreement (DTA) Regime: A Case Study of the HKSAR-New Zealand DTA

    Sawyer, A.J. (2011)


    University of Canterbury Library

    Double taxation traditionally occurs when a taxpayer is taxed twice on the same income by two jurisdictions (the source jurisdiction and the residence jurisdiction). Relief is usually provided on a unilateral basis (domestic laws) or a bilateral basis (DTAs). The HKSAR is actively establishing a network of comprehensive DTAs with its major trading and investment partners; over 20 agreements have been reached, although not all are in force. Where no comprehensive DTA exists, the HKSAR has over 25 agreements for the avoidance of double taxation on airline income and 6 on shipping income (plus 2 agreements combining the two areas). The HKSAR is a destination for trade and investment, and is seen as an attractive entry point for many countries into the wider South East Asian economies; there is also the HKSAR-NZ Free Trade Agreement. The HKSAR is mounting a serious challenge to Singapore (which has over 60 DTAs) as a location for holding companies.

  • E-readers: devices for passionate leisure readers or an empowering scholarly resource?

    Lund, Peter. (2011)


    University of Canterbury Library

    E-books are increasingly common in academic libraries and e-book reading devices such as the Kindle and iPad are achieving huge sales for leisure readers. The authors undertook a small study at Loughborough University Library to explore areas in which a variety of e-book readers might be applied. Areas included: e-books on reading lists, PDFs of journal articles, inter-library loans supplied from the British Library and teaching support for Shakespeare studies. Whilst the e-readers did not offer sufficient advantages to merit integrating them into a service, the study proved useful in developing library expertise in the use of and support for e-readers.

  • Saltwater Modelling of Fire Gas Flow through a Horizontal Ceiling Opening

    Le Quesne, Marcus Andrew (2010)

    Masters thesis
    University of Canterbury Library

    When fires occur in domestic or commercial buildings, it is the smoke from the fire that leads to far more injury and death than the heat produced by the flames. Understanding the movement of smoke within the fire compartment and through openings in the enclosure is critical for designing buildings to prevent fire fatalities. Prediction of the movement of smoke is a complex phenomenon and is a continued focus of research throughout the world. Work has been conducted in the past on the exchange flow rates through vertical openings, but very little has been done on horizontal ceiling openings. Current smoke transport calculations are most often carried out using standard vent flow models that do not accurately take into account the buoyancy component of the flow. The fire zone model BRANZFire was developed with a ceiling vent flow algorithm based on the work of Cooper, who found there was very little data on which to base his predictions. This report aims to provide additional experimental data on exchange flow rates through horizontal ceiling openings through the use of saltwater modelling, and to compare this to the work previously undertaken by Cooper.

    Taking measurements of fire phenomena in hot and smoky environments can be difficult and expensive because the sooty environment and high temperatures involved can damage equipment and make taking accurate readings a challenge. Herein this problem is overcome through the use of a saltwater analogue system to model the conditions in a real fire scenario. The density difference created by a fire between the hot fire gases and the ambient air is replicated by using fresh and salt water. The orientation of the experiment is inverted compared to the real-life scenario, as the saltwater, which has the higher density, is added to the fresh water. The saltwater is injected from a source on the ‘floor’ of the compartment into a tank of fresh water, which generates a buoyant plume that ‘rises’ to the ceiling, forming a distinct upper layer. Fluid in this layer exchanges with the ambient fluid through the ceiling opening. The saltwater is dyed, and Light Attenuation (LA) is used to discern the density of the fluid and hence the amount of mixing that has occurred. This can then be used to determine the amount of exchange flow through the ceiling vent.

    An integral model for the descent of the interface between the hot smoky zone and the cool ambient zone has been developed and was found to perform well when compared with the saltwater experiments and another predictive model developed by Turner and Baines. The model was then developed further using mass conservation conventions to calculate the exchange flow through the ceiling opening. The exchange rate through the ceiling opening was calculated and was found to compare well with Cooper’s algorithm when an equivalent fire size of 323 kW was used, but differed significantly when a fire twice this size was considered. It was found that Cooper’s method did not adequately take into account the difference in fire sizes, as the exchange flow predicted was almost identical between fire sizes for a particular ceiling vent. The implications of this are that the exchange, and hence the mixing and the amount of smoke, may be under-predicted using larger fires in BRANZFire, and this could lead to non-conservative design.
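
    The interface-descent idea can be sketched with a mass-conservation model of the kind described: a plume with the classic entrainment volume flux Q = C * B^(1/3) * z^(5/3) fills the buoyant layer while a ceiling vent exchanges volume. The coefficients, box dimensions and the constant exchange flow below are assumptions for illustration, not the thesis's calibrated model.

        # Minimal sketch: interface descent in a filling box with vent exchange.
        from scipy.integrate import solve_ivp

        A, H = 1.0, 0.5      # plan area (m^2) and box height (m), assumed
        B = 1e-4             # plume buoyancy flux (m^4/s^3), assumed
        C = 0.15             # classic plume entrainment coefficient
        Q_ex = 2e-4          # assumed vent exchange flow (m^3/s)

        def interface(t, z):
            zi = max(z[0], 1e-3)                # interface height above source
            Q_plume = C * B**(1/3) * zi**(5/3)  # volume flux feeding the layer
            return [-(Q_plume - Q_ex) / A]      # layer deepens, interface falls

        sol = solve_ivp(interface, (0.0, 600.0), [H], max_step=1.0)
        print(sol.y[0, -1])  # interface height after 10 minutes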

  • A Converging Image? Capitalism and the Visual Identity of Alternative and Mainstream News Sites

    Kenix, L.J. (2011)


    University of Canterbury Library

    While largely overlooked in mass communication research, visual imagery is central to how organizations represent themselves, make meaning, create identities, and communicate with the rest of the world (Messaris, 1994). This research explores visual differences between alternative and mainstream news sites along the conceptual categorization of deviance. More deviant groups have historically represented themselves through alternative media with themes of confrontation and challenge, often through violent or sexualized imagery (Ray & Marsh II, 2001). However, that might no longer be the case in an online environment where the whole world is watching and the omnipresent ideology of capitalism may influence the commercialism and professionalisation of media messages.

  • Accountability to Stakeholders in a Student-Managed Organisation

    Hill, C.; Crombie, N.A. (2010)


    University of Canterbury Library

    Purpose: Management is normally held accountable to stakeholders because it enters into long-term relationships with them. However, managers who plan to depart the organisation can take advantage of stakeholders’ trust for their own personal benefit. This paper examines how accountability mechanisms can preserve organisation-stakeholder relationships in the face of high management turnover. Design/methodology/approach: A case study of a student-managed non-profit organisation in which the management team is replaced annually. Representatives of all (internal and external) stakeholders are interviewed. Findings: While stakeholders and management work towards mutually agreed upon objectives, at times they also work against each other and pursue their own self-interests. The organisation has, however, been able to survive due to the introduction of accountability mechanisms. Originality/value: Drawing on stakeholder-agency theory, this paper shows how accountability mechanisms can preserve organisational memory and organisation-stakeholder relationships.

  • Balancing national security and personal freedom in a low-risk society: The case of New Zealand

    Small, D. (2008)


    University of Canterbury Library

    In the pre-dawn darkness of 17 October 2007, dozens of police from the Armed Offenders’ Squad and secret Special Tactics Group conducted simultaneous raids across New Zealand. They announced to a startled country that the raids were authorised by warrants issued under the Firearms Act and the Terrorism Suppression Act, and that they had broken a network of terrorist training camps centred in the remote Urewera mountains in the centre of the North Island. The seventeen people arrested were not, however, members of an Al-Qaeda sleeper cell. They were all local political activists. Some had high public profiles. Many were Maori nationalists. The raids drew widespread public condemnation. They were decried as racist political harassment and evidence of the dangers inherent in the anti-terrorism legislation introduced at the behest of the US in the wake of the September 11 attacks. The high-level Officials Committee for Domestic and External Security had been briefed before the raids. However, in order to charge people under the Terrorism Suppression Act, the Police still required the approval of the Solicitor-General. When they sought that approval for the October 2007 raids, he said that, on the basis of the evidence that had been gathered, he could not consent to the charges, and he went on to describe the existing law as ‘unworkable’. The Government responded by giving the New Zealand Law Commission the task of examining whether existing law needed to be amended ‘to cover the conduct of individuals that creates risk to, or public concern about, the preservation of public safety and security and the means of obtaining evidence in relation to that conduct’. It also specified that the Commission must take account of ‘the need to ensure an appropriate balance between the preservation of public safety and security and the maintenance of individual rights and freedoms’. This paper examines some dimensions of how this balance might be achieved. In doing so, it considers the nature and degree of a terrorist threat to New Zealand and situates this within a discussion of wider issues of negotiating risk in a contemporary globalised world. It looks at the problems inherent in legislating against terrorism and the associated difficulties of enforcing that legislation. It sets this in the context of recent instances from New Zealand of how the interests of national security and individual freedoms have intersected in practice.

  • The Matignon Accords and Kanak Education in New Caledonia

    Small, D. (1995)


    University of Canterbury Library

    Signed in June 1988, the Matignon Accords have been credited with bringing peace and development to the French Pacific territory of New Caledonia. For the indigenous Kanak population, however, the Accords have led to demobilisation, division and disillusionment in an independence movement that had shown considerable unity and strength. This paper examines the political origins and consequences of the Accords and discusses concerns about the development model upon which they are based. It outlines the educational promises of the Matignon Accords, which were devised in response to growing Kanak dissatisfaction with, and mobilisation against, French education. The paper shows that, in the face of fundamental critiques Kanak people have made of French education in New Caledonia, the orientation of the territory's educational authority, the Vice-Rectorat, remains unchanged. It also highlights attempts by the Vice-Rectorat to downplay and even conceal the failure of the school system to address the underachievement of Kanak pupils. The paper presents a critique of a number of educational and training initiatives that have been introduced in line with the Matignon Accords, including the introduction of Kanak languages into the curriculum, the production of locally oriented school textbooks, the Programme d'Enrichissement Instrumental and Operation 400 Cadres. It argues that these programmes are an integral part of an approach to development that is leading not towards Kanak independence but to the strengthening of French control and influence in New Caledonia.

  • Balanced Copyright Would Be Nice

    Cheer, U. (2009)


    University of Canterbury Library

  • Conflation of, and Conflict Between, Regulatory Mandates: Managing the Fragmentation of International Environmental Law in a Globalised World

    Scott, K.N. (2010)


    University of Canterbury Library

    The concepts of globalisation and fragmentation present both challenges and opportunities for international environmental governance. They are, nevertheless, contradictory concepts. On the one hand, globalisation emphasises notions of interdependence and linkage between problems and solutions. Within the field of environmental protection the concept of ecological interdependence has long since been recognised, and the globalisation of international environmental law is arguably a necessary component of modern international environmental governance. On the other hand, fragmentation of international law – as characterised by “the emergence of specialized and (relatively) autonomous rules or rule complexes, legal institutions and spheres of legal practice” – emphasises the isolation and disconnect between regimes and institutions. Nevertheless, both globalisation and fragmentation create similar challenges to international environmental governance: how to manage the interaction between environmental regimes so as to minimise unnecessary conflation of, and conflict between, their regulatory mandates. It is this question that provides the central theme of this paper.

  • Marine Geo-engineering: A New Challenge for the Law of the Sea

    Scott, K.N. (2010)


    University of Canterbury Library

    In light of the apparent failure to agree to directly address climate change through emissions reductions, attention is increasingly focusing on alternative options to reduce the impacts of climate change. Some of these options involve engineering the earth to reduce the impact or effect of climate change; in particular, marine geo-engineering is seeking to explore ocean-based climate change mitigation measures. One of these options – the sub-seabed sequestration of carbon dioxide – has recently (and controversially) been addressed by the 1996 London Protocol to the 1972 London (Dumping) Convention. The parties to the 1996 Protocol have also asserted that this instrument has jurisdiction over ocean fertilization activities and are currently developing guidelines designed to permit fertilization for the purpose of science only. Neither sequestration nor fertilization fits entirely comfortably within the dumping regime, and it is clear that other geo-engineering schemes (such as those involving the deposit of devices into the ocean and the placement of dams across straits) will fall outside of the regulatory remit of these instruments. This paper will explore the extent to which the law of the sea is capable of responding to the marine geo-engineering challenge, and whether the current regulatory tools provide the appropriate regulatory framework for proactive management of marine geo-engineering. This paper will conclude with an outline of a proposal for the development of a regime to regulate emerging climate change mitigation technologies. Whilst policy questions and general principles relating to geo-engineering are arguably best addressed within the regime established by the 1992 United Nations Framework Convention on Climate Change, detailed regulation and management of geo-engineering technologies is better suited to institutions and regimes that have specialist expertise in the area of the technology in question. The proposal developed in the final part of this paper attempts to address both these requirements.
