Massey Documents by Type

Permanent URI for this community: https://mro.massey.ac.nz/handle/10179/294

Search Results

Now showing 1 - 10 of 22
  • Item
    Flesh, blood, relic & liturgy : on the subject of the museum : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Museum Studies, Massey University, Te Kunenga ki Pūrehuroa, Manawatū, New Zealand
    (Massey University, 2023-09-30) Haig, Nicholas Graham
    This thesis models a methodology for disturbing the liberal-progressive accord in museum practice and for contesting the ascendancy of post-criticality within museology. Together the liberal-progressive accord and post-critical museology normalise a subject position that, despite appearances of agency, cannot act upon its socio-historical situation. How, I ask, might the subject of the museum be reinvested in ways that counteract its demise in the relation between the contemporary museum and museology? Seeking to re/establish the conditions of existence for (a) critical museology, in the first instance this thesis asserts the primacy of “the subject” as the museological problematic requiring theorisation. A poetical-analytical schema of flesh, blood, relic and liturgy, a schema that pivots on the transposition of the work of Eric L. Santner into a museological frame, provides the means for reasserting the primacy of the subject in a manner able to anticipate new capacities for action in that subject. Incited by the museal representation of violent legacies, in particular the centennial commemorations of the First World War, this thesis encircles one institutional formation and two exhibitionary productions: The Museum of New Zealand Te Papa Tongarewa and its exhibition Gallipoli: The Scale of Our War and the standalone production, The Great War Exhibition. These monographs provide material instrumental to the argument. Emerging as a negation of the negation that follows the schema’s intervention into the relation between the museum and museology are three affirmations addressed to the prospects of (a) critical museology: (1) a critical museology must transfer crisis into the heart of its language; (2) a critical museology must attend to that which does not work but which is made to work in the museum; (3) a critical museology must strike at that which is not there.
  • Item
    A quantitative comparison of towed-camera and diver-camera transects for monitoring coral reefs
    (PeerJ Inc., 2021) Cresswell AK; Ryan NM; Heyward AJ; Smith ANH; Colquhoun J; Case M; Birt MJ; Chinkin M; Wyatt M; Radford B; Costello P; Gilmour JP; Toonen R
    Novel tools and methods for monitoring marine environments can improve efficiency but must not compromise long-term data records. Quantitative comparisons between new and existing methods are therefore required to assess their compatibility for monitoring. Monitoring of shallow water coral reefs is typically conducted using diver-based collection of benthic images along transects. Diverless systems for obtaining underwater images (e.g. towed cameras, remotely operated vehicles, autonomous underwater vehicles) are increasingly used for mapping coral reefs. Of these imaging platforms, towed cameras offer a practical, low cost and efficient method for surveys, but their utility for repeated measures in monitoring studies has not been tested. We quantitatively compare a towed-camera approach to repeated surveys of shallow water coral reef benthic assemblages on fixed transects, relative to benchmark data from diver photo-transects. Differences in the percent cover detected by the two methods were partly explained by differences in the morphology of benthic groups. The reef habitat and physical descriptors of the site (slope, depth and structural complexity) also influenced the comparability of data, with differences between the tow-camera and the diver data increasing with structural complexity and slope. Differences between the methods decreased when a greater number of images was collected per tow-camera transect. We attribute the differences in the data from the two methods chiefly to the lower image quality (variable perspective, exposure and focal distance) and the lower spatial accuracy and precision of the towed-camera transects, and suggest changes to the sampling design to improve the application of tow cameras to monitoring.
  • Item
    Frequentist-Bayesian analyses in parallel using JASP - A tutorial
    (PsyArXiv, 2022-07-21) Perezgonzalez J
    A tutorial to demonstrate the use of parallel Frequentist-Bayesian analyses using JASP, and the plausible inferences one may be able to make from such combined analysis.
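The parallel frequentist-Bayesian idea the tutorial demonstrates in JASP can be sketched in code. A minimal Python sketch for two-sample data follows; the Bayes factor here uses Wagenmakers' BIC approximation rather than JASP's default Cauchy-prior Bayesian t-test, so the numbers will differ from JASP's output, and the toy data are invented.

```python
import math
from statistics import mean, variance

def parallel_t_summary(x, y):
    """Equal-variance t statistic plus a BIC-approximate Bayes factor (BF10).

    BF10 ~ exp(-(n*ln(1 - R^2) + ln n) / 2), where R^2 is implied by t.
    This is a shortcut; JASP's default Bayesian t-test uses a Cauchy prior.
    """
    nx, ny = len(x), len(y)
    # Pooled variance and the classical two-sample t statistic.
    sp2 = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    t = (mean(x) - mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))
    n = nx + ny
    r2 = t ** 2 / (t ** 2 + n - 2)          # R^2 implied by the t statistic
    bf10 = math.exp(-(n * math.log(1 - r2) + math.log(n)) / 2)
    return t, bf10

t, bf10 = parallel_t_summary([1, 2, 3, 4, 5], [3, 4, 5, 6, 7])
# t = -2.0 and BF10 is about 2.4 here: the two inferential frames can be
# read side by side, which is the point of running the analyses in parallel.
```

A p-value for the frequentist side would normally come from the t distribution (e.g. scipy.stats); it is omitted to keep the sketch dependency-free.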
  • Item
    Using market research methodologies to advance public engagement with emerging climate technologies : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy via publication in Marketing at Massey University, Manawatū, New Zealand
    (Massey University, 2022) Carlisle, Daniel
    The world is facing an unprecedented climate emergency that threatens humanity and global ecosystems. To help avoid some of the worst impacts, scientists are developing innovative technologies for addressing rising greenhouse gas emissions and climate change. However, in the early stages of research and development, the effectiveness, consequences, and desirability of implementing these technologies remains highly uncertain. Early public engagement is therefore critical for ensuring research and development pathways are acceptable to society. Currently, it remains unclear how best to engage the public on a global scale, an issue addressed in this thesis by drawing on theories and methodologies applied in the marketing discipline to advance the field of public engagement. The core methodology draws on marketing theories and measurement metrics, applying associative network theories of memory (ANTM) to model cognitive associations (i.e., public perceptions) with unfamiliar concepts. Study One is a replication and extension of work by Wright, Teagle, and Feetham (2014) and uses qualitative and quantitative methods to measure public perceptions of six climate engineering technologies across countries and over time. The results show strong perceptual differences between technologies, but remarkable consistency between countries and over time. This consistency validates the cognitive association method as a robust tool for rapid public engagement and tracking perceptions as they evolve. Study Two builds on Study One by drawing on additional dual processing theories and using an experimental design to test how citizens form opinions about emerging climate technologies. Contrary to concerns that survey methods elicit insufficiently considered responses, the study finds that citizens rely on rapid, snap judgements to form opinions, and that encouraging more thorough consideration does not affect their responses. 
Thus, the research further validates the use of survey methodologies for public engagement. Study Three shifts focus, measuring perceptions of alternative fuels for decarbonising the shipping industry – a previously unresearched topic. The study is also the first in the academic literature to use a mixed-method approach to modelling cognitive associations. Again, the quantitative findings showed strong, previously unknown differences in perceptions between alternative fuels. Furthermore, the qualitative analysis supplemented these findings with rich insights into the drivers behind differing public perceptions. This thesis makes several notable contributions: Practically, the results demonstrate the public’s consistent preference for Carbon Dioxide Removal over Solar Radiation Management, their cautious support for carbon capture technologies, a strong distaste for stratospheric aerosol injection and ammonia as a shipping fuel, a striking preference for nuclear propulsion over heavy fuel oil, support for hydrogen and biofuel powered shipping, support for local implementation of alternative shipping fuels, and conditional support for small-scale research into acceptable emerging technologies. Theoretically, the research advances ANTM and dual processing theories in the context of emerging technologies, yielding results that are broadly applicable not only to public engagement with science, but also to market research, brand tracking, and consumer judgement. Methodologically, the research validates cognitive association methods for cross-country public engagement, demonstrates the ability to track perceptions over time, and demonstrates a mixed-method approach to modelling cognitive associations. Finally, the research demonstrates the importance of conducting early and ongoing public engagement to identify acceptable decarbonisation pathways, guide research trajectories, and inform climate policy.
  • Item
    Estimating credibility of science claims : analysis of forecasting data from metascience projects : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Statistics at Massey University, Albany, New Zealand
    (Massey University, 2021) Gordon, Michael
    The veracity of scientific claims is not always certain. In fact, enough claims have been proven incorrect that many scientists believe that science itself is facing a “replication crisis”. Large scale replication projects provided empirical evidence that only around 50% of published social and behavioral science findings are replicable. Multiple forecasting studies showed that the outcomes of replication projects could be predicted by crowdsourced human evaluators. The research presented in this thesis builds on previous forecasting studies, deriving new findings and exploring new scope and scale. The research is centered around the DARPA SCORE (Systematizing Confidence in Open Research and Evidence) programme, a project aimed at developing measures of credibility for social and behavioral science claims. As part of my contribution to SCORE, I, along with an international collaboration, elicited forecasts from human experts via surveys and prediction markets to predict the replicability of 3000 claims. I also present research on other forecasting studies. In Chapter 2, I pool data from previous studies to analyse the performance of prediction markets and surveys with higher statistical power. I confirm that prediction markets are better at forecasting replication outcomes than surveys. This study also demonstrates the relationship between p-values of original findings and replication outcomes. These findings are used to inform the experimental and statistical design to forecast the replicability of 3000 claims as part of the SCORE programme. A full description of the design, including planned statistical analyses, is included in Chapter 3. Due to COVID-19 restrictions, our generated forecasts could not be validated through direct replication (experiments conducted by other teams within the SCORE collaboration), preventing those results from being presented in this thesis. 
The completion of these replications is now scheduled for 2022, and the pre-analysis plan presented in Chapter 3 will provide the basis for the analysis of the resulting data. In Chapter 4, an analysis of ‘meta’ forecasts, or forecasts regarding field-wide replication rates and year-specific replication rates, is presented. We presented and published community expectations that replication rates will differ by field and will increase over time. These forecasts serve as valuable insights into the academic community’s views of the replication crisis, including those research fields for which no large-scale replication studies have been undertaken yet. Once the full results from SCORE are available, there will be additional insights from validations of the community expectations. I also analyse forecasters’ ability to predict replications and effect sizes in Chapters 5 (Creative Destruction in Science) and 6 (A creative destruction approach to replication: Implicit work and sex morality across cultures). In these projects a ‘creative destruction’ approach to replication was used, where a claim is compared not only to the null hypothesis but to alternative contradictory claims. I conclude that forecasters can predict the size and direction of effects. Chapter 7 examines the use of forecasting for scientific outcomes beyond replication. In the COVID-19 preprint forecasting project I find that forecasters can predict whether a preprint will be published within one year, including the quality of the publishing journal. Forecasters can also predict the number of citations preprints will receive. This thesis demonstrates that information about the replicability of scientific claims is dispersed within the scientific community. I have helped to develop methodologies and tools to efficiently elicit and aggregate forecasts. 
Forecasts about scientific outcomes can be used as guides to credibility, to gauge community expectations and to efficiently allocate scarce replication resources.
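Comparing prediction markets with surveys, as the thesis describes, amounts to scoring probabilistic forecasts against realised replication outcomes; the Brier score is the standard accuracy measure in this forecasting literature. A minimal sketch follows; the probabilities and outcomes are invented for illustration, not SCORE data.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error of probability forecasts against 0/1 outcomes.

    Lower is better: 0 is a perfect forecast, 0.25 matches always saying 0.5.
    """
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

outcomes = [1, 0, 1, 1, 0]             # 1 = claim replicated (illustrative)
market = [0.8, 0.3, 0.7, 0.6, 0.2]     # hypothetical final market prices
survey = [0.6, 0.5, 0.6, 0.5, 0.4]     # hypothetical mean survey beliefs

# On these toy numbers the market scores 0.084 and the survey 0.196,
# i.e. the sharper market forecasts win, mirroring the chapter's finding.
```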
  • Item
    Determining the relative validity and reproducibility of a food frequency questionnaire to assess food group intake in high performing athletes : a thesis presented in partial fulfilment of the requirements for the degree of Masters of Science in Nutrition and Dietetics, Massey University, Albany, New Zealand
    (Massey University, 2018) Stockley, Dayna
    Background: Optimal nutrition is essential for high performing athletes in order to train effectively, optimise recovery and improve their performance. Given the differences in dietary requirements and practices that exist between athletes and the general population, dietary assessment tools designed specifically for athletes are required. Food frequency questionnaires (FFQs) are commonly used to assess habitual dietary intake as they are inexpensive, quick and easy to administer. Currently there are no athlete-specific, up-to-date, valid and reproducible FFQs to assess food group intake of athletes. This study aims to determine the relative validity and reproducibility of an athlete-specific FFQ against an estimated four day food record (4DFR) to assess food group intake in high performing athletes. Methods: Data from 66 athletes (24 males, 42 females) representing their main sport at regional level or higher and aged 16 years and over were collected as part of a validation study in 2016. Athletes completed the athlete-specific FFQ at baseline (FFQ1) and four weeks later (FFQ2) to assess reproducibility. An estimated 4DFR was completed between these assessments to determine the relative validity of the FFQ1. Foods appearing in the 4DFR were classified into the same 129 food groups as the FFQ, and then further classified into 28 food groups in gram amounts. Agreement between the two methods on food group and core food group intakes was assessed using Wilcoxon signed rank tests, Spearman's correlation coefficients, cross classification with tertiles, the weighted kappa statistic and Bland-Altman analysis. Results: The FFQ overestimated intake for 17 of 28 food groups compared with the 4DFR (p<0.05). Correlations ranged from 0.11 (processed foods) to 0.78 (tea, coffee & hot chocolate), with a mean of 0.41. Correct classification of food groups into the same tertile ranged from 35.4% (starchy vegetables) to 55.5% (fats & oils). 
Misclassification into the opposite tertile ranged from 4.6% (legumes) to 15.4% (starchy vegetables; sauces & condiments). The weighted kappa demonstrated fair to moderate agreement (k=0.21-0.60) for food groups. Bland-Altman plots suggested that for most food groups, the difference between FFQ1 and the 4DFR increased as the amount of each food group consumed increased. Intake from FFQ1 was significantly higher than from FFQ2 for 13 of 28 food groups. All effect sizes were small (r=0.1). Reproducibility correlations ranged from 0.49 (potato chips; fats & oils) to 1.00 (tea, coffee & hot chocolate), with a mean of 0.65. For the 23 food groups classified into tertiles, 20 had >50% of participants correctly classified, <10% grossly misclassified, and 20 demonstrated moderate to good agreement (k=0.61-0.80). The exceptions were dairy; fats & oils; and processed foods & drinks which presented fair agreement (k=0.21-0.40). Conclusions: The FFQ showed reasonable validity and good reproducibility for assessing food group intake in high performance athletes in New Zealand. The FFQ could be used in future research as a convenient, cost-effective and simple way to obtain athletes’ food group intake, and identify those who could benefit from interventions to improve their nutritional adequacy and potentially their athletic performance.
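Two of the agreement measures used in the study can be sketched briefly: a Spearman rank correlation (with tied ranks) and a Bland-Altman bias with 95% limits of agreement. The paired intakes below are invented, and a real analysis would also apply the Wilcoxon, weighted kappa and tertile cross-classification steps described.

```python
from statistics import mean, stdev

def _ranks(values):
    # Average ranks, handling ties (as Spearman's rho requires).
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank-transformed data."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def bland_altman(x, y):
    """Mean difference (bias) and 95% limits of agreement for paired data."""
    diffs = [a - b for a, b in zip(x, y)]
    bias, sd = mean(diffs), stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

ffq = [120, 300, 85, 40, 210, 95]       # hypothetical grams/day from FFQ1
record = [100, 260, 90, 55, 180, 70]    # hypothetical grams/day from the 4DFR
rho = spearman(ffq, record)
bias, limits = bland_altman(ffq, record)
```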
  • Item
    Forecasting the decline of superseded technologies : a comparison of alternative methods to forecast the decline phase of technologies : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Marketing at Massey University, New Zealand
    (Massey University, 2018) MacRae, Murray Stuart
    An understanding of the economic life of technologies is important for firms, as new technological diffusion often results in rapid erosion of the market value of a firm’s existing technological investments. Little is known about the decline of an older incumbent technology, despite the significant effort that has been devoted to studying the diffusion of new technologies over the last five decades. There is, it appears, a pro-innovation bias (Rogers, 1995), as theory has a singular focus on the growth side of the substitution phenomenon. Yet to a modern enterprise, managing the decline of the older technology may be at least as important as managing the diffusion of the new technology. Consequently, this research takes the first steps towards addressing this gap by investigating how best to predict the decline of an incumbent technology, through an examination of the performance of well-established forecasting methods when applied to the decline phase of a technology life cycle. Interestingly, during the search for historic data it was found that decline series are both rarer than diffusion series and short, although not as short as diffusion series. Three studies were undertaken; the first study was a competition of four marketing science diffusion models: the Pearl logistic, Gompertz, Bass, and log-logistic models. The second study tested a pooled analogous series approach against the four models from the first study. Twenty-five decline data series were used in those two studies. The final study applied expert judgment to the task using an online panel of 250 UK managers with forecasting experience. These managers undertook expert judgmental forecasting tasks on 12 of the 25 series, split across two cue-information treatments. Both absolute and comparative measures of accuracy were deployed along with measures to understand bias and variability. 
The measures were not always in perfect consensus as to the best models in each study; however, the results in aggregate were conclusive. It was found that the Bass and the Pearl logistic were consistently the best marketing science models. However, the online panel of forecasting experts provided a pooled estimate that was competitive with those best marketing science models. Importantly, forecasts made from decline data presented to the panel in tabular form outperformed those made from the same data presented in graphical form, such that tabular presentation was better than any marketing science model. An analogous series model, formed from the average value of a normalised pool of the 25 series, also performed well, providing forecasts that were within the range of the two best diffusion models. A straight-line model fitted to the last three data points in the estimation data consistently matched or outperformed all three methods over short horizons. This indicates that simple models, such as a pooled average of available analogous series or even a straight line, can provide a viable forecast, providing further evidence that simple methods are in general all that is needed to forecast in such situations. Despite laboratory research suggesting that individuals are poor at this task, the judgmental study indicates that humans can successfully forecast S-shaped curve trajectories in field trials; however, there are cost and time implications in using a panel that would preclude its use in many situations. References: Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York, NY: The Free Press.
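Two of the simpler forecasting approaches compared above can be sketched: a logistic (Pearl-type) curve, which declines when its slope parameter is negative, and the straight line fitted to the last three observations. The example series is synthetic, and a real study would estimate the logistic parameters by nonlinear least squares rather than supplying them by hand.

```python
import math

def logistic_decline(t, m, a, b):
    """Pearl logistic curve: market size m along a logistic path in time t.

    With b < 0 the curve declines toward zero, matching a superseded
    technology losing share; parameters here are assumed, not estimated.
    """
    return m / (1 + math.exp(-(a + b * t)))

def straight_line_forecast(series, horizon):
    """OLS line through the last three observations, extrapolated forward."""
    ys = series[-3:]
    xs = [0, 1, 2]
    xbar, ybar = 1, sum(ys) / 3
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / 2
    intercept = ybar - slope * xbar
    return [intercept + slope * (2 + h) for h in range(1, horizon + 1)]

# A steadily declining toy series: the straight-line rule simply
# continues the recent trend over the forecast horizon.
forecast = straight_line_forecast([10, 9, 8, 7, 6], 2)  # [5.0, 4.0]
```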
  • Item
    Analyzing seismic signals to understand volcanic mass flow emplacement : a thesis presented in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Earth Sciences at Massey University, Palmerston North, Manawatu, Aotearoa New Zealand
    (Massey University, 2017) Walsh, Braden Michael Larson
    Natural hazards are one of the greatest threats to life, industry, and infrastructure. It has been estimated that around a half billion people worldwide are in direct proximity to the danger of volcanic hazards. For volcanic mass flows, such as pyroclastic density currents and lahars, extreme runout distances are common. The close proximity of large population centers to volcanoes requires the implementation of early warning and real-time monitoring systems. A large portion of the progress towards real-time monitoring is through the use of geophysical instrumentation and techniques. This research looks into emerging geophysical methods and tries to better constrain and apply them for volcanic purposes. Specifically, multiple types of amplitude source location techniques are described and used for locating and estimating the dynamics of volcanic mass flows and eruptions. Other methods, such as semblance and back projection, are also employed. Applying the active seismic source method to a lahar that occurred on October 13th 2012 at Te Maari, New Zealand, locations and estimations of lahar energy were calculated in an increased noise environment. Additionally, the first ever calibration of the amplitude source location (ASL) method was conducted using active seismic sources. The calibration proved to decrease true error distances by over 50%. More calibration of the ASL method was accomplished by using all three components of the broadband seismometer. Initial results showed that using all three components reduced extreme errors and increased the overall precision of the locations. Finally, multiple geophysical methods (ASL, semblance, back projection, waveform migration, acoustic-seismic ratios) were used to show that a combination of instrumentation could produce more reliable results. This research has filled gaps in the pre-existing knowledge of volcanic hazards. 
With these results, more effective hazard warnings can be produced, and systems for real-time estimation of the locations and dynamics of volcanic events could be developed.
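The core idea behind the amplitude source location (ASL) method described above can be sketched as a grid search: seismic amplitudes are assumed to decay as A0 * exp(-B*r) / r with distance r from the source, so the best source estimate is the grid node whose predicted amplitude pattern best matches the observations. The station geometry, attenuation term B and amplitudes below are invented for illustration; real applications calibrate B and per-station site effects, as the thesis does with active seismic sources.

```python
import math

def asl_grid_search(stations, amps, grid, B=0.05):
    """Return the grid node whose predicted amplitude decay best fits amps.

    Amplitude ratios (relative to the first station) are compared so the
    unknown source amplitude A0 cancels out of the misfit.
    """
    best, best_misfit = None, float("inf")
    for node in grid:
        rs = [math.dist(node, s) for s in stations]
        if any(r == 0 for r in rs):        # skip nodes sitting on a station
            continue
        preds = [math.exp(-B * r) / r for r in rs]
        misfit = sum((a / amps[0] - p / preds[0]) ** 2
                     for a, p in zip(amps, preds))
        if misfit < best_misfit:
            best, best_misfit = node, misfit
    return best
```

On noise-free synthetic amplitudes generated from the same decay law, the search recovers the generating node exactly; with real lahar signals the misfit surface is noisier, which is why the calibration steps described above matter.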
  • Item
    Civic circle : empowering young New Zealanders to volunteer with local non-profit organisations : a thesis submitted by Ross Patel in partial fulfilment of the requirements for the degree of Master of Design, Massey University, Wellington, New Zealand
    (Massey University, 2018) Patel, Ross
    The most common comment made by volunteer-involving organisations in both 2015 and 2016 State of Volunteering in New Zealand reports was that “the majority of volunteers are older (aging) and there aren’t enough young people stepping up” (Volunteering New Zealand, 2017, p. 26). Another common observation was that volunteers are less committed and are ‘time poor’. This is in contrast to the research that shows millennials (people born between the 1980s and 1990s) are upbeat about their ability to have a positive impact on the world (Green, 2003). Millennials can offer many skills and qualities to help non-profit organisations, although such organisations are currently inadequately prepared to welcome them (Fine, 2008). This design-led research sets out to explore how to empower young New Zealanders to volunteer with local non-profit organisations. Participatory design methods were employed to engage 27 representatives from 21 organisations and 19 young New Zealanders in the design process. Keywords: Volunteering, non-profit organisations, volunteer-involving organisations, young New Zealanders, millennial engagement, generation-y, civic engagement, design thinking, co-design.
  • Item
    The New Zealand census : some technical and historical aspects : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Statistics at Massey University
    (Massey University, 1989) Dixon, Shirley Ann
    This thesis provides an overview of the New Zealand Census of Population and Dwellings. Certain critical aspects are examined in detail, including the collection phase involving questionnaire content and the enumeration process, the testing before and after, the preparation of the data for entry into a computer and the subsequent dissemination of the information. The information for this research was obtained from published material from overseas, from published and unpublished material from the New Zealand Department of Statistics and from interviews with some officers of the Department. In each aspect, New Zealand is compared and contrasted with other major countries; specifically America, Australia and India. Because of its geographical proximity, any developments in Australia have an immediate impact on New Zealand. The US Bureau of the Census is often a forerunner in the development of census procedures and techniques. The procedures developed in India to cope with their own specific and peculiar problems in census-taking provide an interesting comparison with those of New Zealand. Where pertinent, aspects of censuses in other countries are also compared with those of New Zealand censuses. New Zealand has adopted many of the procedures used in other countries, but limited resources have hindered or prevented census staff from developing and maintaining some of the procedures used in American and Canadian censuses. In particular, pilot testing of questionnaires has only recently been incorporated into the census procedures, and major post-censal evaluations are not conducted. On the other hand, the small size of the New Zealand population has facilitated innovations in such areas as data entry, editing and imputation. The history of census-taking is covered to gain a perspective on the place of the census in modern society. Alternatives to censuses were examined; specifically, regular major surveys, administrative records and data banks. 
It is found that surveys suffer a lower response rate than censuses and that the problems of differential undercoverage of various population groups experienced in censuses are exacerbated in surveys. Administrative records frequently do not contain sufficient detail, varying definitions are employed to categorise the data and the quality of the data cannot always be assured. Data banks provide a rapidly growing source of information, but currently also suffer from a lack of universal definitions, and many data banks do not incorporate strict quality control procedures as a matter of course. Moreover, strict confidentiality laws currently prevent access by census staff to administrative files and data banks. It could be argued that censuses should continue to be taken because of the need to obtain current, detailed information on all members of any population for planning for present and future needs of that society. A census is the only vehicle for collecting information supplied by all members of the population at a single point in time. If censuses are to remain credible and acceptable to the individual members of a population, challenges such as the following must continue to be addressed: the accuracy of estimates must be protected by obtaining the highest possible response rate from all sections of the population; confidentiality of data must be guaranteed; the costs of the census operation must be kept within budget, while still maintaining high data quality and publication of data in a time frame that is acceptable to users of census data; universal definitions must be employed to minimise the redundancy between censuses, surveys and administrative lists; and results of the census must be attractively presented to the public using a variety of media, with accompanying analysis reports aimed at increasing public awareness of the importance of, and need for, regular, successful censuses.