Massey Documents by Type

Permanent URI for this community: https://mro.massey.ac.nz/handle/10179/294

Search Results

Now showing 1 - 2 of 2
  • Item
    Grape yield analysis with 3D cameras and ultrasonic phased arrays : a thesis by publications presented in fulfillment of the requirements for the degree of Doctor of Philosophy in Engineering at Massey University, Albany, New Zealand
    (Massey University, 2024-01-18) Parr, Baden
    Accurate and timely estimation of vineyard yield is crucial for the profitability of vineyards. It enables better management of vineyard logistics, precise application of inputs, and optimization of grape quality at harvest for higher returns. However, the traditional manual process of yield estimation is prone to errors and subjectivity. Additionally, the financial burden of this manual process often leads to inadequate sampling, potentially resulting in sub-optimal insights for vineyard management. As such, there is a growing interest in automating yield estimation using computer vision techniques and novel applications of technologies such as ultrasound. Computer vision has seen significant use in viticulture. Current state-of-the-art 2D approaches, powered by advanced object detection models, can accurately identify grape bunches and individual grapes. However, these methods are limited by the physical constraints of the vineyard environment. Challenges such as occlusions caused by foliage, estimating the hidden parts of grape bunches, and determining berry sizes and distributions still lack clear solutions. Capturing 3D information about the spatial size and position of grape berries has been presented as the next step towards addressing these issues. By using 3D information, the size of individual grapes can be estimated, the surface curvature of berries can be used as identifying features, and the position of grape bunches with respect to occlusions can be used to compute alternative perspectives or estimate occlusion ratios. Researchers have demonstrated some of this value with 3D information captured through traditional means, such as photogrammetry and lab-based laser scanners. However, these face challenges in real-world environments due to processing time and cost. Efficiently capturing 3D information is a rapidly evolving field, with recent advancements in real-time 3D camera technologies being a significant driver. 
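One of the uses of 3D information mentioned above is estimating the size of individual grapes from captured points. As a minimal sketch (not the thesis's actual method), a berry's radius can be recovered from a segmented point cloud with a linear least-squares sphere fit; the function name and synthetic data below are illustrative assumptions:

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit to an (N, 3) array of 3D points.

    Solves the linearised system  |p|^2 = 2 c.p + (r^2 - |c|^2)
    for the centre c and radius r.
    """
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = sol[:3]
    radius = np.sqrt(sol[3] + centre @ centre)
    return centre, radius

# Synthetic test: points on a 6 mm radius berry centred at (10, 20, 30) mm
rng = np.random.default_rng(0)
d = rng.normal(size=(500, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
pts = np.array([10.0, 20.0, 30.0]) + 6.0 * d
c, r = fit_sphere(pts)
print(c, r)  # ≈ [10, 20, 30], 6.0
```

With real scans the points would first be segmented per berry (e.g. by the 2D detector projected into 3D), and the fit would be made robust to the partial, one-sided coverage a camera actually sees.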
This thesis presents a comprehensive analysis of the performance of available 3D camera technologies for grape yield estimation. Of the technologies tested, we determined that individual berries and concave details between neighbouring grapes were better represented by time-of-flight (ToF) based technologies. Furthermore, these worked well regardless of ambient lighting conditions, including direct sunlight. However, distortions of individual grapes were observed in both ToF and LiDAR 3D scans. This is due to subsurface scattering: the emitted light enters the grapes before returning, changing the propagation time and, by extension, the measured distance. We exploit these distortions as unique features and present a novel solution, working in synergy with state-of-the-art 2D object detection, to find grape bunches scanned in the field by a modern smartphone and reconstruct them in 3D. An R² value of 0.946 and an average precision of 0.970 were achieved when comparing our results to manual counts. Furthermore, our novel size estimation algorithm was able to accurately estimate berry sizes, as verified manually against matching colour images. This work represents a novel and objective yield estimation tool that can be used on modern smartphones equipped with 3D cameras. Occlusion of grape bunches by foliage remains a challenge for automating grape yield estimation using computer vision, as it is not always practical or possible to move or trim foliage prior to image capture. To this end, research has started investigating alternative techniques to see through foliage-based occlusions. This thesis introduces a novel ultrasonic-based approach that is able to volumetrically visualise grape bunches directly occluded by foliage. This is achieved through the use of a highly directional ultrasonic phased array and novel signal processing techniques to produce 3D convex hulls of foliage and grape bunches. 
We utilise a novel approach of agitating the foliage to enable spatial variance filtering to remove leaves and highlight specific volumes that may belong to grape bunches. This technique has wide-reaching potential, in viticulture and beyond.
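The variance-filtering idea above can be sketched in a few lines. The premise is that, across repeated volumetric scans taken while the foliage is agitated, leaf voxels fluctuate strongly while rigid grape bunches stay stable; thresholding per-voxel temporal variance then isolates candidate bunch volumes. The function, thresholds, and toy volume below are illustrative assumptions, not the thesis's implementation:

```python
import numpy as np

def stable_voxels(frames, var_threshold):
    """Keep voxels whose intensity is stable across repeated scans.

    frames: (T, X, Y, Z) stack of volumetric ultrasound intensities
    captured while the foliage is agitated. Leaves move between frames
    (high temporal variance); rigid grape bunches do not.
    Returns a boolean mask of candidate grape-bunch voxels.
    """
    variance = frames.var(axis=0)          # per-voxel variance over time
    occupied = frames.mean(axis=0) > 0.5   # ignore empty space (hypothetical threshold)
    return occupied & (variance < var_threshold)

# Toy volume: a static "bunch" voxel and a flickering "leaf" voxel
rng = np.random.default_rng(1)
frames = np.zeros((8, 4, 4, 4))
frames[:, 1, 1, 1] = 1.0            # static bunch voxel
frames[:, 2, 2, 2] = rng.random(8)  # agitated leaf voxel
mask = stable_voxels(frames, var_threshold=0.01)
print(mask[1, 1, 1], mask[2, 2, 2])  # True False (bunch kept, leaf rejected)
```

A real pipeline would additionally need registration between scans and a beamformed volumetric reconstruction as input, but the filtering step itself reduces to this per-voxel statistic.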
  • Item
    Printed sensors for indoor air quality : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Engineering, Massey University, Albany, New Zealand
    (Massey University, 2022) Rehmani, Muhammad Asif Ali
On average, a human inhales about 14,000 litres of air every day. The quality of inhaled air is highly important, as the presence of pathogens and contaminants in air can adversely affect human health. Generally, the probability of encountering pathogens and contaminants is high in indoor environments, where humans spend an estimated 90% of their lifetime. Continuous urbanization, a growing population, technological advancement, and automation have further increased the time spent indoors. The length of exposure and indoor activities such as cooking, smoking, ventilation, and the frequency of cleaning can further aggravate the health risk due to localized higher concentrations of contaminants. According to the Environmental Protection Agency (EPA), poor indoor air quality (IAQ) is considered one of the top environmental dangers to the public, as an increasing number of people suffer from asthma, allergies, heart disease, and even lung cancer. In New Zealand, poor air quality is estimated to cause 730 premature deaths and cost over one billion dollars in restricted activity days per year. These risks cannot be assessed unless there are means of continually monitoring indoor air pollutants, with the added requirement that such monitors can be fabricated using low-cost and energy-efficient methods. Furthermore, remedial actions cannot be undertaken if the quantitative values of the environmental pollutants are unknown. Existing air quality monitoring solutions are expensive and can only be deployed in limited numbers, leaving areas of houses, offices, and schools unmonitored. Therefore, a ubiquitous air quality monitoring system is needed: one that can be applied over large areas such as walls and roofs. Such a prevalent system would allow sensing of air quality parameters rapidly, continuously, and with low power consumption. 
To realize the bigger objective of achieving sensing and aware surfaces for indoor air quality, this research proposes printing sensors on large surfaces rather than making them in batches and packaging them in discrete units. Recent advancements in inkjet printing provide solutions that can enable the implementation of such sensors. However, the choice of printing method has a major impact on the efficacy of printed sensors. Therefore, we have explored printing techniques based on conventional screen printing and non-conventional electrohydrodynamic (EHD) inkjet printing. These methods offer low-cost, rapid-prototyping, high-throughput conductive printing of features compared to other inkjet printing methods, with the latter bringing the further advantages of improved resolution, scalability, customization, and little or no environmental waste. For screen printing, a laser ablation process has been used to implement several customized transduction schemes. The utility of this technique is demonstrated by humidity sensing: the designs of the transduction electrodes can easily be customized, and large-area printing can be realized on the substrate. The fabricated humidity sensor provides higher sensitivity through a biocompatible sensing layer with good response and recovery times. Next, EHD printing was explored for high-resolution conductive printing on flexible substrates. Current EHD printing focuses on improving print resolution by decreasing the printhead nozzle diameter, which limits the types of ink that can be used. The proposed EHD printhead design overcomes this major shortcoming, improving the resolution of printed features with a larger nozzle of 0.5 mm diameter. This resulted in printed feature resolutions of less than 10 µm in general, with the highest achieved resolution being 1.85 µm. An effective nozzle-diameter-to-printed-feature ratio of more than 250 was achieved. 
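The reported ratio can be checked directly from the figures given above (0.5 mm nozzle, 1.85 µm finest feature):

```python
# Nozzle-to-feature ratio from the figures reported above
nozzle_diameter_um = 500.0  # 0.5 mm nozzle
finest_feature_um = 1.85    # highest achieved resolution
ratio = nozzle_diameter_um / finest_feature_um
print(round(ratio))  # 270, consistent with the reported "more than 250"
```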
The use of a larger nozzle for fine-resolution printing opens the avenue for utilizing higher-concentration metallic nanoparticle inks in EHD printing. The hallmark of the presented EHD printhead design is its use of off-the-shelf components, avoiding expensive manufacturing processes, while highlighting the importance of the wetting area profile of the nozzle in facilitating fine-resolution printing, which until now has not been explored in detail. Furthermore, the work highlights the issue of crack development in conductive tracks during EHD printing when using available piezoelectric inkjet ink; the ink was later modified to minimise cracks in EHD-printed features. Finally, a comprehensive study of 3D-printed microfluidic channels was conducted. The study highlights the variation of pressure developed in different microfluidic channel designs and the susceptibility of microfluidic devices to leakage. The work presents the possibility of combining 3D-printed microfluidics with printed sensors for deployment as lab-on-a-chip devices in various applications, such as passing a stream of air through sensors integrated in a microfluidic device to analyse volatile organic compounds, humidity, toxic gases, and other analytes of interest. Overall, the presented work demonstrates the feasibility of using conventional and non-conventional printing methods, through simple implementations, for the fabrication of IAQ sensors with a high degree of customization, low processing cost, and scalability.
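The pressure developed in a microfluidic channel, as studied above, is commonly estimated with the Hagen-Poiseuille relation for laminar flow, ΔP = 8 µ L Q / (π r⁴). The sketch below uses hypothetical channel dimensions and flow rate for an air stream, not values from the thesis:

```python
import math

def pressure_drop_pa(flow_m3s, radius_m, length_m, viscosity_pa_s):
    """Hagen-Poiseuille pressure drop for laminar flow in a circular channel."""
    return 8 * viscosity_pa_s * length_m * flow_m3s / (math.pi * radius_m ** 4)

# Hypothetical example: 1 mL/min of air (viscosity ~1.8e-5 Pa*s)
# through a 0.25 mm radius, 20 mm long 3D-printed channel
q = 1e-6 / 60  # 1 mL/min in m^3/s
dp = pressure_drop_pa(q, radius_m=0.25e-3, length_m=20e-3, viscosity_pa_s=1.8e-5)
print(f"{dp:.1f} Pa")  # → 3.9 Pa
```

The r⁴ term is why small variations in printed channel geometry produce large pressure differences between designs, which in turn drives the leakage susceptibility noted above.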