Massey Documents by Type

Permanent URI for this community: https://mro.massey.ac.nz/handle/10179/294

Search Results

Now showing 1 - 6 of 6
  • Item
    Template-driven teacher modelling approach : a thesis submitted in partial fulfilment of the requirements for the degree of Master of Science in Information Science at Massey University, Palmerston North
    (Massey University, 2004) Shi, Yanmin
    This thesis describes the Template-Driven Teacher Modeling (TDTM) approach, the initial implementation of the template server, and the formative evaluation of the prototype. The aim of template-driven teacher modeling is to integrate a template server and intelligent teacher models into Web-based education systems for course authoring. The proposed system has several key components: a user interface, a template server, and a content repository. The TDTM architecture supports course authoring by providing a higher degree of control over the generation of presentations. The templates accumulated in the template repository for a teacher, or a group of teachers, are selected as inputs to the inference mechanism in the teacher's model, which calculates the best representation of the teaching strategy and then predicts the teacher's intention as he or she interacts with the system. Moreover, the presentation templates are retained so that, with the help of the template server, on-line content can be re-used at the level of individual screens.
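The abstract above describes a template repository whose accumulated templates feed an inference mechanism in the teacher's model. The thesis text is not reproduced here, so the following is only a minimal Python sketch of that idea: the names (Template, TeacherModel, infer_strategy) and the simple frequency-count inference are illustrative assumptions, not the thesis's actual mechanism.

```python
from collections import Counter
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Template:
    """A reusable presentation template as it might be held by a template server."""
    name: str
    strategy: str                     # e.g. "example-first" or "theory-first"


@dataclass
class TeacherModel:
    """Accumulates the templates a teacher has used and infers a preferred strategy."""
    history: list = field(default_factory=list)

    def record(self, template: Template) -> None:
        self.history.append(template)

    def infer_strategy(self) -> Optional[str]:
        """Pick the most frequently used strategy as the teacher's likely intention."""
        if not self.history:
            return None
        counts = Counter(t.strategy for t in self.history)
        return counts.most_common(1)[0][0]


# Usage: record each template a teacher selects, then predict their intention.
model = TeacherModel()
model.record(Template("worked-example", "example-first"))
model.record(Template("quiz-screen", "example-first"))
model.record(Template("lecture-notes", "theory-first"))
print(model.infer_strategy())         # -> "example-first"
```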
  • Item
    Micro-threading and FPGA implementation of a RISC microprocessor : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Computer Science at Massey University, Palmerston North, New Zealand
    (Massey University, 2007) Al-Ali, Firas
    This thesis is the outcome of research in two areas of computer technology: microprocessor and multi-processor architectures (specifically, how differently they tolerate high-latency and non-deterministic events), and the hardware design of complex digital systems containing both datapath and control (particularly microprocessors). The thesis starts by pointing out that, in order to achieve high processing speeds, current popular superscalar microprocessors (e.g. Intel Pentium, Digital Alpha, etc.) rely heavily on speculating the outcome of instruction flow in order to predict the behaviour of non-deterministic operations (such as loading operands from high-latency memory into the processor). This is fine only if the speculation is correct. But what if it isn't? If the speculation fails, the processor has to abandon its current decision (which has now proved to be the wrong one) about the instruction flow path taken and start over again with the other, correct path. This wastes valuable processing time and hardware resources and reduces performance. Therefore, these processors can achieve high performance only when the majority of speculations are successful, that is, when the right path is predicted. In an attempt to overcome these shortcomings, the first part of this thesis investigates the novel vector micro-threading architecture as an alternative to current superscalar-based speculative microprocessor designs. Micro-threading is based on the not-so-novel multithreading technique, which avoids speculation altogether and instead starts running a different thread of instructions while waiting for the non-determinism to be resolved. This utilizes the chip's resources more efficiently without wasting processing power. The rest of the thesis focuses on the baseline RISC processor platform, the MIPS R2000, which is reviewed first, then partially synthesized from a Register Transfer Level (RTL) description in VHDL, and then simulated and tested. This provides a platform for future research to build upon by adding the micro-threading architectural extensions and modifications. Keywords: Micro-threading, Latency Tolerance, FPGA Synthesis, RISC Architecture, MIPS R2000 processor, VHDL.
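The latency-tolerance argument in the abstract above (switch to another ready thread instead of speculating when a long-latency event occurs) can be shown with a small scheduling sketch. This is a software analogy in Python, not the thesis's VHDL design; the worker/run names, the generator-based "threads", and the round-robin policy are assumptions made purely for illustration.

```python
from collections import deque

def worker(name, work_units, stall_at):
    """A 'thread' that yields a stall event when it hits a long-latency
    operation (e.g. a load from slow memory) and a done event when finished."""
    for i in range(work_units):
        if i == stall_at:
            yield ("stall", name)      # give up the processor instead of speculating
        yield ("step", name, i)
    yield ("done", name)

def run(threads):
    """Round-robin scheduler: on a stall, simply switch to the next ready
    thread rather than guessing how the stall will resolve."""
    ready = deque(threads)
    while ready:
        thread = ready.popleft()
        event = next(thread)
        print(event)
        if event[0] != "done":
            ready.append(thread)       # stalled or stepped: requeue and run another thread

run([worker("A", 3, stall_at=1), worker("B", 3, stall_at=2)])
```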
  • Item
    Integrated sensor and controller framework : a thesis presented in partial fulfilment of the requirements for the degree of Master of Engineering in Information and Telecommunications Engineering at Massey University, Palmerston North, New Zealand
    (Massey University, 2007) Weir, Ryan David
    This thesis presents a software platform that integrates sensors, controllers, actuators, and instrumentation within a common framework. This provides a flexible, reusable, reconfigurable, and scalable system for designers to use as a base for any sensing and control platform. The purpose of the framework is to decrease system development time and allow more time to be spent on designing the control algorithms rather than implementing the system. The architecture is generic and finds application in many areas such as home, office, and factory automation, process and environmental monitoring, surveillance, and robotics. The framework uses a data-driven design, which separates the data storage areas (dataslots) from the components of the framework that process the data (processors). Separating all the components of the framework in this way allows a flexible configuration. When a processor places data into a dataslot, the dataslot queues all the processors that use that data to run. A system based on this framework is configured by a text file in which all the components, and the interactions between them, are defined. The system can be thought of as multiple boxes, with the text file defining how these boxes are connected together. This allows rapid configuration of the system, as separate text files can be maintained for different configurations. A text file is used for the configuration instead of a graphical environment to simplify the development process and to reduce development time. One potential limitation of separating the computational components is increased overhead or latency. This is an important consideration in many control applications, so the framework is designed to minimise latency through prioritized queues and multitasking, which prevents one slow component from degrading the performance of the rest of the system. The operation of the framework is demonstrated through a range of applications. These show some of the key features, including: acquiring data, handling multiple dataslots that a processor reads from or writes to, controlling actuators, how the virtual instrumentation works, network communications, where controllers fit into the framework, data logging, image and video dataslots, timers, and dynamically linked libraries. A number of experiments show the framework under real conditions: the framework's data-passing mechanisms are demonstrated, a simple control and data-logging application is shown, and an image-processing application demonstrates the system under load. The latency of the framework is also determined. These illustrate how the framework would operate with different hardware and software applications. Further work can still be done on the framework, as extra features could be added to improve usability. Overall, this thesis presents a flexible system for integrating sensors, actuators, instrumentation, and controllers that can be utilised in a wide range of applications.
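For the dataslot/processor design described in the abstract above (processors write into dataslots, and a write queues every processor registered against that dataslot), the following Python sketch is a rough, illustrative rendering of the pattern under my own naming (DataSlot, Processor, a priority run queue). It is not the thesis's implementation, and a real deployment would add multitasking and the text-file configuration the abstract mentions.

```python
import itertools
import queue

_order = itertools.count()   # tie-breaker so equal-priority entries remain comparable

class DataSlot:
    """Shared storage; writing new data queues every processor registered against it."""
    def __init__(self, name, run_queue):
        self.name, self.value = name, None
        self.consumers, self._run_queue = [], run_queue

    def write(self, value, priority=1):
        self.value = value
        for proc in self.consumers:
            self._run_queue.put((priority, next(_order), proc))   # prioritized scheduling

class Processor:
    """Reads from its input dataslots, computes, and writes to an output dataslot."""
    def __init__(self, name, inputs, output, func):
        self.name, self.inputs, self.output, self.func = name, inputs, output, func
        for slot in inputs:
            slot.consumers.append(self)

    def run(self):
        result = self.func(*(slot.value for slot in self.inputs))
        if self.output is not None:
            self.output.write(result)

# Wire a tiny pipeline (sensor reading -> scaling -> logging), much as a
# configuration text file might declare the connections between "boxes".
run_queue = queue.PriorityQueue()
raw = DataSlot("raw_temperature", run_queue)
scaled = DataSlot("scaled_temperature", run_queue)
Processor("scale", [raw], scaled, lambda v: v * 0.1)
Processor("log", [scaled], None, lambda v: print(f"logged {v:.1f} C"))

raw.write(253)                       # a new sensor reading arrives
while not run_queue.empty():
    _, _, proc = run_queue.get()
    proc.run()                       # prints "logged 25.3 C"
```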
  • Item
    Design and development of a hybrid flexible manufacturing system : a thesis presented in fulfilment of the requirements for the degree of Master of Technology at Massey University
    (Massey University, 1999) Jolly, Matthew J
    The ability of a manufacturing environment to modify itself and to incorporate a wide variety of heterogeneous, multi-vendor devices is becoming increasingly important in the modern manufacturing enterprise. In the past, many companies have been forced to procure devices that are compatible with existing systems but are not as suitable as other, less compatible devices. The inability to integrate new devices into an existing company has made such enterprises dependent on one vendor and has decreased their ability to respond to changes in the market. It is said that typically 60% of orders received in a company are new orders; therefore, the ability of a company to reconfigure itself, respond to such demands, and integrate new equipment is of paramount importance. In the past, much effort has gone into the integration of shop floor devices in industry so that such devices can communicate with each other and certain tasks can be achieved within a single environment. Until recently, however, much of this was carried out in an improvised fashion with no real structure within the factory. This meant that once the factory was set up it became a hard-wired entity, and extensibility and modifiability were very difficult. When formalised Computer Integrated Manufacturing (CIM) system architectures were developed, it was found that although they solved many existing shortcomings, they had inherent problems of their own. What became apparent was that a fresh approach was required: one that combined the advantages of existing architectures into a new architecture that capitalised on those advantages while nullifying the weaknesses of the existing systems. This thesis outlines the design of a new FMS architecture and its implementation in a factory environment on a PC-based system.
  • Item
    Web-based asynchronous synchronous environment : a thesis presented in partial fulfilment of the requirements for the degree of Master of Information Science in Computer Science at Massey University
    (Massey University, 2002) Yang, Ang
    With the arrival of the information technology era of the 21st century, web-based learning has become a major trend in future models of teaching and learning. Web-based learning systems are created to simulate the real classroom teaching-learning environment using computer software and web-based tools, so that learners can study web-based teaching materials according to their individual needs and instructional schedules. Although web-based learning has many advantages over traditional face-to-face learning, the lack of explanation and interpretation of teaching materials by a human teacher in most existing web-based learning systems is a critical shortcoming. This project proposes an innovative solution to this problem by bringing the benefits of classroom learning into web-based education. In this project, a prototype Web-based Asynchronous Synchronous Environment (WASE) is developed that not only combines the benefits of tools such as WebCT and AudioGraph, but also integrates lectures given by a human teacher within the system. WASE provides simultaneous low-bandwidth streaming of lecture video and presentation, while providing students with presentation annotation facilities and peer discussion on particular issues related to the topic. The prototype system is built on a three-tier, client-server architecture. The client tier is a set of HTML frames, with an embedded RealPlayer, running in the student's web browser to present course contents and a navigation guide. The middle tier is an application server consisting of a Java Servlet and JSP engine and application programs that receive student requests and send the corresponding course contents and navigation information to the client side. The third tier is a relational database that stores the course structure and contents and records the interactions between students and teachers. This project provides a solution in which off-campus students can enjoy the explanation and interpretation of course materials by a human teacher, just as on-campus students do in the traditional face-to-face learning environment, while still reaping the benefits of web-based learning.
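The abstract above describes a three-tier architecture: HTML frames with RealPlayer on the client, a Java Servlet/JSP application server in the middle, and a relational database behind it. Purely as a rough illustration of the middle tier's job (answering a student's request with course content and navigation information), here is a sketch in Python rather than Java, with sqlite, the table schema, the course code, and the URLs all invented stand-ins rather than the thesis's actual design.

```python
import http.server
import json
import sqlite3

# Third tier stand-in: a relational store for course structure and contents.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE slides (course TEXT, seq INTEGER, title TEXT, media_url TEXT)")
db.execute("INSERT INTO slides VALUES ('demo-course', 1, 'Introduction', 'rtsp://media.example/intro.rm')")
db.commit()

class MiddleTier(http.server.BaseHTTPRequestHandler):
    """Middle tier: receives a student's request and returns the course
    contents plus navigation information for the client frames to render."""
    def do_GET(self):
        course = self.path.strip("/") or "demo-course"
        rows = db.execute(
            "SELECT seq, title, media_url FROM slides WHERE course = ? ORDER BY seq",
            (course,),
        ).fetchall()
        body = json.dumps({"course": course, "slides": rows}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve requests (the client tier would be the browser frames plus RealPlayer):
# http.server.HTTPServer(("", 8080), MiddleTier).serve_forever()
```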
  • Item
    An investigation of system integrations and XML applications within a NZ government agency : a thesis submitted in partial fulfillment of the requirements for the degree of Master of Information Systems at Massey University, New Zealand
    (Massey University, 2009) Li, Steven
    With the evolution of information technology, especially the Internet, system integration is becoming a common way to expand IT systems within and beyond an enterprise network. Although system integration is becoming more and more common within large organizations, the literature review found that IS research in this area has been insufficient, especially regarding the development of integration solutions within large organizations. This makes research such as this study, conducted within a large NZ government agency, necessary. Four system integration projects were selected and studied using a case study research methodology. The case study was designed and conducted using guidelines drawn mainly from R. K. Yin's (2002) well-known book Case Study Research. The research sought answers to a series of research questions related to the requirements of system integration and the challenges of solution development. Special attention was given to XML applications, as XML and system integration were found to be coupled in many integration solutions and frameworks during the literature review. Data were first gathered from each of the four projects, and the bulk of the analysis was then done on the summarized data. Various analysis methods, including chain of evidence, root cause analysis, and pattern matching, were adopted, and the principles of interpretive research proposed by Klein and Myers (1999) and triangulation were observed. In conclusion, a set of models was derived from the research: a model for clarifying integration requirements, a model for integration solution architecture, a model for the integration development life cycle, and a model of critical success factors for integration projects. A development framework for small-to-medium-sized integration projects has also been proposed based on these models. The research also found that XML applications would indeed play an important role in system integration, and that the critical success factors for XML applications include suitable development tools, development skills, and methodologies.
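The abstract above argues that XML applications play an important role in system integration between agency systems. As a minimal, illustrative Python sketch only (the message format, element names, and namespace are hypothetical, not taken from the thesis), an integration message exchanged between two systems might be parsed like this:

```python
import xml.etree.ElementTree as ET

# A hypothetical integration message such as one agency system might send another;
# the element names and namespace are illustrative, not from the thesis.
message = """
<caseUpdate xmlns="urn:example:agency">
  <caseId>2009-0142</caseId>
  <status>APPROVED</status>
  <updatedBy>system-A</updatedBy>
</caseUpdate>
"""

NS = {"a": "urn:example:agency"}

def handle(xml_text: str) -> dict:
    """Parse an incoming integration message into a plain record
    that a receiving system (or middleware layer) could act on."""
    root = ET.fromstring(xml_text)
    return {
        "case_id": root.findtext("a:caseId", namespaces=NS),
        "status": root.findtext("a:status", namespaces=NS),
        "updated_by": root.findtext("a:updatedBy", namespaces=NS),
    }

print(handle(message))   # {'case_id': '2009-0142', 'status': 'APPROVED', 'updated_by': 'system-A'}
```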