The authors have given permission for these theses to be consulted as a regular part of the Arthur A. Wishart Library collection and also to reproduce all or parts of them, for scholarly research only, in compliance with the Canadian Copyright Act. These digital editions are released under the Creative Commons Attribution-Noncommercial-No Derivative Works 2.5 Canada License. You are free to share — to copy, distribute and transmit the works under the following conditions:
Attribution. You must attribute the works in the manner specified by the authors or licensors (but not in any way that suggests that they endorse you or your use of the works).
Noncommercial. You may not use these works for commercial purposes.
No Derivative Works. You may not alter, transform, or build upon these works. For any reuse or distribution, you must make clear to others the licence terms of these works. Any of the above conditions can be waived if you get permission from the copyright holders.
Minor conservation performed on series.
212 MB of electronic textual records. - 83 PDFs
0.4 m of textual records
Series comprises undergraduate computer science theses.
Cameron, Ian C.
In this modern age computers dot the landscape performing a plethora of functions, and with all of these machines comes the monumental task that is maintenance. Circuit boards short out, software becomes corrupted, and viruses invade, and dealing with all these problems is the lowly technical support representative. To do this job the computer industry needs more expert representatives than can possibly be found, let alone trained. This means that if a system could be created that would put the expertise of an experienced professional at the fingertips of a new technician right out of training, both performance and customer satisfaction could be improved immensely. In this thesis the goal is to address one key question: would a computer troubleshooting expert system represent the logical next step in troubleshooting aids? To answer this we will explore several topics. The first will be expert systems, the problems that they are best at solving, and how they are usually built. The second will be the Jess rule language, which was chosen to build a prototype application. The third topic will be the analysis of the problem faced by the industry, and how it will be met. Finally we will describe the design and implementation of a prototype computer troubleshooting expert system.
2.73 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2005. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Pagnotta, Anthony
Many of us interact with information systems on a daily basis without even realizing it. There are hundreds of information systems around us every day ensuring we get paid, providing us with internet access, and allowing us to purchase products on credit, to name a few. The way these systems are set up, if they are working properly you do not even notice they are there. However, if something goes wrong with the system, the repercussions can be large, even for the slightest glitch. For this reason it is imperative that the system's design is not only fault tolerant but allows errors to be handled without taking down the rest of the system. In order to ensure these features are present, they must first be incorporated in the initial design of the solution. The developers most likely to build this kind of system are those who have had to deal with such issues before and have learned to be proactive about them. One way people can obtain this experience without the real-world repercussions is through information technology training, such as is present in the Business Systems Management specialization. The purpose of this thesis is to study the ways that Information Technology and Information Systems are taught in North America, focusing specifically on smaller institutions, and ways these programs could be modified to provide experience comparable to that of working in the IT field.
9.09 MB of textual records (pdf)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2010. -- Submitted in partial fulfillment of the requirements for the degree of Bachelor of Computer Science Business Systems Management Specialization. -- Includes bibliography, appendices, and figures. -- Contents: Thesis.
Lajoie, Michael
In February 2005, Jesse James Garrett gave a methodology, slowly growing in popularity, a tremendous kick-start into high gear by providing it with a catchy name in the form of AJAX. Short for Asynchronous JavaScript and XML, the name encapsulated some of the key aspects of a methodology with the promise of creating web applications with the look, functionality, and near-power of traditional desktop applications. The Ajax methodology holds the promise of enhancing the user's overall experience and productivity while adding value to the organization. One of the most appealing aspects of Ajax is its ability to achieve its objectives using only the technologies included in every modern browser. It relies on JavaScript, DOM events, XHTML, CSS, and asynchronous communications using the HTTP protocol. All of these technologies are present in every modern browser and have been since 2000. This thesis will present a survey of the Ajax methodology, including its advantages and disadvantages, some of the issues associated with its use, and a survey of the support currently available for developing applications using the methodology.
3.68 MB of textual records (pdf)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Kraushaar, Michelle
This paper discusses an experiment in implementing a Modus Ponens and Contextual Modus Ponens reducer function for Partial Information Logic model generation. The reducer is a preprocessor to be used with the PIL implementation of a Beth tableau generator as developed by Rajnovich and Nait Abdallah.
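The core inference rule that such a reducer pre-applies can be sketched generically; the fact and rule representations below are illustrative only, not those of the Rajnovich and Nait Abdallah implementation.

```python
def modus_ponens_closure(facts, rules):
    """Forward-chain Modus Ponens: from P and P -> Q, derive Q.

    facts: set of atomic propositions (strings)
    rules: iterable of (premise, conclusion) pairs standing for P -> Q
    Returns the set of all facts derivable by repeated application.
    """
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived
```

Reducing a problem this way before tableau generation shrinks the search the generator must perform.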
991.22 KB of textual records (pdf)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 1997. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Rosychuk, Rhema
Availability of digital media has recently increased exponentially due to the internet. Media is transferred via the internet at rates which were previously unheard of. While this decreases the time and manpower required to complete a transaction, it also allows files to be illegally distributed at the same rate. This thesis covers the topic of encryption and its application to audio files. Encryption has become crucial in online transactions due to the increase both in sales of digital media and in an individual's ability to illegally obtain and distribute this media. This paper offers an alternative algorithm for performing audio file encryption that can also be applied to securely encrypt other media such as video, picture, and text files.
1.91 MB of textual records (pdf)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Johnson, Michael
Bluetooth is a newer technology which allows for short-range wireless communication. The range Bluetooth can handle depends on the devices being used, but it is generally between 10 and 100 meters. The technology allows multiple devices to communicate with each other. Currently, the most common use for Bluetooth is the wireless headset for cellular phones or Personal Digital Assistants. Many companies are trying to determine new and different ways to exploit this technology in their products. Those companies see the value that Bluetooth brings to the wireless communication industry. Many of these companies, rivals in certain respects, are actually helping each other develop new ways to use Bluetooth. They understand that they may be helping their competitors, but they also see the additional revenue they can generate from this technology.
5.42 MB of textual records (pdf)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2008. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Congdon, Spencer
Today's video game industry is plagued with many of the problems the software crisis brings to traditional software engineering. The increasing complexity of game development results in projects rarely being delivered on time, within budget, and bug free. These issues are complicated by the many disciplines involved in game design and the communication gaps that exist between them. Documentation is often used to help programmers achieve the software quality the software crisis demands. Game design does not employ any unique documentation types despite having additional complications compared to traditional software engineering. This thesis investigates interdisciplinary documentation as a solution to the communication problem and, consequently, to the video game software crisis. Specifically, the communication between artists and programmers during game level design is discussed. Based on an investigation of the existing documentation used by artists and programmers, a new documentation type called MyGameFlow is developed. The application of these new conventions to existing game levels provides a look at how this documentation can aid developers. The results show that interdisciplinary documentation can be a useful tool for improving communication and quality control in game design. By providing a set of conventions for describing a game level, MyGameFlow is able to help artists and programmers communicate.
8.31 MB of textual records (pdf)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2008. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Tacaberry, Jason
2.04 MB of textual records (pdf)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2000. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Sethi, Mridu
The importance of testing source code is universally acknowledged. However, excuses to skip testing are prevalent, common ones being "My source code doesn't need testing" and "I do not wish to waste time in testing; code development is what I need to work on". Nevertheless, testing plays an imperative role in the construction phase of a software application or project. The concept of unit testing evokes bipolar reactions: people are either totally for it or absolutely deny its usefulness and strengths. The best way to begin is to write some code, entrap the encountered bugs by testing, mend them, and repeat these iterations of writing, testing, and fixing until the desired output is achieved. It is important to note that testing reveals the presence of faults, and thus its goal is to find faults, the more the better. Unit testing is the first and foremost level of testing. It is used to test components in isolation, so as to uncover irregularities between behavior and interface specification. It is widely recommended that developers devote 25-50% of their time to testing. In this thesis, I have worked towards developing an automated test generator which generates appropriate tests, specific and general at the same time, that are helpful in catching bugs in a Java .class file. Testing yields good returns during development by adding new functionality and efficiency to the project at hand, and by discovering bugs which prevent a software program from giving the desired results.
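A minimal illustration of the write-test-fix cycle described above, with isolated tests per behaviour; the function under test is hypothetical, not drawn from the thesis's generator:

```python
import unittest


def divide(a, b):
    """A deliberately simple unit under test."""
    if b == 0:
        raise ValueError("division by zero")
    return a / b


class DivideTest(unittest.TestCase):
    # Each test exercises exactly one behaviour of the unit in isolation.
    def test_quotient(self):
        self.assertEqual(divide(10, 2), 5)

    def test_zero_divisor_is_reported_as_a_fault(self):
        with self.assertRaises(ValueError):
            divide(1, 0)
```

Running the suite (for example with `python -m unittest`) reports each faulty behaviour separately, which is what makes unit tests effective at localizing bugs.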
15.9 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Bhuiyan, Mohammad A J
The field of software visualization exists to facilitate both human understanding and effective use of computer software. This thesis examines eight software visualization tools to acquire information about the current state of software visualization systems. It also explores Bloom's taxonomy and finally proposes a new classification of those software visualization tools according to the six levels of Bloom's taxonomy. The new classification would help people to choose appropriate software for their specific needs.
4.99 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Morodomi, Masahiro
This paper introduces the use of a new weighting scheme for structured document classification. There are several ways to classify semantic datasets. The majority of existing text classifiers use machine learning techniques and represent text as a Bag of Words (BoW); that is, the features in document vectors represent weighted occurrence frequencies of individual words (Sahlgren, et al., 2004). Term Frequency-Inverse Document Frequency (tf-idf) (Salton, et al., 1987) is a common weighting method for BoW; however, tf-idf considers only the number of occurrences in a set of documents and ignores the important information of the semantic relationships within a document. To solve this problem, several methods have been proposed to improve text representation with external resources such as WordNet (Miller, 1995; Hotho, et al., 2003). However, these approaches have limitations: they need external files and cannot cover all synonyms and acronyms. To improve the efficiency and correctness of text categorization for well-structured documents such as Wikipedia articles, this paper proposes a new weighting method for structure-based documents, called tfs, to create a mining model using a data mining algorithm and techniques of knowledge discovery in databases to understand the relatedness of structured documents, and compares the effect of the weighting schemes. Finally, the results of text categorization with the tfs weighting scheme show how the performance of the mining model changed.
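The tf-idf baseline that the tfs scheme sets out to improve can be sketched as below; this is a plain BoW weighting, while the structure-aware tfs weighting itself is specific to the thesis and not shown.

```python
import math


def tf_idf(docs):
    """Weight each term of each tokenized document by tf * idf.

    docs: list of documents, each a list of tokens.
    Returns one {term: weight} dict per document; a term appearing in
    every document gets idf = log(n/n) = 0, reflecting that it carries
    no discriminating information.
    """
    n = len(docs)
    df = {}  # document frequency: in how many documents each term occurs
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    weights = []
    for doc in docs:
        tf = {}  # raw term counts within this document
        for term in doc:
            tf[term] = tf.get(term, 0) + 1
        weights.append(
            {t: (c / len(doc)) * math.log(n / df[t]) for t, c in tf.items()}
        )
    return weights
```

As the abstract notes, this weighting sees only occurrence counts; nothing in it distinguishes a term in a Wikipedia article's title from one buried in a footnote, which is the gap a structural scheme targets.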
924.84 KB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2010. -- Submitted in fulfillment of the requirements of Computer Science 4235. -- Includes figures, tables, abbreviations, appendices, and a bibliography. -- Contents: Thesis.
Thompson, David
This report details an investigation into the effect of several artificial neural network (ANN) architectures on the ability of these networks to predict highway traffic volumes (HTVs). The scope of this project is intended as a basis for further research, by determining favourable choices for an underlying network architecture which may be expanded to incorporate recurrent, temporally sensitive features. This will be determined on the basis of the errors by which the predicted traffic volumes deviate from the target values in the training set. The training phase of the models will exclude up to a year of data to allow for performance evaluation of the trained networks.
1.25 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 1996. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Hooper, Dan
The purpose of this thesis is to determine how to create realistic looking "maps" using OpenGL. A "map" in the context of computer graphics is a three dimensional environment. Thus a realistic looking map is a realistic looking three-dimensional environment. Once familiar with the concepts of OpenGL, I will attempt to create a realistic looking map of the student lounge at Algoma University College, called the T-Bird Lounge. I will also make the map interactive to the viewer. The viewer will be able to move through the lounge and examine anything they wish. Through the generation of this map I will gain a deeper understanding of what is involved in creating such graphically intensive software. I will be using Microsoft Visual C++ 6.0 to write the program. I am also using the latest version of OpenGL available at the time, version 1.2.
4.33 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2002. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Delgado Aquilar, Rodrigo
Distributed denial-of-service (DDoS) attacks represent a major security problem for every internet user. A defense system against a DDoS attack should be able to detect these attacks and respond quickly in order to stop the flooding of the victim network. A DoS condition can also be created by high demand from users of a popular website, which is why it is equally important to recognize legitimate traffic and keep providing the service to those users. In a DoS attack we can notice the one-to-one relationship between the attacker and the victim; it might therefore not be necessary to have any extra help, since the situation depends on only two parties. It is a win-lose situation. In the case of a DDoS attack the relationship clearly favours the attacker: an N-to-one relationship, where N is the number of attackers and one is the victim. In this scenario there is no win-lose situation for the victim, hence the need for a Distributed Defense System against DDoS attacks. Current solutions are affordable only for big companies or people who have thousands of dollars to spend monthly on such protection [Table 1]. The defense against DDoS attacks proposed in this document is open source based, because it is intended to be an affordable defense for every internet user who requires protection. Every internet user should have the right to be protected against both DoS and DDoS attacks, especially when the internet has become such an important medium of communication.
3.33 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2008. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Dale, Brian
This report provides a description of the design and implementation of a recurrent spatial knowledge base. The project aims to implement a flexible and efficient knowledge base from which large quantities of spatial-temporal knowledge can be retrieved. The three components of the knowledge base are the function base, the database, and the procedure base. The database model chosen was the NF2 (non-first normal form) model. The database was implemented by programming a subset of typical relational database functions to operate on a database stored as a tree using the DOS hierarchical directory structure. The function base is a library of routines designed to process a user's request for variables which require extra processing. The knowledge base is part of a system in which user queries are preprocessed and subsequently passed to the knowledge base management system, which retrieves the knowledge from either the database or the function base. The retrieved knowledge can then be passed to any of a set of procedures contained within the procedure base, where further semantic processing can be performed.
3.9 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 1994. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Loureiro, Joe C
The main objective of this thesis is to investigate the design and implementation of an experimental rule-based expert system to aid in an academic degree counselling program based on the 1995/96 Algoma University calendar. A rule-based expert system was chosen for this project for reasons discussed in this paper. The implementation is done in PROLOG, a language very well suited to this type of project, whose strengths are explored in this paper. An experimental prototype program is included which handles the Bachelor of Arts (General) Computer Science degree. This degree is used as an example, and other degrees could easily be added by encoding their requirements as rules. The assumption made here is that if the expert system can work for one degree, then, with the addition of a menu structure, any degree can be chosen. The prototype is not intended as a finished product, but as an experiment to prove my point.
3.59 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 1996. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Calcagnini, Carlo
Engineers need not be burdened by the task of learning new tools for scientific analysis. The objective of this project was to develop an interface that uses a natural language-like query, for retrieval and analysis of knowledge. Described are the design and implementation of the parser. The grammar is general and need not be modified for porting to other applications. The interface uses a dictionary to maintain generality. The interface is written in object-oriented C++. Once the query is parsed it is changed into a postfix expression. This expression is passed along to a knowledge base engine for knowledge retrieval. The knowledge returned by the knowledge base is then analyzed. Results of the query are shown. The use of the proposed Scientific Analysis Language (SAL) will free the engineer from having to learn standard programming languages for analysis of data. Also, because of its generality it can have widespread usage.
2.82 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 1994. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Carmo, Anthony J R
The Internet has become an effective tool within the work environment, but with this vital tool comes a downside: the loss of work productivity through employee distraction. Previous attempts at widespread bans and blocks of illegitimate sites have caused issues with Internet access and various Internet-related services, as well as with employee morale. Banning IP addresses and redirecting them to business websites informing users of improper use has proven ineffective at creating a working environment with an internal barricade against these abuses of company policy and work productivity. The use of firewalls and various other technologies for this purpose has likewise proven ineffective, as they can be bypassed by experienced users with information technology knowledge. Advances in modern computing technology attempt to correct these issues within programs such as CyberPatrol and Netnanny. The current solutions to combat the loss of productivity all follow one basic approach, with alterations from program to program, but the software never evolves into anything beyond the original method. This thesis discusses the effects of network Quality of Service on workplace productivity and how both can be maintained. It adopts the philosophy of its counterparts, that productivity must be maintained, but with an alternate scheme of preventing access to Internet resources. Effective and objective, the "productivity filter" proposes the use of a penalty without the knowledge of the user. In effect, this thesis extracts the philosophies of the current technologies and applies a different view to solving the problem of reducing non-work-related Internet activity.
1.95 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Adamo, Mario
This study compared estimations of annual average daily traffic (AADT) volumes using the conventional (factor) method, multiple regression analysis, and the neural network approach. All three approaches were compared using three different classification schemes as well as different durations of traffic counts. The neural network and multiple regression approaches consistently performed better than the conventional approach, and the neural network approach in many cases slightly outperformed the multiple regression approach. Apart from providing a good modeling tool for estimating AADT, the results also provide useful insight into the duration of short-term traffic counts and the classification schemes for the highway sites.
2.23 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 1994. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures, tables and Graphs. -- Contents: Thesis.
Khan, Arif
This thesis presents a comprehensive overview of the problem of facial recognition. A survey of available facial recognition algorithms is presented, as well as an implementation and tests of a computationally efficient, near real-time, well-established approach to face recognition. One of the oldest and most robust face recognition algorithms, Eigenface, is implemented, and enhancements to that algorithm (histogram equalization, circular tracing, principal component analysis, and averaging techniques) are added to obtain better results in identifying individuals from a face dataset. A proof-of-concept implementation, written in Visual Basic 6 on the Visual Studio 6 platform, is provided. Extensive testing of the project along with performance measures is presented in easy-to-understand graphical images. These test evaluations showed that the accuracy of the proposed eigenface implementation can be increased by better preprocessing and normalization of the input face space before the raw eigenface approach takes over. Tests also suggested that the core eigenface technique hits a plateau once it reaches the threshold of the optimal number of eigenvectors. Finally, this thesis presents a discussion of the test results and a section on the future directions this project may take.
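The core Eigenface step, principal component analysis over mean-centred face vectors, can be sketched as below; the thesis's own implementation is in Visual Basic 6, so this NumPy version is illustrative only.

```python
import numpy as np


def eigenfaces(images, k):
    """Compute the top-k eigenfaces from a stack of flattened face images.

    images: (n, d) array, one flattened grayscale face per row.
    Returns (mean_face, components) where components is (k, d), rows unit-norm.
    """
    mean = images.mean(axis=0)
    centered = images - mean
    # Standard trick for n << d: eigendecompose the small (n, n) Gram matrix
    # instead of the huge (d, d) covariance matrix.
    vals, vecs = np.linalg.eigh(centered @ centered.T)
    order = np.argsort(vals)[::-1][:k]          # largest eigenvalues first
    comps = (centered.T @ vecs[:, order]).T     # lift back to pixel space
    comps /= np.linalg.norm(comps, axis=1, keepdims=True)
    return mean, comps


def project(face, mean, comps):
    """Project a face into eigenface space; recognition compares these vectors."""
    return comps @ (face - mean)
```

Recognition then reduces to nearest-neighbour search among projected vectors, and the plateau the abstract mentions corresponds to adding eigenvectors whose eigenvalues contribute little variance.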
2.28 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Chaurasia, Achal R
I was always fascinated with games and every aspect of game development, including the hardware and software. I have worked with software before and therefore wanted to explore the world of gaming hardware. The objective of this thesis is thus to study and research the history of gaming devices; the various types and technologies of gaming devices; the various facets of the gaming industry and how it rose from its humble beginnings to form such a global market; and the future of gaming devices; and to study a particular technology for the purpose of developing a gaming device and to develop a working model as a demonstration.
1.45 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Pacione, Angelo D
In many recent publications on computer graphics, digital image processing has been divided into two major categories: image enhancement and image restoration. Digital image processing can be simply equated to 'manipulating' an acquired image (Sheldon, BYTE 1987, p.141). These two methods of digital image processing may seem very similar in practice; however, they differ in their primary goals.
6.77 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 1996. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Biocchi, Michael
Technology is solving many problems and conflicts today that we were unable to solve in the past. Tennis matches now turn to a three-dimensional (3D) recreated replay to determine where the ball landed, and baseballs are tracked in 3D to reveal the exact distance a player hit the ball. These examples are great for entertainment purposes, but how can we put this technology to better use, or possibly use it to determine someone's fate?
4.19 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes tables. -- Contents: Thesis.
Keppel-Jones, Trevor
The main purpose of this project is to explore the function and capability of 64 bit processors. Sixty-four bit processors have been on the market for roughly 5 years now, but it is not clear what their advantage is over the 32 bit processors. The 64 bit processor would be expected to provide benefits in the area of computation speed and mathematical accuracy. My project explores the advantages of the 64 bit processor over the 32 bit processor specifically in the area of floating point computation.
1.7 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2003. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Broad, MacLeod
Now more than ever, as computer systems have become essential for productivity in large enterprises, it is necessary to manage such systems in a way that keeps hardware and software up to date and operational interruptions at a minimum, while security, performance, and budget constraints are met in accordance with ITIL best practices [19]. Windows-based operating systems bundled with enterprise-level configuration management software, such as System Center Configuration Manager 2007, are one of the most common means of managing the configuration of a large-scale network, but unfortunately they are not a complete solution. The goal of this thesis is to analyze the strengths and weaknesses of configuration management practices in a large-scale Windows-based environment, and to provide recommendations and possible solutions for the deficiencies of current configuration management software. First, the strengths and weaknesses of configuration management software were researched by analyzing how well the available products align with configuration and fault management practices. Common technical failures were noted, and three proof-of-concept applications were developed to resolve these failures more efficiently. Time and cost savings were analyzed through time trials conducted by experienced IT professionals on a small but scalable network. The time trials compared manual and semi-manual methods of resolving regularly occurring technical failures against the automated methods available within the software developed as a product of this thesis. Solutions include remote file compression and remote checksum calculation using wrapper scripts, and a tool for troubleshooting other common Windows client errors.
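On each client, a remote checksum step of the kind mentioned above reduces to streaming a file through a hash; a minimal sketch follows, with the wrapper-script plumbing and SCCM integration being the thesis's own work and not shown here.

```python
import hashlib


def file_checksum(path, algorithm="sha256", chunk_size=65536):
    """Hash a file in fixed-size chunks so large files never load fully into memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Comparing the client's hex digest against one computed at the distribution point verifies package integrity without transferring the file itself.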
1.73 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2010. -- Submitted in partial fulfillment of the requirements for the degree of Bachelor of Computer Science. -- Includes figures and tables, appendices of code samples, and a bibliography. -- Contents: Thesis.
Acton, Gavan
Portable Document File. -- System requirements: Electronic device with World Wide Web browser and PDF reader. -- Online access via World Wide Web at http://archives.algomau.ca/.
In this paper we propose, implement and test a new approach to dynamic optimization inspired by microbiological swarms. Our approach makes use of the strengths of real bacteria, namely self-organization, adaptation and natural selection, to perform optimization. Finally, we test our swarm and show that it performs favourably against a state-of-the-art approach, achieving comparable optimization across environments.
3.83 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235.
Jones, William R
The purpose of this thesis is to analyze the potential effects of offloading different data-intensive, general-purpose tasks to the GPU (Graphics Processing Unit). The focus will not be on whether improvements are attainable by using the GPU for general-purpose computations, as this is already a well-established fact. Instead, the focus will be on examining the speed increases attainable by this method, and on determining the effects of parallelization, work efficiency, and bank-conflict avoidance on those improvements. Any other factors that can contribute to, or hinder, the speed at which these parallel computations can be performed on the GPU will also be discussed and analyzed.
6.3 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2008. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Tsang, Chun Yan
This paper presents the basic concepts in the area of text data mining. Text mining is a relatively new technique in computer science for extracting important information, and is based on another mechanism known as data mining. We will compare and contrast data mining and text mining, discuss the significance of text mining and why it is needed, and look at some existing applications that use text mining. We will also discuss some of the methods used to elicit useful knowledge from textual data in order to implement those applications. The goal of this paper is to use techniques from text mining to implement an email application program. The program will predict whether the sender is going to send an attachment in the email based on the content of the email body. When the program predicts the user wants to send a file, it will remind the sender to attach a file before sending the email. We also present test results and an evaluation of the email application, and discuss future directions for using text mining techniques.
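The attachment-reminder behaviour described in this abstract can be sketched in a few lines; the trigger-phrase list below is a hypothetical stand-in for the text-mining model the thesis actually builds, not its implementation:

```python
# Hypothetical trigger phrases; the thesis learns its cues from real email data.
TRIGGERS = ["attached", "attachment", "enclosed", "see the file", "i am sending"]

def mentions_attachment(body):
    """True if the email body suggests the sender intends to attach a file."""
    text = body.lower()
    return any(phrase in text for phrase in TRIGGERS)

def should_warn(body, has_attachment):
    """Warn when the text mentions an attachment but none is present."""
    return mentions_attachment(body) and not has_attachment
```

A text-mining approach would replace the fixed phrase list with features learned from a corpus of emails, but the reminder logic around the prediction stays the same.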
1.05 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2008. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Holmes, Brian
The purpose of this thesis is to investigate the application of neural networks to automate the selection and adjustment of music, providing an appropriate emotional element to the video game experience. The thesis contains a series of tests and analyses of data collected through various experiments, data structures, and coding practices. It presents the motivation and implementation of the various techniques used in an attempt to create a different approach to music in video games. A brief history of the topic and various code demonstrations are included, as well as past, present and future technologies surrounding music as media and sound devices in general. The thesis also outlines several different techniques, including a neural network, used to implement unique technology and music integrations, followed by rigorous analysis of said integrations.
9.17 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Saldanha, Gwyneth
This thesis examines the issues involved in customizing or modifying the hardware or the underlying operating system of thin client devices based on Microsoft Windows XP Embedded, and offers a solution for working through the required process.
900.55 KB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Gowans, Dan
Science fiction often presents technologies outside the scope of current reality. Mission Impossible III includes a scene where one man is able to steal another man’s voice using only a recording containing all of the phonemes in the language. Although this scene is fictional, elements of it can be recreated on a computer. This thesis explores the scope of implementing such a system. It attempts to recreate the functionality of the ears, brain, and voice with respect to communicating Canadian English. Phoneme isolation and identification will be examined. Feature selection will be considered to improve the recognition process. An assessment of the effectiveness of the project is included, along with possible applications.
1.48 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Carvalho, John U
The objective of this paper is to examine the Radon transform, its properties, and wavelet analysis applied to numerical analysis of the inverse Radon transform, with applications in computer science in mind.
1.33 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 1996. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Leed, Sarah Anne
This paper presents an innovative use of peripheral interface controllers in the design of a concrete project. The design features of these microcontrollers are investigated, followed by an in depth analysis of the language used to program these microcontrollers. Communications standards and protocols used in the design of the project are also analyzed for completeness. The design details of a programmable array of LEDs are investigated, and the feasibility of such a design is analyzed. A proof of concept model is designed as a prototype to demonstrate the feasibility and functionality of such a project.
10.84 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2007. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Templeton, David
An object-oriented programming (OOP) language, Smalltalk/V, was used for the implementation of this project. Project intentions: a teaching tool in education; an aid to the learning process; a means of visualizing abstract data in meaningful form; multiple points of view on the same data; an easy-to-use interface; a limited environment for unlimited manipulation; implementation on generally accessible hardware.
5.43 MB of textual records (PDF)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 1992. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Abigail, Shawn G
Portable Document File. -- System requirements: Electronic device with World Wide Web browser and PDF reader. -- Online access via World Wide Web at http://archives.algomau.ca/.
It was my original goal to "develop a [Microsoft] Windows program in C++ which will permit the molecular biologist to enter and analyze DNA, RNA and protein sequences through a graphical interface." I hoped to be able to create a program that would perform many of the basic functions that are important in analyzing a gene. My thesis meets this overall goal.
p. 50 ; 28 cm. -- pdf (textual record)
Audience: Undergraduate. -- Dissertation: Thesis (B. A.) -- Algoma University, 1994. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures, tables and diagrams. -- Contents: Thesis.
0.7 cm of textual records
1 cm of textual records
0.7 cm of textual records
0.8 cm of textual records
0.6 cm of textual records
0.7 cm of textual records
0.7 cm of textual records
1 CD-R
1 cm of textual records
0.6 cm of textual records
0.6 cm of textual records
0.8 cm of textual records
0.7 cm of textual records
0.9 cm of textual records
0.7 cm of textual records
1.3 cm of textual records
1.2 cm of textual records
1.1 cm of textual records
0.7 cm of textual records
0.9 cm of textual records
1.5 cm of textual records
0.8 cm of textual records
1.1 cm of textual records
0.6 cm of textual records
0.9 cm of textual records
0.8 cm of textual records
0.9 cm of textual records
Jonathan Krotkiewicz
0.9 cm of textual records
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2016. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
ABSTRACT -- Cryptographic hash functions, namely Message Digest 5 (MD5) and Secure Hash Algorithm 1 (SHA-1), were published over two decades ago and are still in frequent use as a password security measure. Since publication, weaknesses and vulnerabilities have been identified in each function. From an information security perspective, the algorithms on their own are considered broken and insecure, respectively. The presented literature seeks to illustrate the degree of vulnerability associated with these algorithms through extensive research, relevant statistical data, and firsthand experimentation. Well-known attack methods such as dictionary and rainbow table attacks are undertaken against a set of MD5 and SHA-1 hash values in a real environment to extract significant data regarding time and space complexity. The data are compared against approximated results for secure cryptographic hashing standards in practice today. Consequently, information to counteract such attack methods is discussed in detail, to proactively reduce the likelihood of a successful data breach in a real system.
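The dictionary attack this abstract describes can be illustrated with Python's standard hashlib; the target password and wordlist here are invented for the example and are not drawn from the thesis's experiments:

```python
import hashlib

def dictionary_attack(target_hex, wordlist, algorithm="md5"):
    """Hash each candidate and compare; return the recovered password or None."""
    for word in wordlist:
        if hashlib.new(algorithm, word.encode("utf-8")).hexdigest() == target_hex:
            return word
    return None

# An unsalted MD5 of a weak, invented password falls to a tiny wordlist.
target = hashlib.md5(b"letmein").hexdigest()
cracked = dictionary_attack(target, ["password", "123456", "letmein", "qwerty"])
```

The same loop works for SHA-1 by passing `algorithm="sha1"`; salting and slow, iterated hashes are the standard countermeasures, because they defeat precomputed rainbow tables and make each dictionary guess expensive.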
Jordan Kahtava
2.5 cm of textual records
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2016. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
ABSTRACT -- Mobile tablets have begun playing a larger role in mobile computing because of their portability. To gather an understanding of what mobile computers can currently accomplish, Microsoft and Apple tablets were examined. In general this topic is very broad and hard to research because of the number of mobile devices and tablets. Examining this document should provide detailed insight into mobile tablets and their hardware, operating systems, and programming environments. Any developer can use the information to develop and publish applications, and to set up the appropriate development environments for either Apple or Microsoft. The incremental waterfall methodology was used to develop two applications that utilize the accelerometer, gyroscope, and inclinometer/attitude sensors. In addition, extensive research was conducted and combined to outline how applications can be published and the rules associated with each application store. The Apple application used Xcode and Objective-C, while the Microsoft application used Visual Studio 2012, C#, and XAML. It was determined that developing for Apple is significantly easier because of the extensive documentation and examples available; in addition, Apple’s IDE, Xcode, can be used to develop, design, test, and publish applications without the need for other programs. It is hard to find easily understandable documentation from Microsoft regarding a particular operating system. Visual Studio 2012 or later must be used to develop Microsoft Store applications; it can be used to develop, design, and publish applications, but to test against a Surface tablet a developer must install certificates and side-load the application. All of the research gathered can be used by any developer wishing to target iOS or Microsoft Store applications. In conclusion, the information gathered can be used by a business or individual trying to determine the cost and complexity of developing for either Apple or Microsoft.
Daniel Kellar
1 cm of textual records
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2016. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
ABSTRACT -- Object recognition has become an enormous area of study in the field of computer vision allowing for the automated detection of known objects. The applications of this technology are vast, and this thesis looked to examine the application of a well-known approach to object detection as a method of automated identification of cloud formations: Haar classification. Using this technology, an application was developed which attempted to identify clouds within an image. The purpose of the research was to investigate the feasibility of Haar classification as a tool which could be used by weather watchers and forecasters for producing updated weather forecasts.
Daniel Imre
1.2 cm of textual records
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2016. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
ABSTRACT -- As video games continue to penetrate the mainstream, their target audiences expand and diversify. For this reason, it is unreasonable to expect a video game with static difficulty levels to cater to an audience with a variety of skills and emotional traits. Affective dynamic difficulty adjustment, one of the many areas of exploration within the nascent field of affective gaming, is a high-level design concept that is intended to leverage the player’s indicators of emotion (often physiological) to manipulate the difficulty of a video game in real-time. A review of two dozen studies reveals that skin conductance – the most widely used physiological response system in the history of psychophysiology – can be used to modify the difficulty of a video game, but it is most effective when paired with other psychological indicators of emotion, such as heart rate. Overall, a cross-disciplinary review of over 90 publications provides readers with a comprehensive view of the history, current works, future challenges, and design issues pertaining to psychophysiology, affective gaming, and other related fields. To illustrate one way in which skin conductance can be used to inform an affective dynamic difficulty adjustment algorithm, a top-down shooter titled Electroderma is created. A performant emotion-sensing algorithm, titled data subset analysis, is developed by the author as part of the game. Initial results for both the game and algorithm are promising, but usability testing must be conducted to formally validate the author’s work.
Tayler Pino
0.6 cm of textual records
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2016. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
ABSTRACT -- The performance of wireless sensor networks depends greatly on how long the network can run on its limited battery life. There are many ways of increasing the lifetime of a network, two major techniques being sleep/wake scheduling and topology control algorithms. These two approaches work well, but can be improved by taking into account the variable energy levels of each node in a network. Using this idea, existing sleep/wake scheduling algorithms (based on the minimum dominating set problem) are improved using three local search techniques: traditional local search, fixed depth, and variable depth. In addition, an existing local topology control algorithm is studied and improved upon.
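As a rough illustration of the dominating-set idea behind sleep/wake scheduling, a greedy construction followed by a depth-1 local search might look like the sketch below; the adjacency-set representation is an assumption for the example, not the thesis's code, and the thesis's fixed- and variable-depth variants go further than this single-removal pass:

```python
def greedy_dominating_set(adj):
    """adj maps each node to its set of neighbours.
    Greedily pick the node that dominates the most undominated nodes."""
    undominated = set(adj)
    dom = set()
    while undominated:
        best = max(adj, key=lambda v: len(({v} | adj[v]) & undominated))
        dom.add(best)
        undominated -= {best} | adj[best]
    return dom

def is_dominating(adj, dom):
    """Every node is either in dom or has a neighbour in dom."""
    return all(v in dom or adj[v] & dom for v in adj)

def local_search(adj, dom):
    """Depth-1 local search: drop any node whose removal keeps the set dominating."""
    dom = set(dom)
    for v in list(dom):
        if is_dominating(adj, dom - {v}):
            dom.remove(v)
    return dom
```

In the sleep/wake setting the nodes kept in the dominating set stay awake to maintain coverage while the rest sleep, so every dominator the local search removes extends network lifetime.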
Arnav Bhardwaj
1.5 cm of textual records
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2016. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
ABSTRACT -- In the field of automated software conversion, the main idea is to understand how to convert code from one form to another. This broad field of study includes converting code from one programming language to another as well as upgrading coding approaches within the same language. For this report, the prime focus was on creating a hybrid technique that can convert a Swing file into a JavaFX file. The study was conducted first to understand how such techniques are formed, and then to understand how scientific approaches are applied to formulate a reliable code converter. The research covered five techniques: general conversion methods, the ACTR technique, the SICORE system, automated conversion via design patterns, and lexical analysis. Each of these techniques differs from the others, but they demonstrate that each was produced in a similar manner. Following the analysis of the aforementioned techniques, a hybrid solution was proposed and detailed. This new approach derives from the two versions of ACTR and resembles the other technical approaches studied. A utility was then created using the iterative development method and used to demonstrate that this technique successfully converts a given Swing file into its JavaFX equivalent.
Brown, Robert.
421 KB of textual records. - 1 PDF.
0.5 cm of textual records. - 1 thesis.
Audience: Undergraduate. -- Dissertation: Thesis (BCS). -- Algoma University, 2017. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Abstract: This thesis provides insight into wireless sensor networks (WSNs) and the algorithms used in them to provide optimized coverage and power efficiency. These networks have specialized uses that demand the best in both of these parameters, and as such we have tested several algorithms in a hexagonal WSN simulation, specifically for accuracy and to note any differences from the traditional square-grid cellular automata.
Campioni, Fabio
842 KB of textual records. - 1 PDF.
0.5 cm of textual records. - 1 thesis.
Audience: Undergraduate. -- Dissertation: Thesis (BCS). -- Algoma University, 2017. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Abstract: The goal of this thesis is to find a localized algorithm to efficiently schedule battery-powered readers in RFID networks. RFID (Radio Frequency Identification) networks consist of readers and tags and are used to track physical items using radio communication. Tags are placed on objects to be tracked, and are read by readers placed throughout the environment. By scheduling readers between active and low-power states based on their remaining energy, readers last longer, improving overall network lifetime. The development of a software simulator allows immediate testing of algorithms. In addition, a new centralized algorithm was developed, based on the set cover problem, that produces disjoint sets of readers. After testing these algorithms, results are graphed and calculations are performed to compare their performance. The results show that the localized algorithm achieves performance very similar to a centralized algorithm, while being easier to implement and more flexible to changing networks. In this report, a brief introduction to the technical aspects, history, and applications of RFID is presented. Then the development of the algorithm is explained, as well as the process of implementing and testing these algorithms. Finally, the results of several experiments are presented and described, and the performance of several algorithms is compared and discussed.
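The set-cover-based construction of disjoint reader sets can be sketched as follows; the greedy heuristic and the data layout are illustrative assumptions rather than the thesis's implementation:

```python
def greedy_set_cover(tags, readers):
    """readers maps a reader id to the set of tags it can interrogate.
    Returns a greedy cover: a list of readers that together read every tag."""
    uncovered = set(tags)
    available = dict(readers)
    chosen = []
    while uncovered:
        best = max(available, key=lambda r: len(available[r] & uncovered))
        if not available[best] & uncovered:
            raise ValueError("remaining readers cannot cover all tags")
        chosen.append(best)
        uncovered -= available.pop(best)
    return chosen

def disjoint_rounds(tags, readers):
    """Repeatedly extract disjoint covers; each cover is one duty-cycle round,
    so readers in the other rounds can stay in a low-power state."""
    remaining = dict(readers)
    rounds = []
    while True:
        try:
            cover = greedy_set_cover(tags, remaining)
        except ValueError:  # also raised by max() when no readers remain
            return rounds
        rounds.append(cover)
        for r in cover:
            remaining.pop(r)
```

Because the rounds are disjoint, rotating through them spreads the energy drain evenly across readers, which is the lifetime benefit the abstract describes.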
Chen, Yang
917 KB of textual records. - 1 PDF.
1 cm of textual records. - 1 thesis.
Audience: Undergraduate. -- Dissertation: Thesis (BCS). -- Algoma University, 2016. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Abstract: The type and amount of data are growing at an amazing speed and making a big difference in human society. This is the age of big data. Data is no longer just an object to be processed; it has become a resource that helps human society develop. The application of big data has covered a wide range of areas, such as medicine, finance, and public service. However, there is little research relating big data to the coal industry, especially in western countries. Many people have not noticed the value of utilizing big data in the coal industry. In fact, big data can contribute a lot to the coal industry, and several countries have already started research on developing big data technology for it.
Lyon, Dylan.
1.58 MB of textual records. - 1 PDF.
1 cm of textual records. - 1 thesis.
Audience: Undergraduate. -- Dissertation: Thesis (BCS). -- Algoma University, 2017. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Abstract: The main problem this thesis attempts to solve is how difficult and potentially expensive making content for video games is. In this thesis, procedural level generation coupled with dynamic difficulty adjustment is tested as a way to generate game content efficiently, while also attempting to maintain a high level of fun and enjoyment for players. The research questions at the outset of this project included how procedural generation and dynamic difficulty systems work in gaming; the many answers found are detailed within the thesis.
Sigouin, William
2.06 MB of textual records. - 1 PDF and raw computer code.
0.5 cm of textual records. - 1 thesis.
Audience: Undergraduate. -- Dissertation: Thesis (BCS). -- Algoma University, 2017. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Abstract: Email has become a tool that many people use every day and depend on for communication. Phishers abuse people’s trust in the email system, exploiting them for financial gain by stealing personal information through phishing emails. These social engineering tools use various tactics to scare or manipulate people into giving up information that could lead to financial loss. Many people are unaware phishing scams exist until they fall victim to one. However, various aspects of phishing emails can indicate that a message is a phishing attempt. This research extracts several of these aspects and uses them to train and test an Artificial Neural Network, a construct that is excellent at solving classification problems such as this one.
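As a toy illustration of training a classifier on extracted phishing indicators, a single perceptron (far simpler than the Artificial Neural Network the thesis trains) over hypothetical binary features might look like this:

```python
# Hypothetical binary features extracted from an email (1 = indicator present):
# [suspicious_link, asks_for_credentials, urgent_language, generic_greeting]
DATA = [
    ([1, 1, 1, 1], 1), ([1, 1, 0, 1], 1), ([0, 1, 1, 0], 1),  # phishing
    ([0, 0, 0, 0], 0), ([1, 0, 0, 0], 0), ([0, 0, 1, 0], 0),  # legitimate
]

def train_perceptron(data, epochs=50, lr=0.1):
    """Train a single perceptron: nudge weights toward misclassified examples."""
    w, b = [0.0] * len(data[0][0]), 0.0
    for _ in range(epochs):
        for x, y in data:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    """1 = flagged as phishing, 0 = treated as legitimate."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

A multi-layer network with a non-linear activation, as used in the thesis, can learn feature interactions this single linear unit cannot, but the train-then-classify workflow is the same.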
Wilding, Tyler.
1.57 MB of textual records. - 1 PDF.
1 cm of textual records. - 1 thesis.
Audience: Undergraduate. -- Dissertation: Thesis (BCS). -- Algoma University, 2017. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures. -- Contents: Thesis.
Abstract: This thesis examines the use of machine learning techniques, namely support-vector machines and genetic algorithms, for detecting the following application-layer web threats: SQL injections, cross-site scripting, and remote file inclusion attacks. Detecting these attacks becomes more important as the Internet grows, and leveraging the strengths of machine learning is one of many potential avenues for improving detection. The examination entails using the techniques to detect the aforementioned threats in a collection of unseen web request data and drawing critical conclusions about their strengths, weaknesses, and viability. Through this process, several drawbacks of the genetic algorithm approach stood out, most notably its error-prone detection and highly variable performance; while the support-vector machine solved several of these issues and produced strong results, its complexity could be troublesome for real-world applications.
Templeton, David
2 cm of textual records.
Dale, Brian
1 cm of textual records.
Abigail, Shawn
1 cm of textual records.
Adamo, Mario
0.5 cm of textual records.
Calcagnini, Carlo
1 cm of textual records.
Carvalho, John
0.5 cm of textual records.
Pacione, Angelo
0.5 cm of textual records.
Thompson, David
0.5 cm of textual records.
Loureiro, Joe
1.5 cm of textual records.
Belsito, Mark
1 cm of textual records.
Preston, Jean
0.5 cm of textual records.
Karushaar, Michelle
0.5 cm of textual records.
Olar, Doug
1 cm of textual records.
Ecer, Jiri
0.5 cm of textual records.
Tackaberry, Jason
0.5 cm of textual records.
Hooper, Dan
1.3 cm of textual records.
Peltsch, Klaus
0.5 cm of textual records.
Cameron, Ian
0.5 cm of textual records.
Carmo, Anthony
0.5 cm of textual records.
Chaurasia, Achal
1.5 cm of textual records.
Gowans, Dan
1 cm of textual records.
Holmes, Brian
1 cm of textual records.
Acton, Gavan
1.3 cm of textual records.
Bhuiyan, Mohammad
1 cm of textual records.
Biocchi, Michael
0.5 cm of textual records.
Sethi, Mridu
1 cm of textual records.
Saldanha, Gwyneth
1 cm of textual records.
Rosychuk, Rhema
0.5 cm of textual records.
Khan, Arif
1.3 cm of textual records.
Lajoie, Michael
1 cm of textual records.
Leed, Sarah Anne
1 cm of textual records.
1 cm of textual records.
0.7 cm of textual records.
1.3 cm of textual records.
1.3 cm of textual records.
Dipeeka Luitel
14.8 MB of textual records. - 1 thesis.
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2020. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
Haoyang Zhang
2.41 MB of textual records. - 1 thesis.
Audience: Undergraduate. -- Dissertation: Thesis (B. A.). -- Algoma University, 2020. -- Submitted in partial fulfillment of course requirements for COSC 4235. -- Includes figures and tables. -- Contents: Thesis.
0.7 cm of textual records.