A multispectral and machine learning approach to early stress classification in plants
- Authors: Poole, Louise Carmen
- Date: 2022-04
- Subjects: Uncatalogued
- Language: English
- Type: Master's thesis , text
- Identifier: http://hdl.handle.net/10962/232410 , vital:49989
- Description: Crop loss and failure can impact both a country’s economy and food security, often with devastating effects. Successfully detecting plant stresses early in their development is therefore essential to minimise spread and damage to crop production. Identification of the stress and the stress-causing agent is the most critical and challenging step in plant and crop protection. With the development of, and increasing ease of access to, new equipment and technology in recent years, the use of spectroscopy in the early detection of plant diseases has become notably popular. This thesis narrows down the most suitable multispectral imaging techniques and machine learning algorithms for early stress detection. Datasets of visible images and multispectral images were collected. Dehydration was selected as the plant stress type for the main experiments, and data was collected from six plant species typically used in agriculture. Key contributions of this thesis include multispectral and visible datasets showing plant dehydration, as well as a separate preliminary dataset on plant disease. Promising results on dehydration showed statistically significant accuracy improvements for multispectral imaging compared to visible imaging for early stress detection, with multispectral input obtaining 92.50% accuracy against visible input’s 77.50% on general plant species (see the sketch following this record). The system was effective at stress detection on known plant species, with multispectral imaging introducing greater improvement to early stress detection than to advanced stress detection. Furthermore, strong species discrimination was achieved when exclusively testing either early or advanced dehydration against healthy species. , Thesis (MSc) -- Faculty of Science, Ichthyology & Fisheries Sciences, 2022
- Full Text:
- Date Issued: 2022-04
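The accuracy comparison reported above lends itself to a small worked example. The following is a minimal sketch, assuming synthetic features, illustrative band counts and an SVM classifier (none of which are taken from the thesis), of how visible and multispectral inputs to the same classifier can be compared:

```python
# Illustrative only: compare the same classifier trained on visible (3-band)
# versus multispectral (extra bands) features; data here is synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
y = rng.integers(0, 2, n)                       # 0 = healthy, 1 = dehydrated
visible = rng.random((n, 3))                    # mean R, G, B per leaf region
extra = rng.random((n, 2)) + 0.3 * y[:, None]   # e.g. NIR/red-edge bands carrying signal
multispectral = np.hstack([visible, extra])

for name, X in (("visible", visible), ("multispectral", multispectral)):
    acc = cross_val_score(SVC(), X, y, cv=5).mean()
    print(f"{name}: {acc:.2%} mean accuracy")
```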
A decision-making model to guide securing blockchain deployments
- Authors: Cronje, Gerhard Roets
- Date: 2021-10-29
- Subjects: Blockchains (Databases) , Bitcoin , Cryptocurrencies , Distributed databases , Computer networks -- Security measures , Computer networks -- Security measures -- Decision making , Ethereum
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/188865 , vital:44793
- Description: Satoshi Nakamoto, the pseudonymous identity credited with the paper that sparked the implementation of Bitcoin, is famously quoted as remarking, electronically of course, that “If you don’t believe it or don’t get it, I don’t have time to try and convince you, sorry” (Tsapis, 2019, p. 1). What is noticeable, 12 years after the famed Satoshi paper that initiated Bitcoin (Nakamoto, 2008), is that blockchain at the very least has staying power and potentially wide application. A lesser-known figure, Marc Kenisberg, founder of Bitcoin Chaser, one of the many companies formed around the Bitcoin ecosystem, summarised it well, saying “…Blockchain is the tech - Bitcoin is merely the first mainstream manifestation of its potential” (Tsapis, 2019, p. 1). With blockchain still trying to reach its potential and still maturing on its way towards becoming a mainstream technology, the main question that arises for security professionals is: how do we deploy it securely? This research seeks to address that question by proposing a decision-making model that a security professional can use to guide them through ensuring appropriate security for blockchain deployments. This research is certainly not the first attempt at discussing the security of the blockchain and will not be the last, as the technology around blockchain and distributed ledger technology is still rapidly evolving. What this research tries to achieve is not to delve into extremely specific areas of blockchain security, or to get bogged down in technical details, but to provide a reference framework that aims to cover all the major areas to be considered. The approach followed was to review the literature regarding blockchain and to identify the main security areas to be addressed. The research then proposes a decision-making model and tests the model against a fictitious but relevant real-world example. It concludes with learnings from this research. The reader can be the judge, but the model aims to be a practical, valuable resource that any security professional can use to navigate the security aspects logically and understandably when involved in a blockchain deployment. In contrast to the Satoshi quote, this research tries to convince the reader and to assist them in understanding the security choices related to every blockchain deployment. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10-29
Improved concurrent Java processes
- Authors: Ntlahla, Mbalentle Apelele Wiseman
- Date: 2021-10-29
- Subjects: Java (Computer program language) , Computer multitasking , Sequential processing (Computer science) , Parallel programming (Computer science) , Simultaneous multithreading processors
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/192129 , vital:45198
- Description: The rise in the number of cores in a processor has resulted in computer programmers needing to write concurrent programs to utilise the extra available processors. Concurrent programming can utilise the extra processors available in a multi-core architecture. However, writing concurrent programs introduces complexities that are not encountered in sequential programming: race conditions, deadlocks, starvation and liveness issues are some of the complexities that come with concurrent programming. These complexities require programming languages to provide functionality to help programmers write concurrent programs. The Java language is designed to support concurrent programming, mostly through threads, with the support provided through the Java programming language itself and Java class libraries. Although concurrent processes are important and have their own advantages over concurrent threads, Java has limited support for concurrent processes. In this thesis we attempt to provide for processes the same support that Java provides for threads through the java.util.concurrent library. This is done through a Java library (za.co.jcp). The library provides synchronisation methods for multiple processes, shared variables between Java processes, atomic variables, process-safe data structures, and a process executor framework similar to the executor framework provided by Java for threads (a sketch of the underlying idea follows this record). The similarities and performance of the two libraries are analysed, comparing code portability, ease of use, and the performance difference between the two. The results from the project have shown that it is possible for Java to provide support for concurrency through processes and not only threads. In addition, the benchmarks performed show that the performance of the za.co.jcp library is not significantly slower than that of the current java.util.concurrent thread library. This means that Java concurrent applications will now also be able to use cooperating processes rather than being confined to threads. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10-29
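The thesis's library is Java; as a language-neutral sketch of the idea it pursues (synchronised shared state between cooperating processes rather than threads), Python's multiprocessing primitives illustrate the same pattern. The za.co.jcp API itself is not reproduced here.

```python
# Sketch: a shared counter updated safely by four cooperating processes,
# analogous in spirit to the process shared variables and synchronisation
# that za.co.jcp aims to provide for Java (the names below are Python's own).
from multiprocessing import Process, Value, Lock

def deposit(balance, lock, times):
    for _ in range(times):
        with lock:                  # process-safe critical section
            balance.value += 1

if __name__ == "__main__":
    balance = Value("i", 0)         # an int shared across processes
    lock = Lock()
    workers = [Process(target=deposit, args=(balance, lock, 10_000))
               for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(balance.value)            # 40000: no updates lost
```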
Peer-to-peer energy trading system using IoT and a low-computation blockchain network
- Authors: Ncube, Tyron
- Date: 2021-10-29
- Subjects: Blockchains (Databases) , Internet of things , Renewable energy sources , Smart power grids , Peer-to-peer architecture (Computer networks) , Energy trading system
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/192119 , vital:45197
- Description: The use of renewable energy is increasing every year as it is seen as a viable and sustainable long-term alternative to fossil-based sources of power. Emerging technologies are being merged with existing renewable energy systems to address some of the challenges associated with renewable energy, such as reliability and limited storage facilities for the generated energy. The Internet of Things (IoT) has made it possible for consumers to make money by selling excess energy back to the utility company through smart grids that allow bi-directional communication between the consumer and the utility company. The major drawback of this is that the utility company still plays a central role in this setup, as it is the only buyer of the excess energy generated from renewable sources. This research uses blockchain technology, leveraging its decentralised architecture, to enable other individuals to purchase this excess energy. Blockchain technology is first explained in detail, and its main features, such as consensus mechanisms, are examined. This evaluation of blockchain technology gives rise to some design questions that are taken into consideration to create a low-energy, low-computation Ethereum-based blockchain network that is the foundation for a peer-to-peer energy trading system. The peer-to-peer energy trading system makes use of smart meters to collect data about energy usage and gives users a web-based interface where they can transact with each other. A smart contract is also designed to facilitate payments for transactions (see the sketch following this record). Lastly, the system is tested by carrying out transactions and transferring energy from one node in the system to another. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10-29
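As a sketch of how a payment against such a smart contract might be submitted, the snippet below uses web3.py against a local Ethereum node. The contract address, ABI, buyEnergy function and pricing are all hypothetical; the thesis's actual contract interface is not reproduced here.

```python
# Hypothetical buyer-side transaction against the trading smart contract.
from web3 import Web3

CONTRACT_ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder
CONTRACT_ABI = [...]  # placeholder: the real ABI comes from compiling the contract

w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))  # the private Ethereum node
market = w3.eth.contract(address=CONTRACT_ADDRESS, abi=CONTRACT_ABI)

tx_hash = market.functions.buyEnergy(5).transact({     # buy 5 kWh from a peer
    "from": w3.eth.accounts[0],
    "value": w3.to_wei(0.01, "ether"),                 # agreed price in ether
})
receipt = w3.eth.wait_for_transaction_receipt(tx_hash)
print("purchase mined in block", receipt.blockNumber)
```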
Remote fidelity of Container-Based Network Emulators
- Authors: Peach, Schalk Willem
- Date: 2021-10-29
- Subjects: Uncatalogued
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/192141 , vital:45199
- Description: This thesis examines whether Container-Based Network Emulators (CBNEs) are able to instantiate emulated nodes that provide sufficient realism to be used in information security experiments. The realism measure used is based on the information available from the point of view of a remote attacker. During the evaluation of a Container-Based Network Emulator (CBNE) as a platform to replicate production networks for information security experiments, it was observed that nmap fingerprinting returned Operating System (OS) family and version results inconsistent with those of the host OS. CBNEs utilise Linux namespaces, the technology used for containerisation, to instantiate “emulated” hosts for experimental networks. Linux containers partition resources of the host OS to create lightweight virtual machines that share a single OS kernel. As all emulated hosts share the same kernel in a CBNE network, there is a reasonable expectation that the fingerprints of the host OS and emulated hosts should be the same. Based on how CBNEs instantiate emulated networks, and given that fingerprinting returned inconsistent results, it was hypothesised that the technologies used to construct CBNEs are capable of influencing fingerprints generated by utilities such as nmap. It was predicted that hosts emulated using different CBNEs would show deviations in remotely generated fingerprints when compared to fingerprints generated for the host OS. An experimental network consisting of two emulated hosts and a Layer 2 switch was instantiated on multiple CBNEs using the same host OS. Active and passive fingerprinting was conducted between the emulated hosts to generate fingerprints and OS family and version matches (a sketch of the active step follows this record). Passive fingerprinting failed to produce OS family and version matches, as the fingerprint databases for these utilities are no longer maintained. For active fingerprinting the OS family results were consistent between tested systems and the host OS, though the OS version results reported were inconsistent. A comparison of the generated fingerprints revealed that for certain CBNEs, fingerprint features related to network stack optimisations of the host OS deviated from other CBNEs and the host OS. The hypothesis that CBNEs can influence remotely generated fingerprints was partially confirmed: one CBNE system modified Linux kernel networking options, causing a deviation from fingerprints generated for other tested systems and the host OS. The hypothesis was also partially rejected, as the technologies used by CBNEs do not otherwise influence the remote fidelity of emulated hosts. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10-29
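A minimal sketch of the active fingerprinting step, assuming nmap is installed and the two emulated hosts sit at hypothetical addresses (nmap's -O scan requires root privileges):

```python
# Run nmap OS detection against each emulated host and pull out the OS
# family/version lines of the kind compared across CBNEs and the host OS.
import subprocess

EMULATED_HOSTS = ("10.0.0.1", "10.0.0.2")   # hypothetical experiment addresses

for host in EMULATED_HOSTS:
    out = subprocess.run(
        ["nmap", "-O", "--osscan-guess", host],
        capture_output=True, text=True, check=False,
    ).stdout
    os_lines = [line for line in out.splitlines()
                if line.startswith(("Running", "OS details"))]
    print(host, os_lines or ["no OS match"])
```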
Building the field component of a smart irrigation system: A detailed experience of a computer science graduate
- Authors: Pipile, Yamnkelani Yonela
- Date: 2021-10
- Subjects: Uncatalogued
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/191814 , vital:45167
- Description: South Africa is a semi-arid area with an average annual rainfall of approximately 450mm, 60 per cent of which goes towards irrigation. Current irrigation systems generally apply water in a uniform manner across a field, which is inefficient and can even kill the plants. The Internet of Things (IoT), an emerging technology involving the utilisation of sensors and actuators to build complex feedback systems, presents an opportunity to build a smart irrigation solution. This research project illustrates the development of the field components of a water monitoring system using off-the-shelf, inexpensive components, exploring at the same time how easy or difficult it would be for a general Computer Science graduate to use hardware components and associated tools within the IoT area. The problem was initially broken down through a classical top-down process, in order to identify the components, such as micro-computers, micro-controllers, sensors and network connections, that would be needed to build the solution. I then selected the Raspberry Pi 3, the Arduino Uno, the MH-Sensor-Series hygrometer, the MQTT messaging protocol, and the ZigBee communication protocol as implemented in the XBee S2C (a sketch of the resulting publish loop follows this record). Once the components were identified, the work followed a bottom-up approach: I studied the components in isolation and relative to each other, through a structured series of experiments, with each experiment addressing a specific component and examining how easy it was to use. Each experiment allowed me to acquire and deepen my understanding of a component and progressively build a more sophisticated prototype, working towards the complete solution. I found the vast majority of the identified components and tools to be easy to use, well documented, and most importantly, mature for consumption by our target user, until I encountered the MQTT-SN (MQTT for Sensor Networks) implementation, which was not as mature as the rest. This resulted in my designing and implementing a lightweight, general ZigBee/MQTT gateway, named “yoGa” (Yonela’s Gateway) after the author. At the end of the research, I was able to build the field components of a smart irrigation system using the selected tools, including the yoGa gateway, proving practically that a Computer Science graduate from a South African university can become productive in the emerging IoT area. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10
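A minimal sketch of the field node's behaviour, assuming a paho-mqtt client, a broker on the Raspberry Pi, and a hypothetical topic name; on the real node the moisture value arrives from the hygrometer via the Arduino/XBee link rather than the stub below:

```python
# Publish a soil moisture reading to the broker once a minute.
import time

import paho.mqtt.client as mqtt

def read_soil_moisture() -> int:
    # Stub: the real value comes from the MH-Sensor-Series hygrometer,
    # read by the Arduino Uno and relayed over ZigBee (XBee S2C).
    return 512

client = mqtt.Client()
client.connect("raspberrypi.local", 1883)    # broker host is an assumption
client.loop_start()                          # handle network traffic in background

while True:
    client.publish("field/plot1/soil", read_soil_moisture())
    time.sleep(60)
```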
An exploratory investigation into an Integrated Vulnerability and Patch Management Framework
- Authors: Carstens, Duane
- Date: 2021-04
- Subjects: Computer security , Computer security -- Management , Computer networks -- Security measures , Patch Management , Integrated Vulnerability
- Language: English
- Type: thesis , text , Masters , MSc
- Identifier: http://hdl.handle.net/10962/177940 , vital:42892
- Description: In the rapidly changing world of cybersecurity, the constant increase of vulnerabilities continues to be a prevalent issue for many organisations. Malicious actors are aware that most organisations cannot timeously patch known vulnerabilities and are ill-prepared to protect against newly created vulnerabilities for which a signature or an available patch has not yet been created. Consequently, information security personnel face ongoing challenges to mitigate these risks. In this research, the problem of remediation in a world of increasing vulnerabilities is considered. The current paradigm of vulnerability and patch management is reviewed using a pragmatic approach to all variables associated with these services/practices and, as a result, what is and is not working in terms of remediation is understood. In addition to the analysis, a taxonomy is created to provide a graphical representation of all variables associated with vulnerability and patch management, based on existing literature. Frameworks currently utilised in the industry to create an effective engagement model between vulnerability and patch management services are considered. The link between quantifying a threat, vulnerability and consequence; what Microsoft has available for patching; and the action plan for resulting vulnerabilities is explored. Furthermore, the processes and means of communication between each of these services are investigated to ensure there is effective remediation of vulnerabilities, ultimately improving the security risk posture of an organisation. In order to measure the security risk posture effectively, progress is measured between each of these services through a single averaged measurement metric (see the sketch following this record). The outcome of the research highlights influencing factors that impact successful vulnerability management, in line with themes identified from the research taxonomy. These influencing factors are however significantly undermined by resources within the same organisations not having a clear and consistent understanding of their role, organisational capabilities and objectives for effective vulnerability and patch management. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-04
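A toy sketch of the single averaged measurement metric mentioned above; the service names and the 0-to-1 scale are assumptions for illustration, not the thesis's actual scoring:

```python
# Average one progress score per service into a single posture figure.
services = {
    "threat_quantification": 0.8,
    "vulnerability_management": 0.6,
    "patch_management": 0.7,
}
security_risk_posture = sum(services.values()) / len(services)
print(f"security risk posture: {security_risk_posture:.2f}")   # 0.70
```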
An investigation into the current state of web based cryptominers and cryptojacking
- Authors: Len, Robert
- Date: 2021-04
- Subjects: Cryptocurrencies , Malware (Computer software) , Computer networks -- Security measures , Computer networks -- Monitoring , Cryptomining , Coinhive , Cryptojacking
- Language: English
- Type: thesis , text , Masters , MSc
- Identifier: http://hdl.handle.net/10962/178248 , vital:42924
- Description: The aim of this research was to conduct a review of the current state and extent of surreptitious cryptomining software and its prevalence as a means of income generation. Income is generated through the use of a viewer’s browser to execute custom JavaScript code to mine cryptocurrencies such as Monero and Bitcoin. The research aimed to measure the prevalence of illicit mining scripts being utilised for “in-browser” cryptojacking, while further analysing the ecosystems that support the cryptomining environment. The extent of the research covers aspects such as the content (or type) of the sites hosting malicious “in-browser” cryptomining software, as well as the occurrences of currencies utilised in the cryptographic mining and the analysis of cryptographic mining code samples. This research compares the results of previous work with the current state of affairs since the closure of Coinhive in March 2019. Coinhive was at the time the market leader in such web-based mining services. Beyond the analysis of the prevalence of cryptomining on the web today, research into the methodologies and techniques used to detect and counteract cryptomining is also conducted (a signature-based sketch follows this record). This includes the most recent developments in malicious JavaScript de-obfuscation as well as cryptomining signature creation and detection. Methodologies for heuristic JavaScript behaviour identification and subsequent identification of potential malicious outliers are also included within the countermeasure analysis. The research revealed that although no longer functional, Coinhive remained the most prevalent script being used for “in-browser” cryptomining services. While remaining the most prevalent, there was however a significant decline in overall occurrences compared to when coinhive.com was operational. Analysis of the ecosystem hosting “in-browser” mining websites found it to be distributed both geographically and in terms of domain categorisations. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-04
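As a sketch of the signature-based side of such detection, the snippet below fetches a page and checks it against a small, illustrative list of miner-script markers; real signature sets are far larger and include obfuscated variants:

```python
# Flag a page if its HTML contains any known in-browser miner marker.
import urllib.request

MINER_SIGNATURES = (b"coinhive.min.js", b"CoinHive.Anonymous", b"cryptonight.wasm")

def has_miner(url: str) -> bool:
    html = urllib.request.urlopen(url, timeout=10).read()
    return any(sig in html for sig in MINER_SIGNATURES)

print(has_miner("http://example.com"))   # hypothetical target site
```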
Evaluating the cyber security skills gap relating to penetration testing
- Authors: Beukes, Dirk Johannes
- Date: 2021
- Subjects: Computer networks -- Security measures , Computer networks -- Monitoring , Computer networks -- Management , Data protection , Information technology -- Security measures , Professionals -- Supply and demand , Electronic data personnel -- Supply and demand
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/171120 , vital:42021
- Description: Information Technology (IT) is growing rapidly and has become an integral part of daily life. It provides a boundless list of services and opportunities, generating vast amounts of information, which could be abused or exploited. Due to this growth, thousands of new users are added to the grid using computer systems in static and mobile environments; this fact alone creates endless volumes of data to be exploited and hardware devices to be abused by the wrong people. The growth in the IT environment adds challenges that may affect users in their personal, professional, and business lives. There are constant threats on corporate and private computer networks and computer systems. In the corporate environment, companies try to eliminate the threat by testing networks with penetration tests and by implementing cyber awareness programs to make employees more aware of the cyber threat. Penetration tests and vulnerability assessments are undervalued: they are seen as a formality and are not used to increase system security. If used regularly, computer systems will be more secure and attacks minimised. With the growth in technology, industries all over the globe have become fully dependent on information systems for their day-to-day business. As technology evolves and new technology becomes available, the risk that comes with it, and the effort required to protect against its dangers, grows. For industry to protect itself against this growth in technology, personnel with a certain skill set are needed. This is where cyber security plays a very important role in the protection of information systems, ensuring the confidentiality, integrity and availability of the information system itself and the data on the system. Due to this drive to secure information systems, the need for cyber security professionals is on the rise as well. It is estimated that there is a shortage of one million cyber security professionals globally. What is the reason for this skills shortage? Will it be possible to close this gap? This study is about identifying the skills gap and identifying possible ways to close it. In this study, research was conducted on international cyber security standards, cyber security training at universities, and international certification, focusing specifically on penetration testing; the needs of industry when recruiting new penetration testers were evaluated; and the study finishes with suggestions on how to fill possible gaps in the skills market, followed by a conclusion.
- Full Text:
- Date Issued: 2021
An Analysis of Internet Background Radiation within an African IPv4 netblock
- Authors: Hendricks, Wadeegh
- Date: 2020
- Subjects: Computer networks -- Monitoring -- South Africa , Dark Web , Computer networks -- Security measures -- South Africa , Universities and Colleges -- Computer networks -- Security measures , Malware (Computer software) , TCP/IP (Computer network protocol)
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/103791 , vital:32298
- Description: The use of passive network sensors has in the past proven to be quite effective in monitoring and analysing the current state of traffic on a network. Internet traffic destined to a routable yet unused address block is often referred to as Internet Background Radiation (IBR) and characterised as unsolicited. This unsolicited traffic is, however, quite valuable to researchers in that it allows them to study traffic patterns in a covert manner. IBR is largely composed of network and port scanning traffic, backscatter packets from virus and malware activity and, to a lesser extent, misconfiguration of network devices. This research answers the following two questions: (1) what is the current state of IBR within the context of a South African IP address space, and (2) can any anomalies be detected in the traffic, with specific reference to current global malware attacks such as Mirai and similar? Rhodes University operates five IPv4 passive network sensors, commonly known as network telescopes, each monitoring its own /24 IP address block. The oldest of these network telescopes has been collecting traffic for over a decade, with the newest being established in 2011. This research focuses on the in-depth analysis of the traffic captured by one telescope in the 155/8 range over a 12-month period, from January to December 2017. The traffic was analysed and classified according to protocol, TCP flag, source IP address, destination port, packet count and payload size (a sketch of this classification step follows this record). Apart from the normal network traffic graphs and tables, a geographic heatmap of source traffic was also created, based on the source IP address. Spikes and noticeable variances in traffic patterns were further investigated, and evidence of Mirai-like malware activity was observed. Network and port scanning were found to comprise the largest amount of traffic, accounting for over 90% of the total IBR. Various scanning techniques were identified, including low-level passive scanning and much higher-level active scanning.
- Full Text:
- Date Issued: 2020
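A sketch of that classification step using scapy, assuming the telescope traffic has been saved to a pcap file (the file name is hypothetical):

```python
# Count telescope packets per (protocol, TCP flags, destination port).
from collections import Counter

from scapy.all import ICMP, IP, TCP, UDP, rdpcap

counts = Counter()
for pkt in rdpcap("telescope-155-2017.pcap"):
    if not pkt.haslayer(IP):
        continue
    if pkt.haslayer(TCP):
        counts[("tcp", str(pkt[TCP].flags), pkt[TCP].dport)] += 1
    elif pkt.haslayer(UDP):
        counts[("udp", "", pkt[UDP].dport)] += 1
    elif pkt.haslayer(ICMP):
        counts[("icmp", "", None)] += 1

for key, n in counts.most_common(10):    # the dominant traffic classes
    print(key, n)
```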
An exploration of the overlap between open source threat intelligence and active internet background radiation
- Authors: Pearson, Deon Turner
- Date: 2020
- Subjects: Computer networks -- Security measures , Computer networks -- Monitoring , Malware (Computer software) , TCP/IP (Computer network protocol) , Open source intelligence
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/103802 , vital:32299
- Description: Organisations and individuals are facing increasing persistent threats on the Internet from worms, port scanners, and malicious software (malware). These threats are constantly evolving as attack techniques are discovered. To aid in the detection and prevention of such threats, and to stay ahead of the adversaries conducting the attacks, security specialists are utilising Threat Intelligence (TI) data in their defence strategies. TI data can be obtained from a variety of different sources, such as private routers, firewall logs, public archives, and public or private network telescopes. However, at the rate and ease at which TI is produced and published, specifically Open Source Threat Intelligence (OSINT), the quality is dropping, resulting in fragmented, context-less and variable data. This research utilised two sets of TI data: a collection of OSINT and active Internet Background Radiation (IBR). The data was collected over a period of 12 months, from 37 publicly available OSINT datasets and five IBR datasets. Through the identification and analysis of data common to the OSINT and IBR datasets, this research was able to gain insight into how effective OSINT is at detecting and potentially reducing ongoing malicious Internet traffic. As part of this research, a minimal framework for the collection, processing/analysis, and distribution of OSINT was developed and tested. The research focused on exploring areas in common between the two datasets, with the intention of creating an enriched, contextualised, and reduced set of malicious source IP addresses that could be published for consumers to use in their own environments (a sketch of this overlap step follows this record). The findings of this research pointed towards a persistent group of IP addresses observed in both datasets over the period under research. Using these persistent IP addresses, the research was able to identify specific services being targeted. Amongst these persistent IP addresses were significant numbers of packets from Mirai-like IoT malware on ports 23/tcp and 2323/tcp, as well as general scanning activity on port 445/tcp.
- Full Text:
- Date Issued: 2020
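A minimal sketch of the overlap step, assuming each dataset has been reduced to one source IP per line (the file names are hypothetical):

```python
# Intersect OSINT-listed source IPs with those observed by the telescopes.
def load_ips(path: str) -> set[str]:
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

osint = load_ips("osint_sources.txt")
ibr = load_ips("telescope_sources.txt")

persistent = osint & ibr      # seen by both: candidates for the published set
print(f"{len(persistent)} source IPs common to both datasets")
```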
An investigation into the application of Distributed Endpoint Processing to 3D Immersive Audio Rendering
- Authors: Devonport, Robin Sean
- Date: 2020
- Subjects: Uncatalogued
- Language: English
- Type: thesis , text , Masters , MSc
- Identifier: http://hdl.handle.net/10962/163258 , vital:41022
- Description: Thesis (MSc)--Rhodes University, Faculty of Science, Computer Science, 2020.
- Full Text:
- Date Issued: 2020
An investigation into the readiness of open source software to build a Telco Cloud for virtualising network functions
- Authors: Chindeka, Tapiwa C
- Date: 2020
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/124320 , vital:35593
- Description: Cloud computing offers new mechanisms that change the way networks can be created and managed. The increased demand for multimedia and Internet of Things (IoT) services using the Internet Protocol is also fuelling the need to look more closely into a networking approach that is less reliant on physical hardware components and allows new networks and network components to be created on demand. Network Function Virtualisation (NFV) is a networking paradigm that decouples network functions from the hardware on which they run. This offers new approaches to telecommunication providers looking for cost-effective ways of improving Quality of Service (QoS). Cloud technologies have given way to more specialised cloud environments such as the telco cloud: a cloud environment where telecommunication services are hosted utilising NFV techniques. As telecommunication standards move towards 5G, network services will need to be provided in a virtualised manner in order to keep up with demand. Open source software is a driver for innovation, supported by a collaborative culture. This research investigates the readiness of open source tools to build a telco cloud that supports functions such as autoscaling and fault tolerance. Currently available open source software was explored for the different aspects involved in building a cloud from the ground up. The ETSI NFV MANO framework is also discussed, as it is a widely used guiding standard for implementing NFV. Guided by the ETSI NFV MANO framework, open source software was used in an experiment to build a resilient cloud environment in which a virtualised IP Multimedia Subsystem (vIMS) network was deployed. Through this experimentation, it is evident that open source tools are mature enough to build the cloud environment and its ETSI NFV MANO compliant orchestration. However, features such as autoscaling and fault tolerance are still fairly immature and experimental.
- Full Text:
- Date Issued: 2020
Building a flexible and inexpensive multi-layer switch for software-defined networks
- Authors: Magwenzi, Tinashe
- Date: 2020
- Subjects: Software-defined networking (Computer network technology) , Telecommunication -- Switching systems , OpenFlow (Computer network protocol) , Local area networks (Computer networks)
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/142841 , vital:38122
- Description: Software-Defined Networking (SDN) is a paradigm which enables the realisation of programmable networks through the separation of the control logic from the forwarding functions. This separation is a departure from the traditional architecture. Much of the work done on SDN-enabled devices has concentrated on higher-end, high-speed networks (10s GBit/s to 100s GBit/s), rather than the relatively low bandwidth links (10s MBit/s to a few GBit/s) which are seen, for example, in South Africa. As SDN becomes increasingly accepted, due to its advantages over traditional networks, it has been adopted for industrial purposes such as networking in data centres and network providers. The demand for programmable networks is increasing but is limited by the ability of providers to upgrade their infrastructure. In addition, as access to the Internet has become less expensive, the use of the Internet is increasing in academic institutions, NGOs, and small to medium enterprises. This thesis details a means of building and managing a small-scale Software-Defined Network using commodity hardware and open source tools. Core to the SDN network illustrated in this thesis is the prototype of a multi-layer SDN switch. The proposed device is targeted at serving lower-bandwidth communication (relative to commercially produced high-speed SDN-enabled devices). The prototype multi-layer switch was shown to achieve data rates of up to 99.998%, average latencies under 40µs during forwarding/switching and under 100µs during routing for packet sizes between 64 bytes and 1518 bytes, and a jitter of less than 15µs during all tests. This research explores in detail the design, development, and management of a multi-layer switch and its placement and integration in a small-scale SDN network. This includes testing of Layer 2 forwarding and Layer 3 routing, OpenFlow compliance testing, management of the switch using purpose-built SDN applications (a sketch of such an application follows this record), and real-life network functionality such as forwarding, routing and VLAN networking to demonstrate its real-world applicability.
- Full Text:
- Date Issued: 2020
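As a sketch of the kind of SDN application used to manage such a switch, the snippet below is a minimal OpenFlow 1.3 learning switch written for the Ryu controller framework; Ryu is an assumption here, since this abstract does not name the controller used.

```python
# Minimal learning switch: learn source MACs, forward to the learned port,
# flood otherwise. Flow installation is omitted to keep the sketch short.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import MAIN_DISPATCHER, set_ev_cls
from ryu.lib.packet import ethernet, packet
from ryu.ofproto import ofproto_v1_3

class LearningSwitch(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.mac_to_port = {}                      # dpid -> {mac: port}

    @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
    def packet_in(self, ev):
        msg = ev.msg
        dp = msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser
        in_port = msg.match['in_port']
        eth = packet.Packet(msg.data).get_protocol(ethernet.ethernet)

        table = self.mac_to_port.setdefault(dp.id, {})
        table[eth.src] = in_port                   # learn where eth.src lives
        out_port = table.get(eth.dst, ofp.OFPP_FLOOD)

        dp.send_msg(parser.OFPPacketOut(
            datapath=dp, buffer_id=ofp.OFP_NO_BUFFER, in_port=in_port,
            actions=[parser.OFPActionOutput(out_port)], data=msg.data))
```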
NFComms: A synchronous communication framework for the CPU-NFP heterogeneous system
- Authors: Pennefather, Sean
- Date: 2020
- Subjects: Network processors , Computer programming , Parallel processing (Electronic computers) , Netronome
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/144181 , vital:38318
- Description: This work explores the viability of using a Network Flow Processor (NFP), developed by Netronome, as a coprocessor for the construction of a CPU-NFP heterogeneous platform in the domain of general processing. When considering heterogeneous platforms involving architectures like the NFP, the communication framework provided is typically represented as virtual network interfaces and is thus not suitable for generic communication. To enable a CPU-NFP heterogeneous platform for use in the domain of general computing, a suitable generic communication framework is required. A feasibility study of a suitable communication medium between the two candidate architectures showed that a generic framework conforming to the mechanisms dictated by Communicating Sequential Processes is achievable. The resulting NFComms framework, which facilitates inter- and intra-architecture communication through the use of synchronous message passing (a sketch of this rendezvous discipline follows this record), supports up to 16 unidirectional channels and includes queuing mechanisms for transparently supporting concurrent streams exceeding the channel count. The framework has a minimum latency of between 15.5 μs and 18 μs per synchronous transaction and can sustain a peak throughput of up to 30 Gbit/s. The framework also supports a runtime for interacting with the Go programming language, allowing user-space processes to subscribe channels to the framework for interacting with processes executing on the NFP. The viability of utilising a heterogeneous CPU-NFP system in the domain of general and network computing was explored by introducing a set of problems or applications spanning general computing and network processing. These were implemented on the heterogeneous architecture and benchmarked against equivalent CPU-only and CPU/GPU solutions. The results recorded were used to form an opinion on the viability of using an NFP for general processing. In the author’s opinion, beyond very specific use cases, the NFP-400 is not currently a viable solution as a coprocessor in the field of general computing. This does not mean that the proposed framework or the concept of a heterogeneous CPU-NFP system should be discarded, as such a system does have acceptable use in the fields of network and stream processing. Additionally, when comparing the recorded limitations to those seen during the early stages of general-purpose GPU development, it is clear that general processing on the NFP is currently in a similar state.
- Full Text:
- Date Issued: 2020
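A sketch of the synchronous (rendezvous) message-passing discipline NFComms adopts from CSP, expressed here with ordinary threads; the real framework carries messages between the CPU and the NFP, not between Python threads.

```python
# send() blocks until a receiver has taken the item, and recv() blocks until
# a sender offers one: the CSP-style rendezvous described above.
import threading

class SyncChannel:
    def __init__(self):
        self._item = None
        self._ready = threading.Semaphore(0)    # an item has been offered
        self._taken = threading.Semaphore(0)    # the item has been taken
        self._senders = threading.Lock()        # serialise senders

    def send(self, item):
        with self._senders:
            self._item = item
            self._ready.release()               # offer the item
            self._taken.acquire()               # block until it is received

    def recv(self):
        self._ready.acquire()                   # wait for an offer
        item, self._item = self._item, None
        self._taken.release()                   # release the waiting sender
        return item

ch = SyncChannel()
threading.Thread(target=lambda: print("got", ch.recv())).start()
ch.send("message")                              # returns only after delivery
```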
Securing software development using developer access control
- Authors: Ongers, Grant
- Date: 2020
- Subjects: Computer software -- Development , Computers -- Access control , Computer security -- Software , Computer networks -- Security measures , Source code (Computer science) , Plug-ins (Computer programs) , Data encryption (Computer science) , Network Access Control , Data Loss Prevention , Google’s BeyondCorp , Confidentiality, Integrity and Availability (CIA) triad
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/149022 , vital:38796
- Description: This research is aimed at software development companies and highlights the unique information security concerns in the context of a non-malicious software developer’s work environment; furthermore, it explores an application-driven solution which focuses specifically on providing developer environments with access control for source code repositories. In order to achieve that, five goals were defined, as discussed in section 1.3. The application designed to provide the developer environment with access control to source code repositories was modelled on lessons taken from the principles of Network Access Control (NAC), Data Loss Prevention (DLP), and Google’s BeyondCorp (GBC) for zero-trust end-user computing. The intention of this research is to provide software developers with maximum access to source code without compromising Confidentiality, as per the Confidentiality, Integrity and Availability (CIA) triad. Employing data gleaned from examining the characteristics of DLP, NAC, and BeyondCorp, proof-of-concept code was developed to regulate access to the developer’s environment and source code. The system required sufficient flexibility to support the diversity of software development environments, so a modular design was selected. The system comprises a client-side agent and a plug-in-ready server component. The client-side agent mounts and dismounts encrypted volumes containing source code; it also provides the server with the information about the client that is demanded by plug-ins. The server-side service provides encryption keys to facilitate the mounting of the volumes and, through plug-ins, asks questions of the client agent to determine whether access should be granted (a sketch of this decision flow follows this record). The solution was then tested with integration and system testing. There were plans to have it used by development teams, who were then to be surveyed on their view of the proof of concept, but this proved impossible. The conclusion provides a basis by which organisations that develop software can better balance the two corners of the CIA triad most often in conflict: Confidentiality of their source code against Availability of the same to developers.
- Full Text:
- Date Issued: 2020
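A sketch of the server-side decision flow described above: release a volume key only if every plug-in is satisfied. The plug-in names, client attributes and key store are hypothetical, not the thesis's actual interfaces.

```python
# Plug-in-gated release of the encryption key for a source code volume.
from dataclasses import dataclass

@dataclass
class ClientInfo:
    user: str
    device_compliant: bool      # posture reported by the client agent
    on_trusted_network: bool

def network_plugin(c: ClientInfo) -> bool:      # NAC-inspired check
    return c.on_trusted_network

def posture_plugin(c: ClientInfo) -> bool:      # BeyondCorp-style device trust
    return c.device_compliant

PLUGINS = (network_plugin, posture_plugin)
KEY_STORE = {"repo-a": b"<volume key material>"}

def request_volume_key(client: ClientInfo, repo: str) -> bytes:
    if all(plugin(client) for plugin in PLUGINS):
        return KEY_STORE[repo]                  # agent can now mount the volume
    raise PermissionError("denied by policy plug-in")
```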
Technology in conservation: towards a system for in-field drone detection of invasive vegetation
- Authors: James, Katherine Margaret Frances
- Date: 2020
- Subjects: Drone aircraft in remote sensing , Neural networks (Computer science) , Drone aircraft in remote sensing -- Case studies , Machine learning , Computer vision , Environmental monitoring -- Remote sensing , Invasive plants -- Monitoring
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/143408 , vital:38244
- Description: Remote sensing can assist in monitoring the spread of invasive vegetation. The adoption of camera-carrying unmanned aerial vehicles, commonly referred to as drones, as remote sensing tools has yielded images of higher spatial resolution than traditional techniques. Drones also have the potential to interact with the environment through the delivery of bio-control or herbicide, as seen with their adoption in precision agriculture. Unlike in agricultural applications, however, invasive plants do not have a predictable position relative to each other within the environment. To facilitate the adoption of drones as an environmental monitoring and management tool, drones need to be able to intelligently distinguish between invasive and non-invasive vegetation on the fly. In this thesis, we present the augmentation of a commercially available drone with a deep machine learning model to investigate the viability of differentiating between an invasive shrub and other vegetation. As a case study, this was applied to the shrub genus Hakea, originating in Australia and invasive in several countries, including South Africa. For this research, however, it is the methodology that is important, rather than the chosen target plant. A dataset was collected using the available drone and manually annotated to facilitate the supervised training of the model. Two approaches were explored, namely classification and semantic segmentation. For each of these, several models were trained and evaluated to find the optimal one. The chosen model was then interfaced with the drone via an Android application on a mobile device, and its performance was preliminarily evaluated in the field. Based on these findings, refinements were made and thereafter a thorough field evaluation was performed to determine the best conditions for model operation. Results from the classification task show that deep learning models are capable of distinguishing between target and other shrubs in ideal candidate windows. However, classification in this manner is restricted by the proposal of such candidate windows. End-to-end image segmentation using deep learning overcomes this problem, classifying the image in a pixel-wise manner (a sketch follows this record). Furthermore, the use of appropriate loss functions was found to improve model performance. Field tests show that illumination and shadow pose challenges to the model, but that good recall can be achieved when the conditions are ideal. False positive detection remains an issue that could be improved. This approach shows the potential for drones as an environmental monitoring and management tool when coupled with deep machine learning techniques, and outlines potential problems that may be encountered.
- Full Text:
- Date Issued: 2020
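A sketch of the pixel-wise segmentation idea with a class-weighted loss, echoing the thesis's point that the choice of loss function matters; the tiny network, class weights and random tensors are illustrative stand-ins, not the trained model.

```python
# One training step of a toy fully-convolutional net for two classes
# (target shrub vs other vegetation), with class-weighted cross-entropy.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 2, 1),                       # per-pixel logits
)
loss_fn = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 5.0]))  # rarer target class
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

images = torch.rand(4, 3, 128, 128)            # stand-in for drone imagery
masks = torch.randint(0, 2, (4, 128, 128))     # stand-in pixel labels

loss = loss_fn(model(images), masks)
opt.zero_grad()
loss.backward()
opt.step()
print(f"loss: {loss.item():.3f}")
```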
Towards a capability maturity model for a cyber range
- Authors: Aschmann, Michael Joseph
- Date: 2020
- Subjects: Computer software -- Development , Computer security
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/163142 , vital:41013
- Description: This work describes research undertaken towards the development of a Capability Maturity Model (CMM) for Cyber Ranges (CRs) focused on cyber security. Global cyber security needs are on the rise, and the need for attribution within the cyber domain is of particular concern. This has prompted major efforts to enhance cyber capabilities within organisations to increase their total cyber resilience posture. These efforts include, but are not limited to, the testing of computational devices, networks, and applications, and cyber skills training focused on prevention, detection and cyber attack response. A cyber range allows for the testing of the computational environment. By developing cyber events within a confined virtual or sand-boxed cyber environment, a cyber range can prepare the next generation of cyber security specialists to handle a variety of potential cyber attacks. Cyber ranges have different purposes, each designed to fulfil a different computational testing or cyber training goal; consequently, cyber ranges can vary greatly in variety, capability, maturity and complexity. As cyber ranges proliferate and become more and more valued as tools for cyber security, a method to classify or rate them becomes essential. Yet while universal criteria for measuring cyber ranges in terms of their capability maturity levels become more critical, there are currently very limited resources for researchers aiming to perform this kind of work. For this reason, this work proposes and describes a CMM designed to give organisations the ability to benchmark the capability maturity of a given cyber range. This research adopted a synthesised approach to the development of a CMM, grounded in prior research and focused on the production of a conceptual model that provides a useful level of abstraction. In order to achieve this goal, the core capability elements of a cyber range are defined with their relative importance, allowing for the development of a proposed classification of cyber range levels. An analysis of data gathered during the course of an expert review, together with other research, further supported the development of the conceptual model. In the context of cyber range capability, classification includes the ability of the cyber range to perform its functions optimally with different core capability elements, focusing on the Measurement of Capability (MoC) with its elements, namely effect, performance and threat ability. Cyber range maturity can evolve over time and is defined through the Measurement of Maturity (MoM) with its elements, namely people, processes and technology. The combination of these measurements, utilising the CMM for a CR, determines the capability maturity level of a CR (a toy scoring sketch follows this record). The primary outcome of this research is the proposed level-based CMM framework for a cyber range, developed using adopted and synthesised CMMs, the analysis of an expert review, and the mapping of the results.
- Full Text:
- Date Issued: 2020
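A toy sketch of how the two measurements could combine into a level; the element names come from the abstract, while the 1-to-5 scale and the simple averaging are assumptions, not the model's actual scoring rules.

```python
# Combine Measurement of Capability (MoC) and Measurement of Maturity (MoM).
MOC = {"effect": 3, "performance": 4, "threat_ability": 2}
MOM = {"people": 3, "processes": 2, "technology": 4}

def mean(scores: dict) -> float:
    return sum(scores.values()) / len(scores)

capability_maturity_level = (mean(MOC) + mean(MOM)) / 2
print(f"cyber range capability maturity level: {capability_maturity_level:.1f} / 5")
```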
Transformative ICT education practices in rural secondary schools for developmental needs and realities: the Eastern Cape Province, South Africa
- Authors: Simuja, Clement
- Date: 2020
- Subjects: Education, Secondary -- South Africa -- Data processing , Information technology -- Study and teaching (Secondary) -- South Africa , Educational technology -- Developing countries , Rural development -- Developing countries , Computer-assisted instruction -- South Africa -- Eastern Cape , Internet in education -- South Africa , Rural schools -- South Africa -- Eastern Cape , Community and school -- South Africa -- Eastern Cape
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/150631 , vital:38991
- Description: The perceived social development significance of Information and Communication Technology (ICT) has dramatically expanded the domains in which this cluster of ICTs is being discussed and acted upon. Action to promote community development in rural areas of South Africa has made its way into the introduction of ICT education in secondary schools. Since rural secondary schools form part of the framework for rural communities, they are being challenged to provide ICT education that makes a difference in learners’ lives. This requires engaging education practices that inspire learners to construct knowledge of ICT that responds not only to examination purposes but also to the needs and development aspirations of the community. This research examines the experience of engaging learners and communities in socially informed ICT education in rural secondary schools. Specifically, it seeks to develop a critique of current practices involved in ICT education in rural secondary schools, and explores plausible alternatives to such practices that would make ICT education more transformative and structured towards the developmental concerns of communities. The main empirical focus of the research was five rural secondary schools in the Eastern Cape Province of South Africa. The research involved 53 participants in a socially informed ICT training process. The training was designed to inspire participants to share their self-defined ICT education and ICT knowledge experiences. Critical Action Learning and Philosophical Inquiry provided the methodological framework, whilst the theoretical framework draws on Foucault’s philosophical ideas on power-knowledge relations. Through this theoretical analysis, the research examines the dynamic interplay of practices in ICT education with the values, ideals, and knowledge that form the core life experiences of learners and rural communities. The research findings of this study indicate that current ICT education practices in rural secondary schools are endowed with ideologies that affect learners’ identity, social experiences, power, and ownership of the reflective meaning of using ICTs in community development. The contribution of this thesis lies in demonstrating ways of reframing ICT education transformatively, and more specifically its practices, in light of the way power, identity, ownership and social experience construct and offer learners a transformative view of self and the world. This could enable ICT education to fulfil its potential of contributing to social development in rural communities. The thesis culminates in a theoretical framework that articulates the structural and authoritative components of ICT education practices; these relate to learners’ conscious understandings and represented thoughts, sensations and meanings embedded in the context, and the actions and locations of using their knowledge of ICT.
- Full Text:
- Date Issued: 2020
A comparative study of CERBER, MAKTUB and LOCKY Ransomware using a Hybridised-Malware analysis
- Authors: Schmitt, Veronica
- Date: 2019
- Subjects: Microsoft Windows (Computer file) , Data protection , Computer crimes -- Prevention , Computer security , Computer networks -- Security measures , Computers -- Access control , Malware (Computer software)
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/92313 , vital:30702
- Description: There has been a significant increase in the prevalence of Ransomware attacks in the preceding four years. This indicates that the battle against this class of malware has not yet been won. This research proposes that by identifying the similarities within the operational framework of Ransomware strains, a better overall understanding of their operation and function can be achieved. This, in turn, will aid in a quicker response to future attacks. With the average Ransomware attack taking two hours to be identified, it is clear that there is not yet an adequate understanding of why these attacks are so successful. Research into Ransomware is limited by what is currently known on the topic. Due to the complexities and comprehensive nature of the research, the decision was taken to examine only three samples of Ransomware, each from a different family. The in-depth nature of the research and the associated time constraints did not allow for this framework to be tested on more than three families, but the exploratory work was promising and should be explored further in future research. The aim of the research is to follow the Hybrid-Malware analysis framework, which consists of both static and dynamic analysis phases, in addition to the digital forensic examination of the infected system. This allows for signature-based findings along with behavioural and forensic findings, all in one (a sketch of a static step follows this record). This information allows for a better understanding of how this malware is designed and how it infects and remains persistent on a system. The operating system chosen was Microsoft Windows 7, which is still utilised by a significant proportion of Windows users, especially in the corporate environment. The experiment process was designed to enable the researcher to collect information regarding the Ransomware and every aspect of its behaviour and communication on a target system. The results can be compared across the three strains to identify the commonalities. The initial hypothesis was that Ransomware variants, much like instant cake boxes, consist of specific building blocks which remain the same, with the flavouring of the cake mix being the unique feature.
- Full Text:
- Date Issued: 2019
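A sketch of one static-analysis step of the kind such a hybrid framework begins with: hashing a sample and extracting printable strings as candidate indicators. The sample file name is hypothetical.

```python
# Hash a Ransomware sample and pull printable-ASCII strings from it.
import hashlib
import re

def sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def strings(path: str, min_len: int = 6) -> list[bytes]:
    with open(path, "rb") as f:
        data = f.read()
    return re.findall(rb"[ -~]{%d,}" % min_len, data)   # printable runs

sample = "sample.bin"                                   # hypothetical sample
print("SHA-256:", sha256(sample))
print("URL-like strings:", [s for s in strings(sample) if b"http" in s][:10])
```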