A decision support tool for equitable and sustainable water management in the Koue Bokkeveld
- Authors: Tholanah, Rodney
- Date: 2024-10-11
- Subjects: Uncatalogued
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/464344 , vital:76503
- Description: Water is an important natural resource with multiple domestic and industrial uses. South Africa has a water scarcity problem, with domestic and industrial demand projected to increase due to population growth. One affected area is the Koue Bokkeveld (KBV), an agricultural catchment in the Western Cape. Water scarcity, especially in the summer, can cause conflicts among the farmers. This study sought to determine the extent to which Agent-Based (AB) modelling could be used to model the KBV catchment area and simulate future climate and usage scenarios. The study used the ComMod methodology as it allows stakeholders to be involved at each step of the modelling process, thus improving the model’s credibility as a decision-support tool (DST). The model was implemented using Cormas, an Agent-Based Model (ABM) implementation framework built with the Smalltalk language. The model was verified and validated through consultations with the catchment coordinator and through workshops with stakeholders. The ABM reflected the catchment’s characteristics: farms known to have water shortages also had water shortages in the ABM. However, one such farm did not have shortages in the model, which is attributed to land-use change. The ABM was used to run multiple simulation scenarios, and it provides simulation results at the crop field, farm and catchment levels, which allows the ABM to be used as a bottom-up DST. , Thesis (MSc) -- Faculty of Science, Computer Science, 2024
- Full Text:
- Date Issued: 2024-10-11
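To make the agent-based approach concrete, the sketch below implements a toy farm/catchment water-allocation loop in Python. The thesis model was built in Cormas (Smalltalk); the class names, demand figures and monthly flows here are illustrative assumptions, not values from the study.

```python
# Minimal agent-based sketch: farms with crop fields draw from a shared
# monthly river flow; shortages are tracked at field, farm and
# catchment level. All numbers are illustrative placeholders.
import random

class CropField:
    def __init__(self, area_ha, demand_per_ha):
        self.area_ha = area_ha
        self.demand_per_ha = demand_per_ha  # water per hectare per month
        self.shortage = 0.0

    def demand(self):
        return self.area_ha * self.demand_per_ha

class Farm:
    def __init__(self, name, fields):
        self.name = name
        self.fields = fields

    def demand(self):
        return sum(f.demand() for f in self.fields)

def simulate(farms, monthly_flow, months=12):
    """Allocate available flow proportionally to demand each month."""
    for month in range(months):
        available = monthly_flow[month % len(monthly_flow)]
        total_demand = sum(f.demand() for f in farms)
        for farm in farms:
            share = available * farm.demand() / total_demand
            deficit = max(0.0, farm.demand() - share)
            for field in farm.fields:
                # spread the farm's deficit over its fields by demand
                field.shortage += deficit * field.demand() / farm.demand()
        yield month, available, total_demand

farms = [Farm(f"farm-{i}", [CropField(random.uniform(5, 20), 1.2)
                            for _ in range(3)]) for i in range(4)]
# lower mid-year flows mimic seasonal scarcity (illustrative values)
flow = [60, 55, 40, 30, 25, 20, 20, 25, 35, 45, 55, 60]
for month, available, demand in simulate(farms, flow):
    print(f"month {month}: flow={available:.0f} demand={demand:.0f}")
for farm in farms:  # farm-level view of accumulated shortages
    print(farm.name, round(sum(f.shortage for f in farm.fields), 1))
```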
Comparative analysis of YOLOv5 and YOLOv8 for automated fish detection and classification in underwater environments
- Authors: Kuhlane, Luxolo
- Date: 2024-10-11
- Subjects: Uncatalogued
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/464333 , vital:76502
- Description: The application of traditional manual techniques for fish detection and classification faces significant challenges, primarily stemming from their labour-intensive nature and limited scalability. Automating these processes through computer vision and machine learning techniques has emerged as a potential solution in recent years. With the development of, and increasingly easy access to, new technology in recent years, the use of the deep learning object detector YOLO (You Only Look Once) for the detection and classification of fish has become notably popular. This thesis thus explores suitable YOLO architectures for detecting and classifying fish. The YOLOv5 and YOLOv8 models were evaluated specifically for detecting and classifying fish in underwater environments. The selection of these models was based on a literature review highlighting their success in similar applications, although they remain largely understudied in underwater environments. Therefore, the effectiveness of these models was evaluated through comprehensive experimentation on collected and publicly available underwater fish datasets. In collaboration with the South African Institute for Aquatic Biodiversity (SAIAB), five datasets were collected and manually annotated with labels for supervised machine learning. Moreover, two publicly available datasets were sourced for comparison with the literature. Furthermore, after determining that the smallest YOLO architectures are better suited to these imbalanced datasets, hyperparameter tuning tailored the models to the characteristics of the various underwater environments used in the research. The popular DeepFish dataset was evaluated to establish a baseline and the feasibility of these models in the understudied domain. The results demonstrated high detection accuracy for both YOLOv5 and YOLOv8. However, YOLOv8 outperformed YOLOv5, achieving 97.43% accuracy compared to 94.53%. After experiments on seven datasets, trends revealed YOLOv8’s enhanced generalisation accuracy due to architectural improvements, particularly in detecting smaller fish. Overall, YOLOv8 demonstrated that it is the better fish detection and classification model on diverse data. , Thesis (MSc) -- Faculty of Science, Computer Science, 2024
- Full Text:
- Date Issued: 2024-10-11
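For readers wanting to reproduce the general setup, the sketch below fine-tunes the smallest YOLOv8 variant with the ultralytics Python package. The dataset YAML path, epoch count and image size are placeholders, not the thesis's actual configuration.

```python
# Hedged sketch: training the YOLOv8 nano variant on a fish dataset,
# per the finding that the smallest architectures suited these
# imbalanced datasets. Assumes the `ultralytics` package is installed.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")        # nano variant, pretrained weights
model.train(
    data="fish.yaml",             # hypothetical dataset config file
    epochs=100,                   # illustrative, not the tuned value
    imgsz=640,
)
metrics = model.val()             # mAP and related metrics on val split
print(metrics)
```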
Enhancing licence plate recognition for a robust vehicle re-identification system
- Authors: Boby, Alden Zachary
- Date: 2024-10-11
- Subjects: Automobile theft South Africa , Deep learning (Machine learning) , Object detection , YOLOv7 , YOLO , Pattern recognition systems , Image processing Digital techniques , Automobile license plates
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/464322 , vital:76501
- Description: Vehicle security is a growing concern for citizens of South Africa. Law enforcement relies on reports and security camera footage for vehicle identification but struggles to match the increasing number of carjacking incidents and low vehicle recovery rates. Security camera footage offers an accessible means to identify stolen vehicles, yet it often poses hurdles like anamorphic plates and low resolution. Furthermore, dependence on human operators proves inefficient; faster processes are required to improve vehicle recovery rates and trust in law enforcement. The integration of deep learning has revolutionised object detection algorithms, increasing the popularity of vehicle tracking for security purposes. This thesis investigates advanced deep-learning methods for a comprehensive vehicle search and re-identification system. It enhances YOLOv7’s algorithmic capabilities and employs preprocessing techniques like super-resolution and perspective correction via the Improved Warped Planar Object Detection network for more effective licence plate optical character recognition. Key contributions include a specifically annotated dataset for training object detection models, an optical character recognition model based on YOLOv7, and a method for identifying vehicles in unrestricted data. The system detected rectangular and square licence plates without prior shape knowledge, achieving a 98.7% character recognition rate compared to 95.31% in related work. Moreover, it outperformed traditional optical character recognition by 28.25% and deep-learning EasyOCR by 14.18%. Its potential applications in law enforcement, traffic management, and parking systems can improve surveillance and security through automation. , Thesis (MSc) -- Faculty of Science, Computer Science, 2024
- Full Text:
- Date Issued: 2024-10-11
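The perspective-correction step the abstract mentions can be illustrated with OpenCV. The thesis used the Improved Warped Planar Object Detection network to locate plate corners; in this sketch the corner coordinates and plate dimensions are hard-coded placeholders.

```python
# Hedged sketch of rectifying an anamorphic licence plate before OCR.
# Corner coordinates would come from a detector; here they are dummies.
import cv2
import numpy as np

image = cv2.imread("frame.jpg")           # hypothetical input frame
# four detected plate corners: top-left, top-right, bottom-right,
# bottom-left (placeholder values)
corners = np.float32([[112, 240], [298, 221], [305, 283], [118, 305]])
width, height = 520, 110                  # assumed rectified plate size
target = np.float32([[0, 0], [width, 0],
                     [width, height], [0, height]])

matrix = cv2.getPerspectiveTransform(corners, target)
plate = cv2.warpPerspective(image, matrix, (width, height))
cv2.imwrite("plate_rectified.png", plate) # now suitable for a character
                                          # recognition model
```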
Investigating unimodal isolated signer-independent sign language recognition
- Authors: Marais, Marc Jason
- Date: 2024-04-04
- Subjects: Convolutional neural network , Sign language recognition , Human activity recognition , Pattern recognition systems , Neural networks (Computer science)
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/435343 , vital:73149
- Description: Sign language serves as the mode of communication for the Deaf and Hard of Hearing community, embodying a rich linguistic and cultural heritage. Recent Sign Language Recognition (SLR) system developments aim to facilitate seamless communication between the Deaf community and the broader society. However, most existing systems are limited by signer-dependent models, hindering their adaptability to diverse signing styles and signers, thus impeding their practical implementation in real-world scenarios. This research explores various unimodal approaches, both pose-based and vision-based, for isolated signer-independent SLR using RGB video input on the LSA64 and AUTSL datasets. The unimodal RGB-only input strategy provides a realistic SLR setting where alternative data sources are either unavailable or necessitate specialised equipment. Through systematic testing scenarios, isolated signer-independent SLR experiments are conducted on both datasets, primarily focusing on AUTSL – a signer-independent dataset. The vision-based R(2+1)D-18 model emerged as the top performer, achieving 90.64% accuracy on the unseen AUTSL dataset test split, closely followed by the pose-based Spatio-Temporal Graph Convolutional Network (ST-GCN) model with an accuracy of 89.95%. Furthermore, these models achieved comparable accuracies at a significantly lower computational demand. Notably, the pose-based approach demonstrates robust generalisation to substantial background and signer variation. Moreover, the pose-based approach demands significantly less computational power and training time than vision-based approaches. Both the proposed unimodal pose-based and vision-based systems were concluded to be effective at classifying sign classes in the LSA64 and AUTSL datasets. , Thesis (MSc) -- Faculty of Science, Ichthyology and Fisheries Science, 2024
- Full Text:
- Date Issued: 2024-04-04
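A minimal sketch of the vision-based approach: torchvision's R(2+1)D-18 video model with its classifier head replaced for sign classes. The clip shape, batch size and pretraining choice are assumptions about a typical setup, not the thesis's exact configuration.

```python
# Hedged sketch: adapting R(2+1)D-18 for isolated sign classification.
import torch
import torch.nn as nn
from torchvision.models.video import r2plus1d_18, R2Plus1D_18_Weights

num_classes = 226  # AUTSL provides 226 sign classes
model = r2plus1d_18(weights=R2Plus1D_18_Weights.KINETICS400_V1)
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new head

# dummy batch of clips: (batch, channels, frames, height, width)
clips = torch.randn(2, 3, 16, 112, 112)
logits = model(clips)
print(logits.shape)  # torch.Size([2, 226])
```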
Natural Language Processing with machine learning for anomaly detection on system call logs
- Authors: Goosen, Christo
- Date: 2023-10-13
- Subjects: Natural language processing (Computer science) , Machine learning , Information security , Anomaly detection (Computer security) , Host-based intrusion detection system
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/424699 , vital:72176
- Description: Host intrusion detection systems and machine learning have been studied for many years, especially on datasets like KDD99. Current research and systems focus on problems with low training and processing complexity, such as system call returns, which lack the system call arguments and potential traces of exploits run against a system. With respect to malware and vulnerabilities, signatures are relied upon, and the potential for natural language processing of the resulting logs and system call traces needs further experimentation. This research looks at unstructured raw system call traces from x86_64 GNU/Linux operating systems with natural language processing and supervised and unsupervised machine learning techniques to identify current and unseen threats. The research explores whether these tools are within the skill set of information security professionals, or require data science professionals. The research makes use of an academic and modern system call dataset from Leipzig University and applies two machine learning models based on decision trees. Random Forest as the supervised algorithm is compared to the unsupervised Isolation Forest algorithm, with each experiment repeated after hyper-parameter tuning. The research finds conclusive evidence that the Isolation Forest algorithm is effective, when paired with Principal Component Analysis, in identifying anomalies in the modern Leipzig Intrusion Detection Data Set (LID-DS) combined with samples of executed malware from the VirusTotal Academic dataset. The base or default model parameters produce sub-optimal results, whereas hyper-parameter tuning increases the accuracy to promising levels for anomaly and potential zero-day detection. , Thesis (MSc) -- Faculty of Science, Computer Science, 2023
- Full Text:
- Date Issued: 2023-10-13
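The abstract's winning combination, Isolation Forest paired with PCA over vectorised system-call traces, can be sketched with scikit-learn as below. The toy traces and parameters stand in for the LID-DS data and the tuned hyper-parameters.

```python
# Hedged sketch: treat raw system-call traces as text, vectorise them,
# reduce with PCA, then flag anomalies with Isolation Forest.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest

traces = [
    "open read read write close",
    "open read write close",
    "socket connect send recv close",
    "open read read write close",
    "execve mmap mprotect ptrace",   # suspicious-looking trace
]

X = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(traces)
X_reduced = PCA(n_components=3).fit_transform(X.toarray())  # dense for PCA

detector = IsolationForest(n_estimators=100, contamination=0.2,
                           random_state=0).fit(X_reduced)
print(detector.predict(X_reduced))   # -1 marks a predicted anomaly
```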
Evaluation of the effectiveness of small aperture network telescopes as IBR data sources
- Authors: Chindipha, Stones Dalitso
- Date: 2023-03-31
- Subjects: Computer networks Monitoring , Computer networks Security measures , Computer bootstrapping , Time-series analysis , Regression analysis , Mathematical models
- Language: English
- Type: Academic theses , Doctoral theses , text
- Identifier: http://hdl.handle.net/10962/366264 , vital:65849 , DOI https://doi.org/10.21504/10962/366264
- Description: The use of network telescopes to collect unsolicited network traffic by monitoring unallocated address space has been in existence for over two decades. Past research has shown that there is a lot of activity happening in this unallocated space that needs monitoring, as it carries threat intelligence data that has proven very useful in the security field. Prior to the emergence of the Internet of Things (IoT), the commercialisation of IP addresses and the widespread adoption of mobile devices, there was a large pool of IPv4 addresses, and thus reserving IPv4 addresses for monitoring unsolicited activity in the unallocated space was not a problem. Now, preserving such IPv4 addresses just for monitoring is increasingly difficult, as there are not enough free addresses in the IPv4 address space, and such monitoring is seen as a ‘non-productive’ use of the IP addresses. This research addresses the problem brought forth by this IPv4 address space exhaustion in relation to Internet Background Radiation (IBR) monitoring. In order to address the research questions, this research developed four mathematical models: Absolute Mean Accuracy Percentage Score (AMAPS), Symmetric Absolute Mean Accuracy Percentage Score (SAMAPS), Standardised Mean Absolute Error (SMAE), and Standardised Mean Absolute Scaled Error (SMASE). These models are used to evaluate the research objectives and quantify the variations that exist between different samples. The sample sizes represent different lens sizes of the telescopes. The study has brought to light a time series plot that shows the expected proportion of unique source IP addresses collected over time. The study also imputed data from the smaller /24 IPv4 net-block subnets, regenerating missing data points through bootstrapping to create confidence intervals (CIs). The findings from the simulated data support the findings computed from the models. The CIs aid decision-making. Through a series of experiments with monthly and quarterly datasets, the study proposed using a 95%-99% confidence level. It was known that large network telescopes collect more threat intelligence data than small-sized network telescopes; however, no study, to the best of our knowledge, had quantified that gap. With the findings from the study, small-sized network telescope users can now use their network telescopes with full knowledge of the gap that exists between the data collected by different network telescopes. , Thesis (PhD) -- Faculty of Science, Computer Science, 2023
- Full Text:
- Date Issued: 2023-03-31
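The bootstrapping step used to build confidence intervals from /24 subnet samples can be illustrated as follows. The daily unique-source counts are synthetic, and the percentile-bootstrap form is an assumption about the general technique; the thesis's AMAPS/SAMAPS/SMAE/SMASE models are not reproduced here.

```python
# Hedged sketch: percentile-bootstrap CI for the mean daily count of
# unique source IPs seen by a small (/24) telescope sensor.
import numpy as np

rng = np.random.default_rng(0)
# daily unique-source-IP counts from a /24 sensor (synthetic data)
daily_counts = rng.poisson(lam=420, size=90)

def bootstrap_ci(data, stat=np.mean, n_boot=10_000, level=0.95):
    """Percentile bootstrap confidence interval for a statistic."""
    stats = np.array([stat(rng.choice(data, size=len(data), replace=True))
                      for _ in range(n_boot)])
    lo, hi = np.percentile(stats, [(1 - level) / 2 * 100,
                                   (1 + level) / 2 * 100])
    return lo, hi

low, high = bootstrap_ci(daily_counts, level=0.95)
print(f"95% CI for mean daily unique sources: [{low:.1f}, {high:.1f}]")
```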
A review of the Siyakhula Living Lab’s network solution for Internet in marginalized communities
- Authors: Muchatibaya, Hilbert Munashe
- Date: 2022-10-14
- Subjects: Information and communication technologies for development , Information technology South Africa , Access network , User experience , Local area networks (Computer networks) South Africa
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/364943 , vital:65664
- Description: Changes within Information and Communication Technology (ICT) over the past decade required a review of the network layer component deployed in the Siyakhula Living Lab (SLL), a long-term joint venture between the Telkom Centres of Excellence hosted at the University of Fort Hare and Rhodes University in South Africa. The SLL overall solution for sustainable internet in poor communities consists of three main components – the computing infrastructure layer, the network layer, and the e-services layer. At the core of the network layer is the concept of the Broadband Island (BI), a high-speed local area network realized through easy-to-deploy wireless technologies that establish point-to-multipoint connections among schools within a limited geographical area. Schools within the broadband island then become Digital Access Nodes (DANs), with computing infrastructure that provides access to the network. The review, reported in this thesis, aimed at determining whether the model for the network layer was still able to meet the needs of marginalized communities in South Africa, given the recent changes in ICT. The research work used the living lab methodology – a grassroots, user-driven approach that emphasizes co-creation between the beneficiaries and external entities (researchers, industry partners and the government) – to do viability tests on the solution for the network component. The viability tests included lab and field experiments, to produce the qualitative and quantitative data needed to propose an updated blueprint. The results of the review found that the network topology used in the SLL’s network, the BI, is still viable, while WiMAX is now outdated. Also, the in-network web cache, Squid, is no longer effective, given the switch to HTTPS and the pervasive presence of advertising. The solution to the first issue is outdoor Wi-Fi, a proven solution easily deployable in grass-roots fashion. The second issue can be mitigated by leveraging Squid’s ‘bumping’ and splicing features; deploying a browser extension to make picture download optional; and using Pi-hole, a DNS sinkhole. Hopefully, the revised solution could become a component of South African Government’s broadband plan, “SA Connect”. , Thesis (MSc) -- Faculty of Science, Computer Science, 2022
- Full Text:
- Date Issued: 2022-10-14
A systematic methodology to evaluating optimised machine learning based network intrusion detection systems
- Authors: Chindove, Hatitye Ethridge
- Date: 2022-10-14
- Subjects: Intrusion detection systems (Computer security) , Machine learning , Computer networks Security measures , Principal components analysis
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/362774 , vital:65361
- Description: A network intrusion detection system (NIDS) is essential for mitigating computer network attacks in various scenarios. However, the increasing complexity of computer networks and attacks makes classifying unseen or novel network traffic challenging. Supervised machine learning (ML) techniques used in a NIDS can be affected by different scenarios. Thus, dataset recency, size, and applicability are essential factors when selecting and tuning a machine learning classifier. This thesis explores developing and optimising several supervised ML algorithms with relatively new datasets constructed to depict real-world scenarios. The methodology includes empirical analyses of systematic ML-based NIDS for a near real-world network system to improve intrusion detection. The thesis is experiment-heavy, with an emphasis on model assessment. Data preparation methods are explored, followed by feature engineering techniques. The model evaluation process involves three experiments, testing against a validation set, an un-trained set, and a retrained set. They compare several traditional machine learning and deep learning classifiers to identify the best NIDS model. Results show that the focus on feature scaling, feature selection methods and ML algorithm hyper-parameter tuning per model is an essential optimisation component. The distance-based ML algorithms performed much better with quantile transformation, whilst the tree-based algorithms performed better without scaling. Permutation importance performed well as a feature selection method compared to feature extraction using Principal Component Analysis (PCA) when applied across all ML algorithms explored. Random forests, Support Vector Machines and recurrent neural networks consistently achieved the best results, with high macro f1-scores of 90%, 81% and 73% for the CICIDS 2017 dataset, and 72%, 68% and 73% against the CICIDS 2018 dataset. , Thesis (MSc) -- Faculty of Science, Computer Science, 2022
- Full Text:
- Date Issued: 2022-10-14
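A scikit-learn sketch of the reported optimisation findings: quantile transformation for a distance-based classifier, no scaling for a tree ensemble, macro F1 for evaluation, and permutation importance as the feature-selection signal. Synthetic data stands in for CICIDS 2017/2018.

```python
# Hedged sketch of the per-model scaling and feature-selection choices.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import QuantileTransformer
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# distance-based model: scale with a quantile transformation
svm = make_pipeline(QuantileTransformer(output_distribution="normal"),
                    SVC()).fit(X_tr, y_tr)
# tree-based model: left unscaled
forest = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

for name, model in [("SVM", svm), ("RandomForest", forest)]:
    print(name, f1_score(y_te, model.predict(X_te), average="macro"))

# permutation importance as a feature-selection signal
imp = permutation_importance(forest, X_te, y_te,
                             n_repeats=5, random_state=0)
print("top features:", imp.importances_mean.argsort()[::-1][:5])
```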
An Investigation into Speaker and Headphone-Based Immersive Audio for VR and Digital Gaming Applications
- Authors: Marais, Kyle Donald
- Date: 2022-10-14
- Subjects: Uncatalogued
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/365246 , vital:65720
- Description: Thesis embargoed. Possible release date set for early 2024. , Thesis (MSc) -- Faculty of Science, Computer Science, 2022
- Full Text:
- Date Issued: 2022-10-14
Evolving IoT honeypots
- Authors: Genov, Todor Stanislavov
- Date: 2022-10-14
- Subjects: Internet of things , Malware (Computer software) , QEMU , Honeypot , Cowrie
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/362819 , vital:65365
- Description: The Internet of Things (IoT) is the emerging world where arbitrary objects from our everyday lives gain basic computational and networking capabilities to become part of the Internet. Researchers estimate that between 25 and 35 billion devices will be part of the Internet by 2022. Unlike conventional computers, where one hardware platform (Intel x86) and three operating systems (Windows, Linux and OS X) dominate the market, the IoT landscape is far more heterogeneous. To meet this growing demand, the number of System-on-Chip (SoC) manufacturers has grown exponentially, making embedded platforms based on ARM, MIPS or SH4 processors abundant. The pursuit of market share is further leading to a price war and cost-cutting, ultimately resulting in cheap systems with limited hardware resources and capabilities. The frugality of IoT hardware has a domino effect. Due to resource constraints, vendors package devices with custom, stripped-down Linux-based firmware optimized for performing the device’s primary function. Device management, monitoring and security features are by and large absent from IoT devices. This has created an asymmetry favouring attackers and disadvantaging defenders. This research sets out to reduce the opacity and identify a viable strategy, tactics and tooling for gaining insight into the IoT threat landscape by leveraging honeypots to build and deploy an evolving world-wide Observatory, based on cloud platforms, to help with studying attacker behaviour and collecting IoT malware samples. The research produces useful tools and techniques for identifying behavioural differences between Medium-Interaction honeypots and real devices by replaying interactive attacker sessions collected from the Honeypot Network. The behavioural delta is used to evolve the Honeypot Network and improve its collection capabilities. Positive results are obtained with respect to the effectiveness of the above technique. Findings by other researchers in the field are also replicated. The complete dataset and source code used for this research is made publicly available on the Open Science Framework website at https://osf.io/vkcrn/. , Thesis (MSc) -- Faculty of Science, Computer Science, 2022
- Full Text:
- Date Issued: 2022-10-14
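The behavioural-delta technique, replaying a captured attacker session against both a honeypot and a real device and diffing the responses, might look like the following sketch. Hosts, credentials and commands are placeholders, and paramiko is assumed as the SSH client library; the thesis's actual tooling may differ.

```python
# Hedged sketch: replay one recorded session against two SSH endpoints
# and report commands whose outputs differ (the behavioural delta).
import paramiko

SESSION = ["uname -a", "cat /proc/cpuinfo", "echo $SHELL"]  # captured cmds

def replay(host, port, user, password):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, port=port, username=user, password=password)
    outputs = []
    for cmd in SESSION:
        _, stdout, _ = client.exec_command(cmd)
        outputs.append(stdout.read().decode(errors="replace"))
    client.close()
    return outputs

honeypot = replay("honeypot.example", 2222, "root", "admin")   # placeholder
reference = replay("device.example", 22, "root", "admin")      # placeholder
for cmd, h, r in zip(SESSION, honeypot, reference):
    if h != r:
        print(f"behavioural delta on {cmd!r}")
```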
Leveraging LTSP to deploy a sustainable e-infrastructure for poor communities in South Africa
- Authors: Zvidzayi, Tichaona Manyara
- Date: 2022-10-14
- Subjects: Linux Terminal Server Project , Network computers , Thin client , Fat client , Cyberinfrastructure , Poverty reduction
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/365577 , vital:65761
- Description: Poverty alleviation is one of the main challenges the South African government is facing. Information and knowledge are key strategic resources for both social and economic development, and nowadays they most often rely on Information and Communication Technologies (ICTs). Poor communities have limited or no access to functioning e-infrastructure, which underpins ICT. The Siyakhula Living Lab (SLL) is a joint project between the universities of Rhodes and Fort Hare that has been running for over 15 years now. The SLL solution is currently implemented in schools in the Eastern Cape’s Dwesa-Mbhashe municipality as well as schools in Makhanda (formerly Grahamstown). Over the years, a number of blueprints for the meaningful connection of poor communities were developed. The research reported in this thesis sought to review and improve the SLL blueprint regarding fixed computing infrastructure (as opposed to networking and applications). The review confirmed the viability of the GNU/Linux Terminal Server Project (LTSP)-based computing infrastructure deployed in schools to serve the surrounding community. In 2019 LTSP was redesigned and rewritten to improve on the previous version. Amongst other improvements, LTSP19+ has a smaller memory footprint and supports a graphical way to prepare and maintain the client’s image using virtual machines. These improvements increase the potential life of ICT projects implementing the SLL solution, increasing the participation of members of the community (especially teachers) in the maintenance of the computing installations. The review recommends switching from thin-client deployments to full ("thick") client deployments, still booting from the network and mounting their file systems on a central server. The switch is motivated by reasons ranging from cost-effectiveness to the ability to survive the sudden unavailability of the central server. From experience in the previous deployment, electrical power surge protection should be mandatory. Also, the UPS protecting the file system of the central server should be configured to start the shutdown immediately on electrical power loss, in order to protect the life of the UPS battery (and to make it possible to use cheaper UPS units that report only on network power loss). The research study contributed to one real-life computing infrastructure deployment in the Ntsika school in Makhanda and one re-deployment in the Ngwane school in the Dwesa-Mbhashe area. For about two years, the research also supported continuous maintenance for the Ntsika, Ngwane and Mpume schools. , Thesis (MSc) -- Faculty of Science, Computer Science, 2022
- Full Text:
- Date Issued: 2022-10-14
Simplified menu-driven data analysis tool with macro-like automation
- Authors: Kazembe, Luntha
- Date: 2022-10-14
- Subjects: Data analysis , Macro instructions (Electronic computers) , Quantitative research Software , Python (Computer program language) , Scripting languages (Computer science)
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/362905 , vital:65373
- Description: This study seeks to improve the data analysis process for individuals and small businesses with limited resources by developing a simplified data analysis software tool that allows users to carry out data analysis effectively and efficiently. Design considerations were identified to address limitations common in such environments: making the tool easy to use, requiring only a basic understanding of the data analysis process, designing the tool in a manner that minimises computing resource requirements and user interaction, and implementing it using Python, which is open source and processes data effectively and efficiently. We develop a prototype simplified data analysis tool as a proof-of-concept. The tool has two components: core elements, which provide functionality for the data analysis process (data collection, transformations, analysis and visualizations), and automation and performance enhancements, which improve the data analysis process. The automation enhancements consist of the record and playback macro feature, while the performance enhancements include multiprocessing and multi-threading abilities. The data analysis software was developed to analyse various alpha-numeric data formats by using a variety of statistical and mathematical techniques. The record and playback macro feature enhances the data analysis process by saving users time and computing resources when analysing large volumes of data or carrying out repetitive data analysis tasks. The feature has two components, namely the record component, used to record data analysis steps, and the playback component, used to execute recorded steps. The simplified data analysis tool implements parallelization, which allows users to carry out two or more analysis tasks at a time; this improves productivity, as users can do other tasks while the tool processes data using recorded steps in the background. The tool was created and subsequently tested using common analysis scenarios applied to network data, log data and stock data. Results show that decision-making requirements, such as accurate information, can be satisfied using this analysis tool. Based on the functionality implemented, similar analysis functionality to that provided by Microsoft Excel is available, but in a simplified manner. Moreover, a more sophisticated macro functionality is provided for the execution of repetitive tasks using the recording feature. Overall, the study found that the simplified data analysis tool is functional, usable, scalable, efficient and can carry out multiple analysis tasks simultaneously. , Thesis (MSc) -- Faculty of Science, Computer Science, 2022
- Full Text:
- Date Issued: 2022-10-14
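The record-and-playback macro idea can be sketched as a registry of replayable steps: each call is recorded as a (step name, arguments) entry that can be serialised and executed again later. The step functions and JSON format here are illustrative, not the thesis's implementation.

```python
# Hedged sketch of a record-and-playback macro for analysis steps.
import json

STEPS = {}       # registry of replayable analysis steps
RECORDING = []   # the recorded macro

def step(fn):
    """Register a function as a replayable analysis step."""
    STEPS[fn.__name__] = fn
    return fn

def run(name, *args, **kwargs):
    """Execute a step and record it for later playback."""
    RECORDING.append({"step": name, "args": args, "kwargs": kwargs})
    return STEPS[name](*args, **kwargs)

@step
def load_csv(path):
    print(f"loading {path}")       # placeholder for real loading logic

@step
def summarise(column):
    print(f"summarising {column}")  # placeholder for real statistics

# record a session
run("load_csv", "stock_data.csv")
run("summarise", "closing_price")

# serialise the macro, then play it back step by step
macro = json.dumps(RECORDING)
for entry in json.loads(macro):
    STEPS[entry["step"]](*entry["args"], **entry["kwargs"])
```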
The implementation of a mobile application to decrease occupational sitting through goal setting and social comparison
- Authors: Tsaoane, Moipone Lipalesa
- Date: 2022-10-14
- Subjects: Sedentary behavior , Sitting position , Feedback , Mobile apps , Behavior modification , Agile software development
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/365544 , vital:65758
- Description: Background: Feedback is a valuable tool in behaviour change, as it is said to increase compliance and improve the effectiveness of interventions. Interventions that focus on decreasing sedentary behaviour as a factor independent of physical activity are necessary, especially for office workers who spend most of their day seated. There is insufficient knowledge regarding the effectiveness of feedback as a tool to decrease sedentary behaviour, and this project implemented a tool that can be used to determine this. To take advantage of the cost-effectiveness and scalability of digital technologies, a mobile application was selected as the mode of delivery. Method: The application was designed as an intervention using the Theoretical Domains Framework. It was then implemented as a fully functioning application through an agile development process, using the Xamarin.Forms framework. Due to challenges with this framework, a second application was developed using the React Native framework. Pilot studies were used for testing, with the final one consisting of Rhodes University employees. Results: The Xamarin.Forms application proved to be unfeasible; some users experienced fatal errors and crashes. The React Native application worked as desired and produced accurate and consistent step-count readings, proving feasible from a functionality standpoint. The agile methodology enabled the developer to focus on implementing and testing one component at a time, which made the development process more manageable. Conclusion: Future work must conduct empirical studies to determine whether feedback is an effective tool compared to a control group, and which type of feedback (goal-setting or social comparison) is most effective. , Thesis (MSc) -- Faculty of Science, Computer Science, 2022
- Full Text:
- Date Issued: 2022-10-14
A multispectral and machine learning approach to early stress classification in plants
- Authors: Poole, Louise Carmen
- Date: 2022-04-06
- Subjects: Machine learning , Neural networks (Computer science) , Multispectral imaging , Image processing , Plant stress detection
- Language: English
- Type: Master's thesis , text
- Identifier: http://hdl.handle.net/10962/232410 , vital:49989
- Description: Crop loss and failure can impact both a country’s economy and food security, often to devastating effect. As such, successfully detecting plant stresses early in their development is essential to minimise spread and damage to crop production. Identifying the stress and the stress-causing agent is the most critical and challenging step in plant and crop protection. With the development of, and increased ease of access to, new equipment and technology in recent years, the use of spectroscopy in the early detection of plant diseases has become notably popular. This thesis narrows down the most suitable multispectral imaging techniques and machine learning algorithms for early stress detection (an illustrative pipeline sketch follows this record). Datasets of visible and multispectral images were collected. Dehydration was selected as the plant stress type for the main experiments, and data was collected from six plant species typically used in agriculture. Key contributions of this thesis include multispectral and visible datasets showing plant dehydration, as well as a separate preliminary dataset on plant disease. Results on dehydration showed statistically significant accuracy improvements for multispectral imaging compared to visible imaging in early stress detection, with multispectral input obtaining 92.50% accuracy against visible input’s 77.50% across plant species. The system was effective at stress detection on known plant species, with multispectral imaging improving early stress detection more than advanced stress detection. Furthermore, strong species discrimination was achieved when exclusively testing either early or advanced dehydration against healthy species. , Thesis (MSc) -- Faculty of Science, Ichthyology & Fisheries Sciences, 2022
- Full Text:
- Date Issued: 2022-04-06
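The gap the abstract reports between multispectral and visible input can be pictured with a toy classification pipeline. The sketch below uses synthetic per-band reflectance vectors in which the stress signal sits mainly in two extra (red-edge/NIR-like) bands, so a classifier restricted to the three visible bands sees less of it; the band layout, data, and resulting numbers are invented for illustration and are not the thesis dataset.

```python
# Toy visible-vs-multispectral comparison on synthetic data (illustrative;
# not the thesis dataset). Stress mostly shifts the two non-visible bands,
# so the 3-band "visible" model has less signal to work with.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 600
X = rng.normal(size=(n, 5))        # bands: R, G, B, red-edge, NIR (assumed)
y = rng.integers(0, 2, size=n)     # 0 = healthy, 1 = dehydrated
X[y == 1, 3:] += 1.2               # strong shift in red-edge/NIR bands
X[y == 1, :3] += 0.2               # faint visible symptom

for name, bands in [("visible (3 bands)", slice(0, 3)),
                    ("multispectral (5 bands)", slice(0, 5))]:
    X_tr, X_te, y_tr, y_te = train_test_split(X[:, bands], y, random_state=0)
    model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    print(name, round(accuracy_score(y_te, model.predict(X_te)), 3))
```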
A decision-making model to guide securing blockchain deployments
- Authors: Cronje, Gerhard Roets
- Date: 2021-10-29
- Subjects: Blockchains (Databases) , Bitcoin , Cryptocurrencies , Distributed databases , Computer networks Security measures , Computer networks Security measures Decision making , Ethereum
- Language: English
- Type: Masters theses , text
- Identifier: http://hdl.handle.net/10962/188865 , vital:44793
- Description: Satoshi Nakamoto, the pseudo-identity credited with the paper that sparked the implementation of Bitcoin, is famously quoted as remarking, electronically of course, that “If you don’t believe it or don’t get it, I don’t have time to try and convince you, sorry” (Tsapis, 2019, p. 1). What is noticeable, 12 years after the famed Satoshi paper that initiated Bitcoin (Nakamoto, 2008), is that blockchain at the very least has staying power and potentially wide application. A lesser-known figure, Marc Kenisberg, founder of Bitcoin Chaser, one of the many companies formed around the Bitcoin ecosystem, summarised it well, saying “…Blockchain is the tech - Bitcoin is merely the first mainstream manifestation of its potential” (Tsapis, 2019, p. 1). With blockchain still trying to reach its potential and still maturing on its way towards becoming a mainstream technology, the main question that arises for security professionals is: how do I ensure we do it securely? This research seeks to address that question by proposing a decision-making model that can guide a security professional through ensuring appropriate security for blockchain deployments. This research is certainly not the first attempt at discussing the security of the blockchain, and it will not be the last, as the technology around blockchain and distributed ledger technology is still rapidly evolving. What this research tries to achieve is not to delve into extremely specific areas of blockchain security, or to get bogged down in technical details, but to provide a reference framework that aims to cover all the major areas to be considered. The approach followed was to review the literature regarding blockchain and to identify the main security areas to be addressed. The research then proposes a decision-making model and tests the model against a fictitious but relevant real-world example. It concludes with learnings from this research. The reader can be the judge, but the model aims to be a practical, valuable resource that any security professional can use to navigate the security aspects logically and understandably when involved in a blockchain deployment. In contrast to the Satoshi quote, this research tries to convince the reader and assist him/her in understanding the security choices related to every blockchain deployment. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10-29
Improved concurrent Java processes
- Authors: Ntlahla, Mbalentle Apelele Wiseman
- Date: 2021-10-29
- Subjects: Java (Computer program language) , Computer multitasking , Sequential processing (Computer science) , Parallel programming (Computer science) , Simultaneous multithreading processors
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/192129 , vital:45198
- Description: The rise in the number of cores in a processor has resulted in computer programmers needing to write concurrent programs to utilise the extra available processors. Concurrent programming can utilise the extra processors available in a multi-core architecture. However, writing concurrent programs introduces complexities that are not encountered in sequential programming (race conditions, deadlocks, starvation and liveness are some of the complexities that come with concurrent programming). These complexities require programming languages to provide functionality to help programmers write concurrent programs. The Java language is designed to support concurrent programming, mostly through threads; the support is provided through the Java programming language itself and the Java class libraries. Although concurrent processes are important and have their own advantages over concurrent threads, Java has limited support for concurrent processes. In this thesis we attempt to provide for processes the same support that Java gives threads through the java.util.concurrent library. This is done through a Java library (za.co.jcp) that provides synchronisation methods for multiple processes, shared variables between Java processes, atomic variables, process-safe data structures, and a process executor framework similar to the executor framework Java provides for threads (a conceptual sketch of this thread/process parity follows this record). The two libraries are then analysed to compare their code portability, ease of use, and performance. The results from the project show that it is possible for Java to support concurrency through processes and not only threads. In addition, the benchmarks performed show that the za.co.jcp library is not significantly slower than the current java.util.concurrent thread library. This means that Java concurrent applications will now also be able to use cooperating processes rather than being confined to threads. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10-29
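Python already ships the kind of thread/process parity the za.co.jcp library sets out to give Java: concurrent.futures exposes the same executor API for both. The sketch below is a conceptual analogue of that parity in Python, not the za.co.jcp API itself.

```python
# Thread pools and process pools behind one executor API, analogous to the
# parity za.co.jcp aims for in Java (conceptual sketch, not that library).
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def count_primes(limit):
    """CPU-bound work: naive count of primes below limit."""
    return sum(all(n % d for d in range(2, int(n ** 0.5) + 1))
               for n in range(2, limit))

if __name__ == "__main__":                  # guard required for process pools
    for pool_cls in (ThreadPoolExecutor, ProcessPoolExecutor):
        with pool_cls(max_workers=4) as pool:
            results = list(pool.map(count_primes, [50_000] * 4))
        print(pool_cls.__name__, results)
```

For CPU-bound work like this, the process pool sidesteps the interpreter lock and typically finishes sooner, yet only the executor class changes; that interchangeability is the portability property the thesis measures between its process library and java.util.concurrent.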
Peer-to-peer energy trading system using IoT and a low-computation blockchain network
- Authors: Ncube, Tyron
- Date: 2021-10-29
- Subjects: Blockchains (Databases) , Internet of things , Renewable energy sources , Smart power grids , Peer-to-peer architecture (Computer networks) , Energy trading system
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/192119 , vital:45197
- Description: The use of renewable energy is increasing every year, as it is seen as a viable and sustainable long-term alternative to fossil-based sources of power. Emerging technologies are being merged with existing renewable energy systems to address some of the challenges associated with renewable energy, such as reliability and limited storage facilities for the generated energy. The Internet of Things (IoT) has made it possible for consumers to make money by selling excess energy back to the utility company through smart grids that allow bi-directional communication between the consumer and the utility company. The major drawback of this setup is that the utility company still plays a central role, as it is the only buyer of the excess energy generated from renewable sources. This research uses blockchain technology, leveraging its decentralised architecture, to enable other individuals to purchase this excess energy. Blockchain technology is first explained in detail, and its main features, such as consensus mechanisms, are examined. This evaluation of blockchain technology gives rise to design questions that are taken into consideration to create a low-energy, low-computation Ethereum-based blockchain network, which is the foundation for a peer-to-peer energy trading system. The peer-to-peer energy trading system makes use of smart meters to collect data about energy usage and gives users a web-based interface where they can transact with each other. A smart contract is also designed to facilitate payments for transactions (a simplified sketch of such a trade follows this record). Lastly, the system is tested by carrying out transactions and transferring energy from one node in the system to another. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10-29
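To make the trading step concrete, the sketch below models the exchange the smart contract mediates as plain Python: a seller's surplus kilowatt-hours move to a buyer, payment moves the other way, and an append-only ledger records the trade. All names, prices, and the Ledger structure are invented for illustration; the thesis implements this logic as an Ethereum smart contract, which is not reproduced here.

```python
# Simplified peer-to-peer energy trade (illustrative Python stand-in for
# the thesis' Ethereum smart contract; names and prices are invented).
from dataclasses import dataclass, field

@dataclass
class Account:
    name: str
    energy_kwh: float   # stored surplus available to sell
    balance: float      # currency available to buy with

@dataclass
class Ledger:
    blocks: list = field(default_factory=list)   # append-only trade record

    def trade(self, seller, buyer, kwh, price_per_kwh):
        cost = kwh * price_per_kwh
        if seller.energy_kwh < kwh or buyer.balance < cost:
            raise ValueError("insufficient energy or funds")
        seller.energy_kwh -= kwh
        buyer.energy_kwh += kwh
        buyer.balance -= cost
        seller.balance += cost
        self.blocks.append({"from": seller.name, "to": buyer.name,
                            "kwh": kwh, "cost": cost})

ledger = Ledger()
alice = Account("alice", energy_kwh=12.0, balance=0.0)   # solar surplus
bob = Account("bob", energy_kwh=0.0, balance=50.0)
ledger.trade(alice, bob, kwh=5.0, price_per_kwh=2.0)
print(ledger.blocks[-1], alice, bob)
```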
Remote fidelity of Container-Based Network Emulators
- Authors: Peach, Schalk Willem
- Date: 2021-10-29
- Subjects: Computer networks Security measures , Intrusion detection systems (Computer security) , Computer security , Host-based intrusion detection systems (Computer security) , Emulators (Computer programs) , Computer network protocols , Container-Based Network Emulators (CBNEs) , Network Experimentation Platforms (NEPs)
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/192141 , vital:45199
- Description: This thesis examines whether Container-Based Network Emulators (CBNEs) are able to instantiate emulated nodes that provide sufficient realism to be used in information security experiments. The realism measure used is based on the information available from the point of view of a remote attacker. During the evaluation of a Container-Based Network Emulator (CBNE) as a platform to replicate production networks for information security experiments, it was observed that nmap fingerprinting returned Operating System (OS) family and version results inconsistent with those of the host OS. CBNEs utilise Linux namespaces, the technology used for containerisation, to instantiate "emulated" hosts for experimental networks. Linux containers partition resources of the host OS to create lightweight virtual machines that share a single OS kernel. As all emulated hosts share the same kernel in a CBNE network, there is a reasonable expectation that the fingerprints of the host OS and the emulated hosts should be the same. Based on how CBNEs instantiate emulated networks, and given that fingerprinting returned inconsistent results, it was hypothesised that the technologies used to construct CBNEs are capable of influencing fingerprints generated by utilities such as nmap. It was predicted that hosts emulated using different CBNEs would show deviations in remotely generated fingerprints when compared to fingerprints generated for the host OS. An experimental network consisting of two emulated hosts and a Layer 2 switch was instantiated on multiple CBNEs using the same host OS, and active and passive fingerprinting was conducted between the emulated hosts to generate fingerprints and OS family and version matches (a sketch of the active-scanning step follows this record). Passive fingerprinting failed to produce OS family and version matches, as the fingerprint databases for these utilities are no longer maintained. For active fingerprinting, the OS family results were consistent between the tested systems and the host OS, though the reported OS version results were inconsistent. A comparison of the generated fingerprints revealed that for certain CBNEs, fingerprint features related to network stack optimisations of the host OS deviated from other CBNEs and the host OS. The hypothesis that CBNEs can influence remotely generated fingerprints was partially confirmed: one CBNE system modified Linux kernel networking options, causing a deviation from the fingerprints generated for the other tested systems and the host OS. The hypothesis was also partially rejected, as the technologies used by CBNEs themselves do not influence the remote fidelity of emulated hosts. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10-29
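The active-fingerprinting step the abstract describes can be driven from a short script. The sketch below shells out to nmap's OS detection (-O) for each emulated host and keeps the OS-guess lines for comparison across CBNEs; the host addresses are placeholders, nmap must be installed, and -O requires root privileges.

```python
# Collect nmap OS-detection output for each emulated host so fingerprints
# can be compared across CBNEs. Addresses are placeholders; -O needs root.
import subprocess

EMULATED_HOSTS = ["10.0.0.2", "10.0.0.3"]   # placeholder node addresses

for host in EMULATED_HOSTS:
    scan = subprocess.run(["nmap", "-O", host],
                          capture_output=True, text=True, check=False)
    # Keep only the OS-guess lines (e.g. "Running:", "OS details:").
    guesses = [line for line in scan.stdout.splitlines()
               if line.startswith(("Running", "OS"))]
    print(host, guesses)
```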
Building the field component of a smart irrigation system: A detailed experience of a computer science graduate
- Authors: Pipile, Yamnkelani Yonela
- Date: 2021-10
- Subjects: Irrigation efficiency Computer-aided design South Africa , Irrigation projects Computer-aided design South Africa , Internet of things , Machine-to-machine communications , Smart water grids South Africa , Raspberry Pi (Computer) , Arduino (Programmable controller) , ZigBee , MQTT (MQ Telemetry Transport) , MQTT-SN , XBee
- Language: English
- Type: Master's theses , text
- Identifier: http://hdl.handle.net/10962/191814 , vital:45167
- Description: South Africa is a semi-arid area with an average annual rainfall of approximately 450 mm, 60 per cent of which goes towards irrigation. Current irrigation systems generally apply water uniformly across a field, which is both inefficient and can kill the plants. The Internet of Things (IoT), an emerging technology involving the utilisation of sensors and actuators to build complex feedback systems, presents an opportunity to build a smart irrigation solution. This research project illustrates the development of the field components of a water monitoring system using inexpensive, off-the-shelf components, exploring at the same time how easy or difficult it would be for a general Computer Science graduate to use hardware components and associated tools within the IoT area. The problem was initially broken down through a classical top-down process in order to identify the components, such as microcomputers, microcontrollers, sensors and network connections, that would be needed to build the solution. I then selected the Raspberry Pi 3, the Arduino Uno, the MH-Sensor-Series hygrometer, the MQTT messaging protocol, and the ZigBee communication protocol as implemented in the XBee S2C. Once the components were identified, the work followed a bottom-up approach: I studied the components in isolation and relative to each other through a structured series of experiments, with each experiment addressing a specific component and examining how easy it was to use. Each experiment allowed me to acquire and deepen my understanding of a component and progressively built a more sophisticated prototype towards the complete solution (a sketch of the field node's publishing loop follows this record). I found the vast majority of the identified components and tools to be easy to use, well documented and, most importantly, mature for consumption by the target user, until I encountered the MQTT-SN (MQTT for Sensor Networks) implementation, which was not as mature as the rest. This resulted in my designing and implementing a lightweight, general ZigBee/MQTT gateway named “yoGa” (Yonella's Gateway), after the author. At the end of the research, I was able to build the field components of a smart irrigation system using the selected tools, including the yoGa gateway, proving practically that a Computer Science graduate from a South African university can become productive in the emerging IoT area. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-10
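A field node's core job in such a system is to sample the hygrometer and push the reading to the broker over MQTT. The sketch below shows that loop using the paho-mqtt client library; the broker address, topic name, and read_moisture() stub are assumptions, and on the real hardware the reading would come from the MH-Sensor-Series hygrometer via the Arduino rather than a constant.

```python
# Field-node publishing loop: sample soil moisture, publish over MQTT.
# Broker address, topic, and read_moisture() are illustrative placeholders.
import json
import time

from paho.mqtt import publish   # pip install paho-mqtt

BROKER = "192.168.1.10"                 # assumed broker address
TOPIC = "farm/field1/moisture"          # assumed topic layout

def read_moisture():
    # On the real node this would sample the MH-Sensor-Series hygrometer
    # through the Arduino; a fixed stand-in value is returned here.
    return 0.42

while True:
    payload = json.dumps({"moisture": read_moisture(), "ts": time.time()})
    publish.single(TOPIC, payload, hostname=BROKER)   # one reading per message
    time.sleep(60)                                    # sample once a minute
```

In the thesis' setup the sensor side speaks ZigBee rather than IP, which is exactly the gap the yoGa gateway bridges by relaying sensor frames into MQTT.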
An exploratory investigation into an Integrated Vulnerability and Patch Management Framework
- Authors: Carstens, Duane
- Date: 2021-04
- Subjects: Computer security , Computer security -- Management , Computer networks -- Security measures , Patch Management , Integrated Vulnerability
- Language: English
- Type: thesis , text , Masters , MSc
- Identifier: http://hdl.handle.net/10962/177940 , vital:42892
- Description: In the rapidly changing world of cybersecurity, the constant increase in vulnerabilities continues to be a prevalent issue for many organisations. Malicious actors are aware that most organisations cannot timeously patch known vulnerabilities and are ill-prepared to protect against newly created vulnerabilities for which a signature or patch has not yet been created. Consequently, information security personnel face ongoing challenges in mitigating these risks. In this research, the problem of remediation in a world of increasing vulnerabilities is considered. The current paradigm of vulnerability and patch management is reviewed using a pragmatic approach to all variables associated with these services and practices, establishing what is and is not working in terms of remediation. In addition to this analysis, a taxonomy is created to provide a graphical representation of all variables associated with vulnerability and patch management, based on existing literature. Frameworks currently used in industry to create an effective engagement model between vulnerability and patch management services are considered. The link between quantifying a threat, vulnerability and consequence; what Microsoft makes available for patching; and the action plan for the resulting vulnerabilities is explored. Furthermore, the processes and means of communication between each of these services are investigated to ensure effective remediation of vulnerabilities, ultimately improving an organisation's security risk posture. To measure the security risk posture effectively, progress between these services is tracked through a single averaged measurement metric (a toy illustration follows this record). The outcome of the research highlights influencing factors that impact successful vulnerability management, in line with the themes identified in the research taxonomy. These influencing factors are, however, significantly undermined when resources within the same organisations lack a clear and consistent understanding of their role, organisational capabilities and objectives for effective vulnerability and patch management. , Thesis (MSc) -- Faculty of Science, Computer Science, 2021
- Full Text:
- Date Issued: 2021-04
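The single averaged metric can be pictured as nothing more than a mean over normalised per-service progress scores. The sketch below is a toy illustration under that assumption; the service names, scores, and equal weighting are invented, and the thesis' actual metric definition may differ.

```python
# Toy version of a single averaged security-posture metric: each service
# reports progress in [0, 1] and the posture is their mean. Service names,
# scores, and equal weighting are illustrative assumptions.
def posture_score(progress):
    return sum(progress.values()) / len(progress)

progress = {
    "vulnerability_scanning": 0.80,      # share of assets scanned
    "patch_deployment": 0.55,            # share of due patches applied
    "remediation_verification": 0.60,    # share of fixes re-tested
}
print(f"security posture: {posture_score(progress):.0%}")   # -> 65%
```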