Evaluation of the effectiveness of small aperture network telescopes as IBR data sources
- Authors: Chindipha, Stones Dalitso
- Date: 2023-03-31
- Subjects: Computer networks Monitoring , Computer networks Security measures , Computer bootstrapping , Time-series analysis , Regression analysis , Mathematical models
- Language: English
- Type: Academic theses , Doctoral theses , text
- Identifier: http://hdl.handle.net/10962/366264 , vital:65849 , DOI https://doi.org/10.21504/10962/366264
- Description: The use of network telescopes to collect unsolicited network traffic by monitoring unallocated address space has been in existence for over two decades. Past research has shown that there is considerable activity in this unallocated space that warrants monitoring, as it carries threat intelligence data that has proven very useful in the security field. Prior to the emergence of the Internet of Things (IoT), the commercialisation of IP addresses, and the widespread adoption of mobile devices, there was a large pool of IPv4 addresses, so reserving IPv4 addresses for monitoring unsolicited activity in the unallocated space was not a problem. Now, preserving such IPv4 addresses just for monitoring is increasingly difficult, as there are not enough free addresses in the IPv4 address space to be used for monitoring alone; such monitoring is seen as a 'non-productive' use of the IP addresses. This research addresses the problem that this IPv4 address space exhaustion poses for Internet Background Radiation (IBR) monitoring. To address the research questions, this research developed four mathematical models: Absolute Mean Accuracy Percentage Score (AMAPS), Symmetric Absolute Mean Accuracy Percentage Score (SAMAPS), Standardised Mean Absolute Error (SMAE), and Standardised Mean Absolute Scaled Error (SMASE). These models are used to evaluate the research objectives and quantify the variation between samples, where the sample sizes represent different lens sizes of the telescopes. The study produced a time-series plot showing the expected proportion of unique source IP addresses collected over time. The study also imputed data using the smaller /24 IPv4 net-block subnets, regenerating the missing data points with bootstrapping to construct confidence intervals (CI). The findings from the simulated data support the findings computed from the models.
The CI offers a boost to decision making. Through a series of experiments with monthly and quarterly datasets, the study proposed using a confidence level of 95% to 99%. It was already known that large network telescopes collect more threat intelligence data than small ones; to the best of our knowledge, however, no study had quantified that gap. With the findings from this study, users of small network telescopes can now operate them with full knowledge of the gap in the data collected between different network telescopes. , Thesis (PhD) -- Faculty of Science, Computer Science, 2023
- Full Text:
- Date Issued: 2023-03-31
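The bootstrapped confidence intervals the abstract mentions can be illustrated with a short sketch. This is a minimal percentile bootstrap, not the thesis's actual pipeline; the daily counts of unique source IPs below and the 95% level are invented for illustration:

```python
import random

def bootstrap_ci(data, stat=lambda xs: sum(xs) / len(xs),
                 n_resamples=2000, level=0.95, seed=42):
    """Percentile-bootstrap confidence interval for a statistic of the sample."""
    rng = random.Random(seed)
    # Resample the data with replacement many times and collect the statistic.
    estimates = sorted(
        stat([rng.choice(data) for _ in data]) for _ in range(n_resamples)
    )
    lo_idx = int((1 - level) / 2 * n_resamples)
    hi_idx = int((1 + level) / 2 * n_resamples) - 1
    return estimates[lo_idx], estimates[hi_idx]

# Hypothetical daily counts of unique source IPs seen by a /24 sensor.
counts = [521, 498, 540, 515, 507, 533, 489, 512, 526, 501]
lo, hi = bootstrap_ci(counts)
print(f"95% CI for mean daily unique sources: [{lo:.1f}, {hi:.1f}]")
```

The same resampling idea extends to imputing missing data points: each bootstrap replicate yields a plausible value, and the spread of replicates gives the interval.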
Subjective measurements of persistence of time series
- Authors: Poswayo, Sihle
- Date: 2018
- Subjects: Time-series analysis , Space and time Time -- Philosophy
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10948/34372 , vital:33361
- Description: In this paper we suggest the use of subjective judgements to measure persistence in time series by comparing pairs of graphs with different Hurst exponents. The group of respondents consisted of 40 volunteers who were asked to identify the more jagged (that is, less persistent) of two graphs presented to them. The respondents were approached as a group and asked to work independently in completing the questionnaires administered to them, under the supervision of the researchers. The graphs were simulated using the time-series package of Mathematica [26]. The responses were processed using an algorithm based on the Thurstone-Mosteller model for paired comparisons [29]. The results of the analysis show that the human eye is capable of distinguishing graphs of time series with a Hurst exponent difference as small as 0.02.
- Full Text:
- Date Issued: 2018
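The Thurstone-Mosteller processing step described in the abstract can be sketched as a Case V scaling: each item's scale value is the average z-score of the proportions of judges preferring it. The preference matrix below is hypothetical, and this minimal version is not the authors' exact algorithm:

```python
from statistics import NormalDist

def thurstone_case_v(prop):
    """Thurstone-Mosteller Case V scale values from a matrix of pairwise
    preference proportions prop[i][j] = P(item i judged over item j)."""
    n = len(prop)
    z = NormalDist()  # standard normal, for the inverse CDF
    scores = []
    for i in range(n):
        # Clip proportions away from 0/1 so the inverse CDF stays finite.
        zs = [z.inv_cdf(min(max(prop[i][j], 0.01), 0.99))
              for j in range(n) if j != i]
        scores.append(sum(zs) / len(zs))
    return scores

# Hypothetical fractions of 40 respondents who judged the row graph
# more jagged than the column graph.
p = [
    [0.50, 0.70, 0.85],
    [0.30, 0.50, 0.65],
    [0.15, 0.35, 0.50],
]
scores = thurstone_case_v(p)
print(scores)  # item 0 scores highest, i.e. judged most jagged
```

The resulting scale values order the graphs by perceived jaggedness, which is what lets subjective judgements be compared against the known Hurst exponents.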
Good's causality for time series: a regime-switching framework
- Authors: Mlambo, Farai Fredric
- Date: 2014
- Subjects: Time-series analysis , Econometrics
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/6018 , vital:21025
- Description: Causal analysis plays a significant role in applied sciences such as statistics, econometrics, and technometrics. In particular, probability-raising models have attracted significant research interest, though most discussion in this area is philosophical in nature. The econometric causality theory developed by C. W. J. Granger is popular in practical time-series causal applications. While this type of causality technique has many strong features, it has serious limitations: the processes studied must be stationary, and causal relationships are restricted to be linear, whereas regime-switching processes are neither linear nor stationary. I. J. Good proposed a probabilistic, event-type explication of causality that circumvents some of the limitations of Granger's methodology. This work uses the probability-raising causality ideology, as postulated by Good, to propose causal analysis methodology applicable in a stochastic, non-stationary domain. A Good's causality test is proposed by transforming the originally specified probabilistic causality theory from random events to a stochastic, regime-switching framework. The researcher performed methodological validation via causality simulations for a Markov regime-switching model. The proposed test can be used to detect, probabilistically, whether one stochastic process is causal to the observed behaviour of another. In particular, the regime-switch causality explication proposed herein is pivotal to the results articulated. This research also examines the power of the proposed test using simulations, and outlines steps one may take in applying the test in a practical setting.
- Full Text:
- Date Issued: 2014
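The Markov regime-switching processes used in the abstract's simulations can be sketched as follows: a hidden two-state Markov chain selects the active regime, and each regime draws observations from its own distribution. All parameters here are illustrative assumptions, not the thesis's actual simulation design:

```python
import random

def simulate_regime_switching(n=500, p_stay=0.95,
                              params=((0.0, 1.0), (3.0, 2.0)), seed=1):
    """Simulate a two-regime Markov-switching series: a hidden Markov chain
    picks the regime; each regime has its own mean and volatility."""
    rng = random.Random(seed)
    regime = 0
    series, regimes = [], []
    for _ in range(n):
        if rng.random() > p_stay:   # with prob. 1 - p_stay, switch regime
            regime = 1 - regime
        mu, sigma = params[regime]
        series.append(rng.gauss(mu, sigma))
        regimes.append(regime)
    return series, regimes

series, regimes = simulate_regime_switching()
print(len(series), sum(regimes))
```

Such a process is neither linear nor stationary (its mean shifts with the hidden state), which is precisely the setting in which the standard Granger framework breaks down.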
An analysis of neural networks and time series techniques for demand forecasting
- Authors: Winn, David
- Date: 2007
- Subjects: Time-series analysis , Neural networks (Computer science) , Artificial intelligence , Marketing -- Management , Marketing -- Data processing , Marketing -- Statistical methods , Consumer behaviour
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: vital:5572 , http://hdl.handle.net/10962/d1004362 , Time-series analysis , Neural networks (Computer science) , Artificial intelligence , Marketing -- Management , Marketing -- Data processing , Marketing -- Statistical methods , Consumer behaviour
- Description: This research examines the feasibility of developing demand forecasting techniques that predict demand consistently and accurately. Both Time Series Techniques and Artificial Neural Networks are investigated, with deodorant sales in South Africa as the case study. Marketing techniques used to influence consumer buyer behaviour are considered, and these factors are integrated into the forecasting models wherever possible. The results of this research suggest that Artificial Neural Networks can be developed which consistently outperform both industry forecasting targets and Time Series forecasts, suggesting that producers could reduce costs by adopting this more effective method.
- Full Text:
- Date Issued: 2007
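The kind of neural-network forecaster the abstract evaluates can be sketched in miniature: a one-hidden-layer network that predicts the next value of a series from its last few lags, trained by plain gradient descent. The toy "demand" series, network size, and learning rate below are all invented for illustration and bear no relation to the thesis's actual models:

```python
import math
import random

def train_tiny_net(series, lags=3, hidden=4, epochs=300, lr=0.05, seed=0):
    """Train a one-hidden-layer net (tanh hidden units, linear output) to
    forecast the next value from the previous `lags` values."""
    rng = random.Random(seed)
    w1 = [[rng.uniform(-0.5, 0.5) for _ in range(lags)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    samples = [(series[i:i + lags], series[i + lags])
               for i in range(len(series) - lags)]
    for _ in range(epochs):
        for x, y in samples:
            h = [math.tanh(sum(w * xi for w, xi in zip(w1[j], x)) + b1[j])
                 for j in range(hidden)]
            pred = sum(w * hj for w, hj in zip(w2, h)) + b2
            err = pred - y
            for j in range(hidden):
                # Backpropagate through the output weight before updating it.
                grad_h = err * w2[j] * (1 - h[j] ** 2)
                w2[j] -= lr * err * h[j]
                b1[j] -= lr * grad_h
                for k in range(lags):
                    w1[j][k] -= lr * grad_h * x[k]
            b2 -= lr * err

    def predict(x):
        h = [math.tanh(sum(w * xi for w, xi in zip(w1[j], x)) + b1[j])
             for j in range(hidden)]
        return sum(w * hj for w, hj in zip(w2, h)) + b2

    return predict

# Toy repeating "demand" pattern standing in for a real sales series.
demand = [0.2, 0.5, 0.9, 0.5] * 20
predict = train_tiny_net(demand)
print(round(predict([0.2, 0.5, 0.9]), 2))
```

A time-series baseline (for example a seasonal naive forecast) would be evaluated on the same held-out points, and the comparison of errors is what grounds claims that one method outperforms the other.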