Designing a proxemic natural user interface to support information sharing among co-located mobile devices
- Authors: Lee Son, Timothy
- Date: 2016
- Subjects: User interfaces (Computer systems) , Mobile computing
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/12845 , vital:27126
- Description: Existing information sharing methods on mobile devices require the user to repeat a series of steps to share one or more selected files with another individual, and the entire process must be repeated to share the same file(s) with multiple individuals. Due to constant advancements in mobile computing, mobile devices can provide new, more intuitive, and easier solutions for sharing information. Natural User Interfaces (NUIs) focus primarily on the reuse of existing knowledge (from other applications or activities) or human abilities (such as touch, speech, and gestures) to provide more accurate and usable alternatives to existing human-computer interaction (HCI) systems, transforming these abilities into interaction techniques. The main research objective was to design a proxemic NUI providing an accurate and usable solution to support information sharing among co-located mobile devices. The resulting system, MotionShare, enabled multiple devices to share information simultaneously using NUI interaction techniques. An initial calibration setup allowed MotionShare to calculate the approximate positions and orientations of every device in the environment, and these known positions enabled the implementation of novel NUI interaction techniques. MotionShare was evaluated using two techniques, namely analytical and experimental evaluation. The results showed device positioning to have a mean precision, trueness, and recall of 72.21%, 91.39%, and 71.63% respectively, and MotionShare's gestures to have a recall of 90.50% for the point gesture and 100.00% for the touch gesture. The experimental technique consisted of a pilot study (formative evaluation) and a usability evaluation (summative evaluation). The usability evaluation showed high user satisfaction, and statistical analysis revealed that MotionShare achieved the main research objective. These results also showed that participants preferred the touch gesture to the point gesture, but indicated that both gestures could be used for the tasks of MotionShare.
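The gesture accuracy figures above are standard classification metrics. As an illustration only, the sketch below shows how they are computed from outcome counts; the counts are hypothetical, chosen to show how a recall of 90.50% could arise, and are not MotionShare's actual data.

```python
# Illustrative sketch: precision and recall from classification counts.
# The numbers below are hypothetical, not the study's data.

def precision_recall(true_positives, false_positives, false_negatives):
    """Precision = TP / (TP + FP); recall = TP / (TP + FN)."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# e.g. 181 point gestures recognised correctly out of 200 performed,
# with 10 spurious detections (hypothetical counts)
precision, recall = precision_recall(181, 10, 19)
print(f"precision={precision:.4f}, recall={recall:.4f}")
```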
- Full Text:
- Date Issued: 2016
A business intelligence framework for supporting strategic sustainability information management in higher education
- Authors: Haupt, Ross
- Date: 2016
- Subjects: Business intelligence , Strategic planning -- Management
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/5319 , vital:20832
- Description: In the higher education sector, a number of Higher Education Institutions (HEIs) are playing a leading role in promoting sustainable initiatives. Managing these initiatives effectively, however, can be a complex task and requires data and information from multiple aspects of operations. In an HEI, operating sustainably means ensuring financial, social, environmental, and educational sustainability. To manage sustainability effectively, HEIs require an integrated tool that can provide information on all areas of sustainability. HEIs face a number of challenges in effectively managing sustainability information, such as siloed data and information, and poor sharing and communication of information. Business Intelligence (BI) can assist in overcoming many of the challenges organisations face in effectively managing strategic sustainability information. This study investigates both the constraints to effective sustainability information management and the challenges of BI, and proposes a BI framework to support effective strategic sustainability information management. Nelson Mandela Metropolitan University (NMMU) is one such HEI affected by the challenges of managing strategic sustainability information, and is therefore used as a case study in this research. A BI solution, Sustainable BI, was developed based on the proposed framework. Its main goal is to provide strategic management at NMMU with a tool that delivers integrated sustainability information and assists in overcoming the challenges of effectively managing strategic sustainability information. Sustainable BI was evaluated, through a usability study, by the members of strategic management responsible for sustainability at NMMU. The study revealed to what extent Sustainable BI could effectively manage strategic sustainability information at NMMU. The BI framework was iteratively improved based on the results of the evaluations. The contributions from this study are a model for sustainability management, a BI framework to support strategic sustainability information management, and a BI solution, Sustainable BI.
- Full Text:
- Date Issued: 2016
Assessing the productivity of selected container terminals in Africa using Data Envelopment Analysis (DEA)
- Authors: Mienie, Barend Jacobus , Brettenny, Warren
- Date: 2016
- Subjects: Data envelopment analysis -- Africa , Employees -- Rating of -- Africa
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/12054 , vital:27026
- Description: Data envelopment analysis (DEA) is used to assess the efficiency of 15 container terminals in Africa. The models proposed by Charnes, Cooper and Rhodes (1978) and Banker, Charnes and Cooper (1984) are used to determine and rank the efficiencies of the container terminals for 2013 and 2014. The results show that selected South African container terminals can improve their operations relative to some of their neighbours to the north. Bootstrapping methods are used to investigate and clarify the results, and the Malmquist Productivity Index (MPI) model is used to track and explain changes in efficiency over the assessment period.
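The CCR model of Charnes, Cooper and Rhodes evaluates each decision-making unit (DMU) by solving a linear program in multiplier form. The following sketch, using invented single-input, single-output data rather than the terminal data analysed in the study, shows how such efficiency scores can be computed with an off-the-shelf LP solver:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0, multiplier form:
    maximise u . y0 subject to v . x0 = 1 and u . yj - v . xj <= 0
    for every DMU j, with u, v >= 0.

    X: (n, m) array of inputs; Y: (n, s) array of outputs for n DMUs.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[j0], np.zeros(m)])             # maximise u . y0
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]  # v . x0 = 1
    A_ub = np.hstack([Y, -X])                             # u.yj - v.xj <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Hypothetical data: 3 DMUs, one input, one output
X = np.array([[2.0], [4.0], [8.0]])
Y = np.array([[2.0], [2.0], [4.0]])
scores = [ccr_efficiency(X, Y, j) for j in range(3)]
```

Under constant returns to scale the first DMU (output/input ratio 1.0) is efficient and the other two score 0.5; the BCC model of Banker, Charnes and Cooper adds a free variable to the multiplier form to allow variable returns to scale.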
- Full Text:
- Date Issued: 2016
Sustainability reporting guidelines for higher educational institutions in South Africa
- Authors: Zietsman, Jaco
- Date: 2019
- Subjects: Education, Higher -- South Africa , Corporation reports , Sustainability
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/33384 , vital:32754
- Description: In the higher education sector, a number of Higher Education Institutions (HEIs) are playing a leading role in promoting sustainable initiatives. Managing these initiatives effectively can be a complex task and requires data and information from multiple sources. HEIs must ensure financial, social, environmental, and educational sustainability. HEIs in South Africa are required to produce a sustainability report for the Department of Higher Education and Training (DHET) on an annual basis, but are not required to use a specific set of guidelines to create a report that complies with the DHET reporting requirements. HEIs face a number of challenges in effectively managing and reporting on sustainability information, such as poor sharing and communication of information and combining information from different sources into an integrated report. Well-structured guidelines that adhere to institutional standards and governmental reporting requirements can effectively streamline the sustainability reporting process. This study investigates the requirements and challenges of effective sustainability reporting for HEIs in South Africa. A set of Global Reporting Initiative (GRI) G4 guidelines was reworked to support effective sustainability reporting by South African HEIs. Nelson Mandela University is one such HEI affected by the challenges of managing and reporting on strategic sustainability information, and was therefore used as a case study in this research. An in-depth study was conducted of how prominent international universities apply the GRI guidelines to generate integrated sustainability reports for their specific and general reporting needs and requirements. Additionally, an in-depth study of the German integrated reporting guidelines for HEIs was conducted. Furthermore, a study of the South African DHET reporting requirements was conducted to explore the similarities that exist between the GRI G4 guidelines and the DHET requirements. The guidelines were evaluated by Nelson Mandela University personnel and academics. The final product is a set of GRI guidelines adapted to satisfy both GRI and DHET requirements for integrated sustainability reporting by South African HEIs. The contributions from this study are a set of adapted GRI G4 guidelines and examples for integrated sustainability reporting and management for HEIs in South Africa. The adapted guidelines were created with the assistance of the strategic management departments at Nelson Mandela University, have been reworded to be specifically applicable to South African HEIs, and contain instructions on how to generate an integrated sustainability report for a South African HEI.
- Full Text:
- Date Issued: 2019
A Social Media Method for Eliciting Millennials’ Worldviews on the Coastal and Marine Environment
- Authors: Okuah, Obrukevwe Anehwe
- Date: 2020
- Subjects: Millennialism -- Environmental aspects
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/48588 , vital:40893
- Description: A lack of involvement by participants in traditional data collection methods has led to insufficient data regarding millennials' worldviews on the coastal and marine environment. Understanding these worldviews could provide insights for policy interventions for sustainable use of the marine and coastal environment. The aim of this research was to design, develop and evaluate an appropriate social media method to elicit millennials' worldviews on the coastal and marine environment. The methodology used was Design Science Research (DSR), a recognised approach to conducting research in the field of Information Systems. The methods used were a literature review, interviews with social media experts and Social Media Influencers (SMIs), and a focus group discussion with researchers from the field of social sciences. The proposed artefact (the method) can guide researchers in engaging millennials on social media and eliciting their opinions and worldviews. The method includes a Social Media Influencer Model, which illustrates the relationship between SMIs' characteristics and techniques for engaging the public, and a Social Media Analytics (SMA) Process Model, which can guide researchers through the steps of eliciting worldviews from the public. Although several SMA techniques could be used, the proposed method uses sentiment analysis to derive sentiments from social media data. The method was evaluated by researchers who require a social media method for eliciting millennials' worldviews. The findings confirmed some of the techniques identified in the literature, as well as some additional techniques and processes. It was also evident that the method could assist researchers with data collection, specifically obtaining worldviews on the marine and coastal environment. The contribution of this study is an artefact that fulfils the need for a social media data collection method that is more convenient for researchers and millennials, and that guides researchers through the steps of eliciting worldviews from the public.
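Sentiment analysis, the SMA technique adopted in the method, can range from simple lexicon look-ups to machine-learned classifiers. A minimal lexicon-based sketch follows; the word lists are invented for illustration and are not the lexicon or tooling used in the study.

```python
# Minimal lexicon-based sentiment scorer. The word lists are illustrative
# placeholders, not the study's lexicon.
POSITIVE = {"love", "beautiful", "clean", "protect", "amazing"}
NEGATIVE = {"polluted", "dirty", "plastic", "destroyed", "litter"}

def sentiment(post: str) -> str:
    """Classify a post as positive, negative, or neutral by counting
    lexicon hits; a real SMA pipeline would add tokenisation, negation
    handling, and emoji support."""
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this beautiful clean beach"))   # positive
print(sentiment("so much plastic litter in the bay"))   # negative
```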
- Full Text:
- Date Issued: 2020
Good's causality for time series: a regime-switching framework
- Authors: Mlambo, Farai Fredric
- Date: 2014
- Subjects: Time-series analysis , Econometrics
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/6018 , vital:21025
- Description: Causal analysis plays a significant role in applied sciences such as statistics, econometrics, and technometrics. Probability-raising models in particular have attracted significant research interest, although most of the discussion in this area is philosophical in nature. Today, the econometric causality theory developed by C.W.J. Granger is popular in practical time series causal applications. While this type of causality technique has many strong features, it has serious limitations: in particular, the processes studied should be stationary, and causal relationships are restricted to be linear. Regime-switching processes, however, cannot be classified as linear and stationary. I.J. Good proposed a probabilistic, event-type explication of causality that circumvents some of the limitations of Granger's methodology. This work uses the probability-raising causality ideology, as postulated by Good, to propose a causal analysis methodology applicable in a stochastic, non-stationary domain. A Good's causality test is proposed by transforming the originally specified probabilistic causality theory from random events to a stochastic, regime-switching framework. Methodological validation was performed via causality simulations for a Markov regime-switching model. The proposed test can be used to detect, probabilistically, whether one stochastic process is causal to the observed behaviour of another. In particular, the regime-switch causality explication proposed herein is pivotal to the results articulated. This research also examines the power of the proposed test using simulations, and outlines some steps that one may take in using the test in a practical setting.
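Good's explication rests on probability raising: an event C is a prima facie cause of E when P(E | C) > P(E | not C). The simulation sketch below (the transition matrix and event probabilities are invented for illustration; this is not the thesis's test statistic) shows the criterion detecting a regime-driven effect in a two-state Markov regime-switching setting:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state Markov regime process:
# P[i, j] = P(S_t = j | S_{t-1} = i)
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])
T = 200_000
u = rng.random(T)
S = np.empty(T, dtype=int)
S[0] = 0
for t in range(1, T):
    S[t] = int(u[t] < P[S[t - 1], 1])  # move to regime 1 with prob P[s, 1]

# Effect series: the event E_t fires more often while the chain is in regime 1
E = rng.random(T) < np.where(S == 1, 0.6, 0.2)

C = S == 1  # candidate cause: "the process is in regime 1"
p_e_given_c = E[C].mean()
p_e_given_not_c = E[~C].mean()
# Probability raising (Good): C is a prima facie cause of E when
# P(E | C) > P(E | not C); here the estimates should land near 0.6 vs 0.2
print(p_e_given_c, p_e_given_not_c)
```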
- Full Text:
- Date Issued: 2014
A model for supporting environmental awareness in higher education using social media
- Authors: Tlebere, Thabo Eugene
- Date: 2013
- Subjects: Social media , Environmental education , Universities and colleges
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: vital:10496 , http://hdl.handle.net/10948/d1020820
- Description: University sustainability is a field of research that has gained increased interest in recent years. The reduction of environmental impact has become a strategic objective of universities globally, and universities have been prompted to take the necessary action to ensure that their environmental impact is minimised. The environmental component of sustainability deals with the conservation of the earth's natural resources so that future generations can also have access to them. Human beings, due to their increasing needs, are accountable for the exploitation of natural resources and are regarded as the main contributors to imbalances in natural systems. Environmental concerns such as global warming, deforestation, waste disposal, and ozone depletion are the outcomes of the damage caused by humans to the environment. The aim of environmental education is to promote remediation of the environment by making individuals aware of the environment and educating them about how to live a more sustainable lifestyle. Environmental awareness is perceived as knowledge of the factors that affect the environment and sensitivity towards the environment. Higher Education Institutions (HEIs) bear the responsibility of educating individuals about environmental issues, since they educate future leaders in society who may have an influence on future environmental conditions. Social media are capable of delivering information to a large spectrum of audiences at a low cost. The Pew Internet & American Life Project reported that the number of adults who utilise social media increased by 57 percent from 2005 to 2011. Several environmental activist organisations utilise social media to carry out environmental awareness campaigns. In this study, two environmental awareness campaigns powered by social media were conducted to improve the environmental awareness of individuals in a higher education environment.
A Social media Model for ENvironmental Awareness (SMENA) was developed to facilitate the environmental awareness campaigns. The SMENA includes a website, social media as well as theoretical guidelines for creating environmental awareness campaigns, and for using social media for environmental awareness campaigns. A case study at the Nelson Mandela Metropolitan University (NMMU) was used to empirically evaluate SMENA. Students at the Department of Computer Sciences of NMMU were exposed to information about environmental issues through social media with the intention of improving their environmental knowledge and awareness. The SMENA website usability was rated positively and students enjoyed the blogs and information distributed by means of social media. The results of the study intervention were positive and showed that social media can be used to improve the environmental knowledge of students. This study provides a valuable contribution to both the field of environmental education and social media usage and acceptance. The guidelines and requirements for using social media to improve environmental awareness provided in this study can be used to assist educators and university management with addressing the problems of reducing environmental impact.
- Full Text:
- Date Issued: 2013
- Authors: Tlebere, Thabo Eugene
- Date: 2013
- Subjects: Social media , Environmental education , Universities and colleges
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: vital:10496 , http://hdl.handle.net/10948/d1020820
- Description: University sustainability is a field of research that has been gaining increased interest in recent years. The reduction of environmental impact has become a strategic objective of universities globally. Universities have been prompted to take necessary action to ensure that their environmental impact is at a minimum. The environmental component of sustainability deals with the current conservation of the earth’s natural resources so that future generations can also have access to them. Human beings, due to their increasing needs, are accountable for the exploitation of natural resources. They are regarded as the main contributors to imbalances in the natural systems. Environmental concerns such as global warming, deforestations, disposal of wastes, and ozone reduction are the outcomes of the damage caused by humans on the environment. The aim of environmental education is to acquire remediation of the environment by making individuals aware of the environment and by educating them about how to live a more sustainable lifestyle. Environmental awareness is perceived as knowledge of the factors that affect the environment and having sensitivity towards the environment. Higher Education Institutions (HEIs) bear the responsibility of educating individuals about environmental issues since they provide education to future leaders in society who may have an influence on future conditions in the environment. Social media are capable of delivering information to a large spectrum of audiences at a low cost. The Pew Internet American Life Project reported that the number of adults who utilise social media has increased by 57 percent from 2005 to 2011. Several environmental activist organisations utilise social media to carry out environmental awareness campaigns. In this study two environmental awareness campaigns which were powered by social media were conducted to improve environmental awareness of individuals in a higher education environment. 
A Social media Model for ENvironmental Awareness (SMENA) was developed to facilitate the environmental awareness campaigns. SMENA includes a website, social media, and theoretical guidelines for creating environmental awareness campaigns and for using social media in such campaigns. A case study at the Nelson Mandela Metropolitan University (NMMU) was used to empirically evaluate SMENA. Students in the Department of Computer Sciences at NMMU were exposed to information about environmental issues through social media with the intention of improving their environmental knowledge and awareness. The usability of the SMENA website was rated positively, and students enjoyed the blogs and information distributed by means of social media. The results of the study intervention were positive and showed that social media can be used to improve the environmental knowledge of students. This study provides a valuable contribution to both the field of environmental education and that of social media usage and acceptance. The guidelines and requirements for using social media to improve environmental awareness provided in this study can be used to assist educators and university management in addressing the problem of reducing environmental impact.
- Full Text:
- Date Issued: 2013
Using data analysis and information visualization techniques to support the effective analysis of large financial data sets
- Authors: Nyumbeka, Dumisani Joshua
- Date: 2016
- Subjects: Information visualization Finance -- Mathematical models , Database management
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/12983 , vital:27141
- Description: There have been a number of technological advances in the last ten years, which have resulted in the amount of data generated in organisations increasing by more than 200% during this period. This rapid increase in data means that if financial institutions are to derive significant value from this data, they need to identify new ways to analyse it effectively. Due to the considerable size of the data, financial institutions also need to consider how to visualise the data effectively. Traditional tools such as relational database management systems have problems processing large amounts of data due to memory constraints, latency issues and the presence of both structured and unstructured data. The aim of this research was to use data analysis and information visualisation (IV) techniques to support the effective analysis of large financial data sets. In order to analyse the data visually and effectively, the underlying data model must produce results that are reliable. A large financial data set was identified and used to demonstrate that IV techniques can be used to support the effective analysis of large financial data sets. A review of the literature on large financial data sets, visual analytics, and existing data management and data visualisation tools identified the shortcomings of existing tools. This resulted in the determination of the requirements for the data management tool and the IV tool. The data management tool identified was a data warehouse and the IV toolkit identified was Tableau. The IV techniques identified included the Overview, Dashboards and Colour Blending. The IV tool was implemented and published online and can be accessed through a web browser interface. The data warehouse and the IV tool were evaluated to determine their accuracy and effectiveness in supporting the effective analysis of the large financial data set. 
The experiment used to evaluate the data warehouse yielded positive results, showing that only about 4% of the records had incorrect data. The results of the user study were positive and no major usability issues were identified. The participants found the IV techniques effective for analysing the large financial data set.
- Full Text:
- Date Issued: 2016
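The warehouse evaluation above reported the proportion of records containing incorrect data (about 4%). As an illustrative sketch only, such a record-level quality check can be expressed as a validation pass; the field names and validation rules below are hypothetical, since the study's actual schema is not given here:

```python
# Minimal sketch of a record-level data-quality check over a financial
# data set. Field names ("amount", "account_id") and rules are hypothetical.

def is_valid(record):
    """A record is 'correct' if required fields are present and sane."""
    try:
        return (
            record["amount"] is not None
            and float(record["amount"]) >= 0
            and len(record["account_id"]) > 0
        )
    except (KeyError, TypeError, ValueError):
        return False

def error_rate(records):
    """Percentage of records failing validation."""
    bad = sum(1 for r in records if not is_valid(r))
    return 100.0 * bad / len(records)

records = [
    {"amount": "120.50", "account_id": "A1"},
    {"amount": "-5", "account_id": "A2"},        # negative amount: invalid
    {"amount": None, "account_id": "A3"},        # missing value: invalid
    {"amount": "42", "account_id": "A4"},
]
print(f"{error_rate(records):.1f}% of records failed validation")  # 50.0%
```

In practice such checks would run as queries against the warehouse itself; the standalone function form above is only for illustration.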
A web-based repository for student mobility data in Africa
- Authors: Ferreira, Darren Bradley
- Date: 2016
- Subjects: Student mobility Students, Transfer of -- South Africa -- Psychological aspects , Education, Higher Foreign study
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/12605 , vital:27098
- Description: The number of international students studying abroad has doubled since the year 2000, and nearly five million students are enrolled outside their country of origin. Over the past ten years, new insights and approaches to the internationalisation of higher education have arisen, which have influenced global research and education. Student mobility data is a component of internationalisation data. Internationally mobile students are defined as students who have crossed international borders from their countries with the objective to study. Currently, there are several international organisations and projects that manage student mobility data from various Higher Education Institutions (HEIs) across the globe and report on this data. Two of these are the Open Doors Report and Project Atlas. These organisations collect data for Africa, although it is not as detailed and useful as the data provided for other countries. Since the number of students studying abroad has doubled since the year 2000, the amount of student mobility data kept by data collection agencies and HEIs has also increased. The data collected is not always accurate, and this poses a data management problem. This study conducted a survey sent to international offices at various HEIs in South Africa and elsewhere in Africa. The survey investigated the current state of student mobility data management in HEIs. The survey results revealed that the international offices are currently dissatisfied with student mobility data management and would be willing to provide international student data to an African data repository. This study proposes the design and development of a web-based student mobility data repository, known as the African International Portal (AIP). The study identified design guidelines and requirements for a web-based data repository. The requirements, design and design guidelines were used to guide the development of the prototype. 
Heuristic evaluations were conducted on the prototype in order to identify any major usability problems. Findings revealed that the overall perceptions of the prototype were positive and can be attributed to the design considerations and guidelines used during the development phase. The prototype was evaluated using a full usability evaluation that determined the usefulness, effectiveness and efficiency of the prototype when users are in the process of managing student mobility data. The results indicate that the participants found the AIP to be an effective, efficient and a satisfactory means of managing student mobility data.
- Full Text:
- Date Issued: 2016
A model for context awareness for mobile applications using multiple-input sources
- Authors: Pather, Direshin
- Date: 2015
- Subjects: Context-aware computing , Mobile apps , MIMO systems
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/2969 , vital:20378
- Description: Context-aware computing enables mobile applications to discover and benefit from valuable context information, such as user location, time of day and current activity. However, determining the user’s context throughout their daily activities is one of the main challenges of context-aware computing. With the increasing number of built-in mobile sensors and other input sources, existing context models do not effectively handle context information related to personal user context. The objective of this research was to develop an improved context-aware model to support the context awareness needs of mobile applications. An existing context-aware model was selected as the most complete model to use as a basis for the proposed model to support context awareness in mobile applications. The existing context-aware model was modified to address the shortcomings of existing models in dealing with context information related to personal user context. The proposed model supports four different context dimensions, namely Physical, User Activity, Health and User Preferences. A prototype, called CoPro, was developed based on the proposed model to demonstrate the effectiveness of the model. Several experiments were designed and conducted to determine whether CoPro was effective, reliable and capable. CoPro was considered effective as it produced low-level context as well as inferred context. The reliability of the model was confirmed by evaluating CoPro using Quality of Context (QoC) metrics such as Accuracy, Freshness, Certainty and Completeness. CoPro was also found to be capable of dealing with the limitations of the mobile computing platform, such as limited processing power. The research determined that the proposed context-aware model can be used to successfully support context awareness in mobile applications. 
Design recommendations were proposed and future work will involve converting the CoPro prototype into middleware in the form of an API to provide easier access to context awareness support in mobile applications.
- Full Text:
- Date Issued: 2015
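The QoC metrics named above quantify how trustworthy a context reading is. Two of them, Freshness and Completeness, can be sketched as follows; the linear-decay freshness formula and the attribute names are common illustrative definitions, not necessarily those used by CoPro:

```python
# Sketch of two Quality of Context (QoC) metrics for a single context
# reading. Formulas and attribute names are illustrative assumptions.
import time

def freshness(measured_at, lifetime_s, now=None):
    """1.0 for a brand-new reading, decaying linearly to 0 at end of life."""
    now = time.time() if now is None else now
    age = now - measured_at
    return max(0.0, 1.0 - age / lifetime_s)

def completeness(reading, required_attrs):
    """Fraction of required context attributes actually present."""
    present = sum(1 for a in required_attrs if reading.get(a) is not None)
    return present / len(required_attrs)

now = 1_000_000.0
reading = {"lat": -33.96, "lon": 25.61, "activity": None, "heart_rate": 72}
print(freshness(measured_at=now - 30, lifetime_s=120, now=now))  # 0.75
print(completeness(reading, ["lat", "lon", "activity", "heart_rate"]))  # 0.75
```

A context model can combine such per-reading scores to decide whether inferred context (e.g. current activity) is reliable enough to act on.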
Modelling of size-based portfolios using a mixture of normal distributions
- Authors: Janse Van Rensburg, Stéfan
- Date: 2009
- Subjects: Capital assets pricing model
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: vital:10569 , http://hdl.handle.net/10948/985 , Capital assets pricing model
- Description: From option pricing using the Black and Scholes model, to determining the significance of regression coefficients in a capital asset pricing model (CAPM), the assumption of normality was pervasive throughout the field of finance. This was despite evidence that financial returns were non-normal, skewed and heavy-tailed. In addition to non-normality, there remained questions about the effect of firm size on returns. Studies examining these differences were limited to examining the mean return, with respect to an asset pricing model, and did not consider higher moments. Janse van Rensburg, Sharp and Friskin (in press) attempted to address both the problem of non-normality and size simultaneously. They (Janse van Rensburg et al, in press) fitted a mixture of two normal distributions, with common mean but different variances, to a small capitalisation portfolio and a large capitalisation portfolio. Comparison of the mixture distributions yielded valuable insight into the differences between the small and large capitalisation portfolios' risk. Janse van Rensburg et al (in press), however, identified several shortcomings within their work. These included data problems, such as survivorship bias and the exclusion of dividends, and the questionable use of standard statistical tests in the presence of non-normality. This study sought to correct the problems noted in the paper by Janse van Rensburg et al (in press) and to expand upon their research. To this end survivorship bias was eliminated and an effective dividend was included in the return calculations. Weekly data were used, rather than the monthly data of Janse van Rensburg et al (in press). More portfolios, over shorter holding periods, were considered. This allowed the authors to test whether Janse van Rensburg et al's (in press) findings remained valid under conditions different to their original study. 
Inference was also based on bootstrapped statistics, in order to circumvent problems associated with non-normality. Additionally, several different specifications of the normal mixture distribution were considered, as opposed to only the two-component scale mixture. In the following, Chapter 2 provided a literature review of previous studies on return distributions and size effects. The data, data preparation and portfolio formation were discussed in Chapter 3. Chapter 4 gave an overview of the statistical methods and tests used throughout the study. The empirical results of these tests, prior to risk adjustment, were presented in Chapter 5. The impact of risk adjustment on the distribution of returns was documented in Chapter 6. The study ended, in Chapter 7, with a summary of the results and suggestions for future research.
- Full Text:
- Date Issued: 2009
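The two-component scale mixture described above, two normal distributions sharing a common mean but with different variances, is typically fitted by expectation-maximisation (EM). The sketch below is illustrative only (synthetic data, a plain EM loop); it is not the estimation procedure actually used in the study:

```python
# Illustrative EM fit of p*N(mu, v1) + (1-p)*N(mu, v2) with a shared mean.
import math
import random

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_scale_mixture(xs, iters=100):
    """EM for a two-component scale mixture with common mean mu."""
    mu = sum(xs) / len(xs)
    v = sum((x - mu) ** 2 for x in xs) / len(xs)
    p, v1, v2 = 0.5, 0.5 * v, 2.0 * v          # start the variances apart
    for _ in range(iters):
        # E-step: responsibility of the low-variance component for each x
        r = []
        for x in xs:
            a = p * normal_pdf(x, mu, v1)
            b = (1 - p) * normal_pdf(x, mu, v2)
            r.append(a / (a + b))
        # M-step: mixing weight, precision-weighted common mean, variances
        n1 = sum(r)
        p = n1 / len(xs)
        w = [ri / v1 + (1 - ri) / v2 for ri in r]
        mu = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
        v1 = sum(ri * (x - mu) ** 2 for ri, x in zip(r, xs)) / n1
        v2 = sum((1 - ri) * (x - mu) ** 2 for ri, x in zip(r, xs)) / (len(xs) - n1)
    return p, mu, v1, v2

# Synthetic "returns": 70% calm regime (sd 1), 30% volatile regime (sd 3)
random.seed(0)
xs = [random.gauss(0, 1) if random.random() < 0.7 else random.gauss(0, 3)
      for _ in range(2000)]
p, mu, v1, v2 = fit_scale_mixture(xs)
print(f"p={p:.2f} mu={mu:.3f} v1={v1:.2f} v2={v2:.2f}")
```

Because the components share a mean, the common-mean update is precision-weighted rather than a simple responsibility-weighted average; comparing the fitted variances of the two regimes is what yields the risk insight described in the abstract.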
Index optimisation for structural equation models (SEM)
- Authors: Stindt, Carmen
- Date: 2018
- Subjects: Structural equation modelling
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/17919 , vital:28518
- Description: Structural equation modelling (SEM), a statistical technique used extensively in quantitative marketing research and other domains, is an analytical approach used to model latent (unobservable) variables. Unlike distribution fitting, where a simple chi-squared goodness-of-fit assessment yields satisfactory results, assessing model fit in SEM is more difficult. Descriptive goodness-of-fit indices have been developed over the past 50 years to assist in the assessment of model fit. The traditional assessment method requires reporting multiple indices, all of which should reflect an adequate model fit in order for the overall model fit to be deemed good. The choice of indices to report is left to the researcher’s discretion, leading the indices used to differ considerably. The combination of the traditional assessment method and differing indices often leads to conflicting results. This study proposes a composite index, combining frequently used indicators, in an attempt to obtain a single-index method for assessing model fit in SEM that performs better than the traditional assessment method. Composite indices have been used in other domains as an improved method of assessing performance (Barr and Kantor, 2004). The composite index proposed is evaluated using a Monte Carlo simulation study under different experimental conditions. The experimental conditions investigated are sample size, estimation method and model misspecification. These experimental conditions were chosen because each has been shown to affect the performance of the traditional indices. The ideal fit indices should be able to detect model misspecification while being insensitive to sample size and estimation method. This is not always the case with the traditional indices. The composite index proposed is shown to outperform the traditional assessment method under many of the experimental condition combinations. 
This provides evidence that composite indices may be a more beneficial method of assessing model fit in SEM.
- Full Text:
- Date Issued: 2018
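The idea of collapsing several fit indices into one score can be illustrated with a toy composite. The indices chosen here (CFI, TLI, RMSEA, SRMR), the 0.10 cut-off and the equal weighting are hypothetical choices for illustration, not the composite actually proposed in the study:

```python
# Toy composite goodness-of-fit index combining common SEM indices.
# Weights, cut-offs and normalisations are hypothetical.

def composite_index(cfi, tli, rmsea, srmr):
    """Map each index onto [0, 1] with 1 = perfect fit, then average.

    CFI/TLI already lie on roughly [0, 1] with 1 best. RMSEA and SRMR
    are badness-of-fit measures, so they are inverted against a cut-off
    (0.10 treated as the worst acceptable value, floored at 0).
    """
    good_rmsea = max(0.0, 1.0 - rmsea / 0.10)
    good_srmr = max(0.0, 1.0 - srmr / 0.10)
    parts = [cfi, tli, good_rmsea, good_srmr]
    return sum(parts) / len(parts)

# A well-fitting model versus a misspecified one
print(composite_index(cfi=0.97, tli=0.96, rmsea=0.04, srmr=0.03))  # 0.8075
print(composite_index(cfi=0.85, tli=0.82, rmsea=0.11, srmr=0.09))  # lower
```

A single score like this sidesteps the "which indices to report" problem, at the cost of baking the cut-offs and weights into the index, which is exactly the kind of design choice a Monte Carlo study would need to validate.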
An Internet of things model for field service automation
- Authors: Kapeso, Mando Mulabita
- Date: 2017
- Subjects: Internet of things Manufacturing processes -- Automation , Automation
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/18641 , vital:28698
- Description: Due to the competitive nature of the global economy, organisations are continuously seeking ways of cutting costs and increasing efficiency to gain a competitive advantage. Field service organisations that offer after-sales support seek to gain a competitive advantage through downtime minimisation. Downtime is the time between a service request, made by a customer or triggered by equipment failure, and the completion of the service to rectify the problem by the field service team. Researchers have identified downtime as one of the key performance indicators for field service organisations. The lack of real-time access to information and the inaccuracy of information are factors which contribute to the poor management of downtime. Various technology advancements have been adopted to address some of the challenges faced by field service organisations through automation. The emergence of the Internet of Things (IoT) has brought new enhancement possibilities to various industries, for instance, the manufacturing industry. The main research question that this study aims to address is “How can an Internet of Things be used to optimise field service automation?” The main research objective was to develop and evaluate a model for the optimisation of field services using IoT features and technologies. The model aims at addressing challenges associated with the inaccuracy and/or lack of real-time access to information during downtime. The model developed is the theoretical artefact of the research methodology used in this study, which is the Design Science Research Methodology (DSRM). The DSRM activities were adopted to fulfil the research objectives of this research. A literature review in the field services domain was conducted to establish the problems faced by field service organisations. Several interviews were held to verify the problems of field service management (FSM) identified in the literature and some potential solutions. 
During the design and development activity of the DSRM methodology, an IoT model for field service automation (FSA) was designed. The model consists of: the Four Layered Architecture; the Three Phase Data Flow Process; and definitions and descriptions of IoT-based elements and functions. The model was then used to drive the design, development and evaluation of a proof-of-concept prototype, the KapCha prototype. KapCha enables the optimisation of FSA using IoT techniques and features. A sub-component of the KapCha system was implemented in fulfilment of the research. The implementation of KapCha was applied to the context of a smart lighting environment in the case study. A two-phase evaluation was conducted to review both the theoretical model and the KapCha prototype. The model and KapCha prototype were evaluated using the Technical Risk and Efficacy evaluation strategy from the Framework for Evaluation of Design Science (FEDS). The Technical Risk and Efficacy strategy made use of formative, artificial-summative and naturalistic-summative methods of evaluation. An artificial-summative evaluation was used to evaluate the design of the model. Iterative formative evaluations were conducted during the development of KapCha. KapCha was then placed in real-environment conditions and a naturalistic-summative evaluation was conducted. The naturalistic-summative evaluation was used to determine the performance of KapCha under real-world conditions and to evaluate the extent to which it addresses the FSA problems identified, such as real-time communication and automated fault detection.
- Full Text:
- Date Issued: 2017
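A data flow of the kind the Three Phase Data Flow Process describes (sense, process, act) can be sketched for the smart-lighting case as follows; the thresholds, device names and the fault rule are hypothetical, not taken from the KapCha implementation:

```python
# Minimal sense -> process -> act pipeline sketch for a smart-lighting
# fault-detection scenario. All names and thresholds are hypothetical.

FAULT_THRESHOLD_LUX = 50  # reading below this while powered => suspected fault

def collect(readings):
    """Phase 1: gather raw sensor readings, dropping incomplete ones."""
    return [r for r in readings if r.get("lux") is not None]

def process(samples):
    """Phase 2: flag lights that are powered on but emitting little light."""
    return [s["device"] for s in samples
            if s["powered"] and s["lux"] < FAULT_THRESHOLD_LUX]

def act(faulty):
    """Phase 3: raise a service request per suspected faulty device."""
    return [f"service-request:{d}" for d in faulty]

raw = [
    {"device": "lamp-1", "powered": True, "lux": 320},
    {"device": "lamp-2", "powered": True, "lux": 12},   # likely faulty
    {"device": "lamp-3", "powered": False, "lux": 0},   # switched off: fine
    {"device": "lamp-4", "powered": True, "lux": None}, # dropped reading
]
print(act(process(collect(raw))))  # -> ['service-request:lamp-2']
```

Automating the detect-and-dispatch step is what shrinks the downtime window described above: the service request is raised by the system rather than waiting for a customer report.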
- Authors: Kapeso, Mando Mulabita
- Date: 2017
- Subjects: Internet of things Manufacturing processes -- Automation , Automation
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/18641 , vital:28698
- Description: Due to the competitive nature of the global economy, organisations are continuously seeking ways of cutting costs and increasing efficiency to gain a competitive advantage. Field service organisations that offer after sales support seek to gain a competitive advantage through downtime minimisation. Downtime is the time between service requests made by a customer or triggered by equipment failure and the completion of the service to rectify the problem by the field service team. Researchers have identified downtime as one of the key performance indicators for field service organisations. The lack of real-time access to information and inaccuracy of information are factors which contribute to the poor management of downtime. Various technology advancements have been adopted to address some of the challenges faced by field service organisations through automation. The emergence of an Internet of Things (IoT), has brought new enhancement possibilities to various industries, for instance, the manufacturing industry. The main research question that this study aims to address is “How can an Internet of Things be used to optimise field service automation?” The main research objective was to develop and evaluate a model for the optimisation of field services using an IoT’s features and technologies. The model aims at addressing challenges associated with the inaccuracy or/and lack of real-time access to information during downtime. The model developed is the theoretical artefact of the research methodology used in this study which is the Design Science Research Methodology (DSRM). The DSRM activities were adopted to fulfil the research objectives of this research. A literature review in the field services domain was conducted to establish the problems faced by field service organisations. Several interviews were held to verify the problems of FSM identified in literature and some potential solutions. 
During the design and development activity of the DSRM, an IoT model for FSA was designed. The model consists of: the Four Layered Architecture; the Three Phase Data Flow Process; and definitions and descriptions of IoT-based elements and functions. The model was then used to drive the design, development, and evaluation of a “proof of concept” prototype, the KapCha prototype. KapCha enables the optimisation of FSA using IoT techniques and features. A sub-component of the KapCha system was implemented in fulfilment of the research, and the implementation was applied to the context of a smart lighting environment in the case study. A two-phase evaluation was conducted to review both the theoretical model and the KapCha prototype. The model and KapCha prototype were evaluated using the Technical Risk and Efficacy evaluation strategy from the Framework for Evaluation of Design Science (FEDS). The Technical Risk and Efficacy strategy made use of formative, artificial-summative and summative-naturalistic methods of evaluation. An artificial-summative evaluation was used to evaluate the design of the model. Iterative formative evaluations were conducted during the development of KapCha. KapCha was then placed in real-environment conditions and a summative-naturalistic evaluation was conducted. The summative-naturalistic evaluation was used to determine the performance of KapCha under real-world conditions and to evaluate the extent to which it addresses the FSA problems identified, such as real-time communication and automated fault detection.
- Full Text:
- Date Issued: 2017
A framework for co-located collaborative business process modelling using touch technologies
- Authors: Snyman, Irene
- Date: 2013
- Subjects: Reengineering (Management) , Information technology
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: vital:10498 , http://hdl.handle.net/10948/d1021015
- Description: In recent years the field of Business Process Modelling (BPM) has gained increasing attention from both the business and research communities. One of the primary drivers for BPM is the improved understanding of Business Processes (BPs) and the competitive advantage gained over competitors. In addition, BPM can improve communication in an organisation and facilitate increased support for change management. BPM is a collaborative activity that needs to be carried out in a team environment, and Collaborative Business Process Modelling (CBPM) promotes improved readability, accuracy and quality of process models as well as a reduced workload for modellers. In spite of the increased popularity of CBPM, there is limited research related to the collaborative nature of the modelling tasks performed by modellers and specifically to the synchronisation of shared process models. In addition, tools and techniques to support CBPM do not support this synchronisation effectively or efficiently. This study proposes a conceptual framework for CBPM using touch technologies in a co-located collaborative environment. The main research problem addressed by this study is that modellers experience difficulties conducting BPM activities in a co-located collaborative environment. In order to address the research problem and clarify and elaborate on the problems of CBPM, a two-fold approach was undertaken. Firstly, after an in-depth literature review, a BPM survey was designed and then sent to modellers in South African Information Technology (IT) consulting companies in order to provide a more in-depth understanding of the status and challenges of CBPM in IT consulting organisations. The results revealed that available BPM software does not adequately cater for CBPM and that software tools do not enforce versioning and synchronisation. 
In addition, hardware constraints were reported, as well as problems with integrating the different parts of the process model that the modellers were working on. The results of the survey also showed that the positive aspects of CBPM are that ideas could be shared and that overall there is a better understanding of the BPs being modelled. The second part of the problem elaboration consisted of usability field studies with participants from both education and industry using a traditional, popular BPM software tool, Enterprise Architect (EA). Whilst several benefits of CBPM were confirmed, several challenges were encountered, particularly with regard to the integration and synchronisation of models. To overcome the problems of CBPM, a framework was developed that allows for co-located CBPM using tablet PCs. The framework includes a developed prototype of the BPMTouch software which runs on tablet PCs, as well as some theoretical aspects of CBPM. The BPMTouch software supports effective and efficient CBPM and the synchronisation of process models since it allows multiple modellers to work together on one BP model, with each modeller using his/her own tablet. If one modeller makes changes to the model, the changes are immediately reflected on the tablets of the other modellers since the changes to the model are updated in real time. Modellers cannot draw on the same model simultaneously; however, everyone can see what the active modeller (the active participant with the green flag) is doing. Other participants can then become the active modeller and make changes to the model once the flag has been released and re-allocated. The results from the field studies, industry surveys and usability evaluations were all incorporated into the BPMTouch software tool design and into the aspects of CBPM in order to assist with the process of co-located CBPM using touch technologies. 
Usability evaluations were carried out in which industry and student participants used BPMTouch to simultaneously and synchronously create an integrated process model. The evaluations of the BPMTouch prototype revealed that participants prefer this system over traditional BPM software since BPMTouch removes the need for post-modelling integration. The theoretical contribution of the framework consists of aspects proposing that organisations should take the potential benefits and challenges of CBPM into consideration and address the Critical Success Factors (CSFs) before embarking on a CBPM project. These aspects can help with decisions relating to CBPM. The use of this framework can improve the quality of process models, reduce the workload of modellers and in this way increase the success rate of CBPM projects.
- Full Text:
- Date Issued: 2013
An enterprise architecture for environmental information management and reporting
- Authors: Van der Hoogen, Anthea
- Date: 2013
- Subjects: Business enterprises -- Information technology -- Management , Sustainable development reporting , Environmental reporting
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: vital:10499 , http://hdl.handle.net/10948/d1021040
- Description: Organisations globally are communicating their environmental sustainability impact to stakeholders by means of the widely used sustainability report. A key benefit of environmental sustainability reporting is that organisations can gain a positive reputation when these reports are presented to stakeholders. Organisations in South Africa are faced with many challenges regarding managing sustainability information and producing an environmental sustainability report. Two of the primary challenges are the many diverse standards for sustainability reporting and data quality issues. Information Technology (IT) can be used to support and improve the process of sustainability reporting, but it is important to align the environmental sustainability strategies with the strategies of the business and also with the IT strategy to avoid silos of information and reporting. Enterprise Architecture (EA) can be used to solve alignment problems since it supports business-IT alignment. EA is defined by the International Organization for Standardization (ISO) as “The fundamental concepts or properties of a system in its environment embodied in its elements, relationships, and in the principles of its design and evolution”. It can be argued, therefore, that EA can be used to support environmental sustainability information management and the reporting process by means of its support of improved business-IT alignment and ultimately integrated systems. The main objective of this study is to investigate how EA can be used to support environmental information management (EIM) and reporting. A survey study of thirty-one prominent South African organisations was undertaken in order to investigate the status of their EA adoption and their environmental reporting and EIM processes. An EA for EIM Toolkit and a set of guidelines are proposed which can provide support for EIM through the use of EA. 
These guidelines were proposed based on best practice for each of the three process levels of an organisation, namely, the strategic level, the operational level and the technological level. The toolkit and guidelines were derived from theory and the results of the industry survey, and were then validated by an in-depth analysis of a case study consisting of multiple cases with key employees of seven South African organisations which have proved to be successful at EA, EIM and reporting. The results of the case study show that the EA for EIM Toolkit and related guidelines can assist organisations to align their environmental sustainability strategies with their organisational and IT strategies.
- Full Text:
- Date Issued: 2013
A blended learning toolkit that accommodates multiple learning styles
- Authors: Mills, Steven Christopher
- Date: 2019
- Subjects: Blended learning , Learning strategies Learning
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/30452 , vital:30945
- Description: The purpose of this study was to identify how blended learning can be designed and incorporated to accommodate multiple learning styles within modules in the Department of Computing Sciences. A design theory was created through an analysis of the literature and an exploration of the backgrounds of students and lecturers within the Department of Computing Sciences. The design theory is: Blended learning can be a useful approach to accommodate multiple learning styles. Guidelines, and by extension a toolkit, facilitate the development of blended learning and provide effective tools to enable lecturers to successfully incorporate blended learning into their modules. Design-Based Research (DBR) was followed in this study, using a mixed-methods and iterative approach to determine the accuracy of the design theory. For the first iteration, the toolkit was implemented in two modules within the Department of Computing Sciences and for the second iteration, in four modules. DBR produces a theoretical contribution and a practical artefact. The most important theoretical contributions are the design theory and the guidelines for incorporating blended learning that accommodates multiple learning styles. The practical artefacts are the toolkit and the tools therein. The toolkit, which was accessed via a website, guides lecturers through the process of incorporating blended learning that accommodates multiple learning styles and provides them with the necessary tools to do so. The design theory was validated in an evaluation that used a questionnaire to understand the lecturers’ experiences regarding the toolkit and the design theory. Therefore, the guidelines for applying blended learning are a useful approach to addressing multiple learning styles.
- Full Text:
- Date Issued: 2019
A forecasting model for photovoltaic module energy production
- Authors: Swanepoel, Paul
- Date: 2011
- Subjects: Photovoltaic power systems -- Forecasting
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: vital:10563 , http://hdl.handle.net/10948/1420 , Photovoltaic power systems -- Forecasting
- Description: Energy is of concern for governments and economies all over the world. As conventional methods of energy production face the prospect of depleting fossil fuel reserves, economies are facing energy risks. With this tension, various threats arise in terms of energy supply security. A shift from intensive fossil fuel consumption to alternative energy consumption, combined with the calculated use of fossil fuels, needs to be implemented. Using the energy radiated from the sun and converted to electricity through photovoltaic energy conversion is one of the alternative and renewable sources to address the limited fossil fuel dilemma. South Africa receives an abundance of sunlight irradiance, but limited knowledge of the implementation and possible energy yield of photovoltaic energy production in South Africa is available. Photovoltaic energy yield knowledge is vital in applications for farms, rural areas and remote transmitting devices where the construction of electricity grids is not cost-effective. In this study various meteorological and energy parameters relating to photovoltaics were captured in Port Elizabeth (South Africa) and analysed, with data being recorded every few seconds. A model for mean daily photovoltaic power output was developed, and the relationships between the independent variables were analysed. The model can forecast mean daily photovoltaic power output using only temperature-derived variables and time. The mean daily photovoltaic power model can then easily be used to forecast daily photovoltaic energy output using the number of sunlight seconds in a given day.
- Full Text:
- Date Issued: 2011
Automated statistical audit system for a government regulatory authority
- Authors: Xozwa, Thandolwethu
- Date: 2015
- Subjects: Auditing -- Statistical methods -- Data processing , Mathematica (Computer program language)
- Language: English
- Type: Thesis , Masters , MCom
- Identifier: http://hdl.handle.net/10948/6061 , vital:21035
- Description: Governments all over the world are faced with numerous challenges while running their countries on a daily basis. The predominant challenges which arise are those which involve statistical methodologies. Official statistics are very important to South Africa’s infrastructure, and because of this it is important that an effort is made to reduce the challenges that occur during the development of official statistics. For official statistics to be developed successfully, quality standards need to be built into an organisational framework and form a system of architecture (Statistics New Zealand 2009:1). Therefore, this study seeks to develop a statistical methodology that is appropriate and scientifically correct, using an automated statistical system for audits in government regulatory authorities. The study makes use of Mathematica to provide guidelines on how to develop and use an automated statistical audit system. A comprehensive literature study was conducted using existing secondary sources. A quantitative research paradigm was adopted for this study to empirically assess the demographic characteristics of tenants of Social Housing Estates and their perceptions of the rental units they inhabit. More specifically, a descriptive study was undertaken. Furthermore, a sample was selected by means of convenience sampling for a case study on the Social Housing Regulatory Authority (SHRA) to assess the respondents’ biographical information. From this sample, a pilot study was conducted investigating the general perceptions of the respondents regarding the physical conditions and quality of their units. The technical development of an automated statistical audit system was discussed. This process involved the development and use of a questionnaire design tool, statistical analysis and reporting, and how Mathematica software served as a platform for developing the system. 
The findings of this study provide insights into how government regulatory authorities can best utilise automated statistical audits for regulation purposes; this was achieved by developing an automated statistical audit system for government regulatory authorities. It is hoped that the findings of this study will provide government regulatory authorities with practical suggestions or solutions regarding the generation of official statistics for regulatory purposes, and that the suggestions for future research will inspire future researchers to further investigate automated statistical audit systems, statistical analysis, automated questionnaire development, and government regulatory authorities individually.
- Full Text:
- Date Issued: 2015