Design patterns and software techniques for large-scale, open and reproducible data reduction
- Authors: Molenaar, Gijs Jan
- Date: 2021
- Subjects: Radio astronomy -- Data processing , Radio astronomy -- Data processing -- Software , Radio astronomy -- South Africa , ASTRODECONV2019 dataset , Radio telescopes -- South Africa , KERN (Computer software)
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/172169 , vital:42172 , 10.21504/10962/172169
- Description: The preparation for the construction of the Square Kilometre Array, and the introduction of its operational precursors, such as LOFAR and MeerKAT, mark the beginning of an exciting era for astronomy. Impressive new data containing valuable science just waiting for discovery is already being generated, and these instruments will produce far more data than has ever been collected before. However, with every new instrument, data rates grow to unprecedented levels, requiring novel data-processing tools. In addition, creating science-grade data from the raw data still requires significant expert knowledge. The software used is often developed by a scientist who lacks formal training in software development, so the software rarely progresses beyond prototype quality. In the first chapter, we explore various organisational and technical approaches to address these issues by providing a historical overview of the development of radio astronomy pipelines since the inception of the field in the 1940s. There, the steps required to create a radio image are investigated. We used the lessons learned to identify patterns in the challenges experienced, and in the solutions created to address them over the years. The second chapter describes the mathematical foundations that are essential for radio imaging. In the third chapter, we discuss the production of the KERN Linux distribution, a set of software packages containing most radio astronomy software currently in use. Considerable effort was put into ensuring that the contained software installs properly, with all items side by side on the same system. Where required and possible, bugs and portability issues were fixed and reported to the upstream maintainers. The KERN project also has a website and issue tracker, where users can report bugs and maintainers can coordinate the packaging effort and new releases. 
The software packages can be used inside Docker and Singularity containers, enabling their installation on a wide variety of platforms. In the fourth and fifth chapters, we discuss methods and frameworks for combining the available data reduction tools into recomposable pipelines and introduce the Kliko specification and software. This framework was created to enable end-user astronomers to chain and containerise operations of software in KERN packages. Next, we discuss the Common Workflow Language (CommonWL), a similar but more advanced and mature pipeline framework invented by bioinformatics scientists. CommonWL is already supported by a wide range of tools, including schedulers, visualisers and editors. Consequently, a pipeline made with CommonWL can be deployed and manipulated with a wide range of tools. In the final chapter, we attempt something unconventional: applying a generative adversarial network based on deep learning techniques to perform the task of sky brightness reconstruction. Since deep learning methods often require a large number of training samples, we constructed a CommonWL simulation pipeline for creating dirty images and corresponding sky models. This simulated dataset has been made publicly available as the ASTRODECONV2019 dataset. It is shown that this method is useful for performing the restoration and matches the performance of a single clean cycle. In addition, we incorporated domain knowledge by adding the point spread function to the network and by utilising a custom loss function during training. Although it was not possible to improve on the cleaning performance of commonly used existing tools, the computational time performance of the approach looks very promising. We suggest that a smaller scope should be the starting point for further studies, and that optimising the training of the neural network could produce the desired results.
- Full Text:
- Date Issued: 2021
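The final chapter's idea of incorporating the point spread function (PSF) into the training objective can be illustrated with a small sketch. This is not the thesis's actual GAN or loss function; it is a hypothetical toy in NumPy, using the relation that a dirty image is the sky model convolved with the PSF, so a predicted sky can be penalised both in the sky domain and in the dirty-image domain.

```python
# Hypothetical sketch of a PSF-aware loss: a predicted sky model, convolved
# with the known PSF, should reproduce the observed dirty image. All arrays
# here are synthetic toys; the thesis trains a GAN on the ASTRODECONV2019
# dataset, not this exact code.
import numpy as np

def dirty_image(sky, psf):
    """Convolve a sky model with the PSF via FFT (circular convolution)."""
    return np.real(np.fft.ifft2(np.fft.fft2(sky) * np.fft.fft2(psf)))

def psf_aware_loss(pred_sky, true_sky, dirty, psf, weight=0.5):
    """Blend a pixel-wise sky error with a residual term in the dirty domain."""
    sky_term = np.mean((pred_sky - true_sky) ** 2)
    residual_term = np.mean((dirty_image(pred_sky, psf) - dirty) ** 2)
    return (1 - weight) * sky_term + weight * residual_term

# Toy example: a single point source and a minimal PSF kernel.
n = 32
sky = np.zeros((n, n)); sky[16, 16] = 1.0
psf = np.zeros((n, n)); psf[0, 0] = 1.0; psf[0, 1] = psf[1, 0] = 0.5
dirty = dirty_image(sky, psf)

# A perfect prediction drives both terms to zero.
assert psf_aware_loss(sky, sky, dirty, psf) == 0.0
print(psf_aware_loss(sky * 0.9, sky, dirty, psf))
```

In a real training loop the residual term injects domain knowledge: errors are weighted by how visible they would be in the observed dirty image, which is one way to read the thesis's "custom loss function".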
Machine learning methods for calibrating radio interferometric data
- Authors: Zitha, Simphiwe Nhlanhla
- Date: 2019
- Subjects: Calibration , Radio astronomy -- Data processing , Radio astronomy -- South Africa , Karoo Array Telescope (South Africa) , Radio telescopes -- South Africa , Common Astronomy Software Application (Computer software)
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/97096 , vital:31398
- Description: The applications of machine learning have created an opportunity to deal with complex problems currently encountered in radio astronomy data processing. Calibration is one of the most important data processing steps required to produce high dynamic range images. This process involves the determination of calibration parameters, both instrumental and astronomical, to correct the collected data. Typically, astronomers use a package such as Common Astronomy Software Applications (CASA) to compute the gain solutions based on regular observations of a known calibrator source. In this work we present applications of machine learning to first generation calibration (1GC), using the KAT-7 telescope environmental and pointing sensor data recorded during observations. Applying machine learning to 1GC, as opposed to calculating the gain solutions in CASA, has shown evidence of reducing computation while accurately predicting the 1GC gain solutions that represent the behaviour of the antenna during an observation. These methods are computationally less expensive; however, they have not yet fully generalised to predicting accurate 1GC solutions from environmental and pointing sensors alone. We call this multi-output regression model ZCal; it is based on random forest, decision tree, extremely randomized tree and K-nearest neighbor algorithms. The prediction error obtained when testing our model is 0.01 < RMSE < 0.09 for gain amplitude per antenna, and 0.2 rad < RMSE < 0.5 rad for gain phase. This shows that the instrumental parameters used to train our model correlate more strongly with gain amplitude effects than with phase.
- Full Text:
- Date Issued: 2019
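The ZCal approach described above, predicting per-antenna gain amplitude and phase from sensor readings, is a multi-output regression problem. The sketch below shows one way such a model could be set up with scikit-learn's `RandomForestRegressor`, which supports multi-output targets natively. The feature names and the synthetic data are invented for illustration; this is not the thesis's code or dataset.

```python
# Hypothetical sketch of a ZCal-style multi-output regressor: predict gain
# amplitude and gain phase (radians) from environmental/pointing sensors.
# The four features stand in for quantities like temperature, wind speed,
# azimuth and elevation; the data below is entirely synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

X = rng.uniform(size=(500, 4))
# Two targets per sample: gain amplitude and gain phase, with small noise.
amp = 1.0 + 0.05 * X[:, 0] - 0.02 * X[:, 1] + 0.01 * rng.normal(size=500)
phase = 0.3 * X[:, 2] - 0.2 * X[:, 3] + 0.05 * rng.normal(size=500)
y = np.column_stack([amp, phase])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single forest fits both outputs jointly.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
rmse_amp = mean_squared_error(y_test[:, 0], pred[:, 0]) ** 0.5
rmse_phase = mean_squared_error(y_test[:, 1], pred[:, 1]) ** 0.5
print(f"amplitude RMSE: {rmse_amp:.3f}, phase RMSE: {rmse_phase:.3f}")
```

The thesis ensembles several estimator families (random forest, decision trees, extremely randomized trees, K-nearest neighbours); swapping the estimator class in this sketch is enough to reproduce that comparison on real sensor data.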
Measuring the RFI environment of the South African SKA site
- Authors: Manners, Paul John
- Date: 2007
- Subjects: Radio telescopes , Radio telescopes -- South Africa , Radio astronomy , Radio astronomy -- South Africa , Square Kilometer Array (Spacecraft) , Radio -- Interference -- Measurement
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:5474 , http://hdl.handle.net/10962/d1005259 , Radio telescopes , Radio telescopes -- South Africa , Radio astronomy , Radio astronomy -- South Africa , Square Kilometer Array (Spacecraft) , Radio -- Interference -- Measurement
- Description: The Square Kilometre Array (SKA) Project is an international effort to build the world’s largest radio telescope. It will be 100 times more sensitive than any other radio telescope currently in existence and will consist of thousands of dishes placed at baselines of up to 3000 km. In addition to its increased sensitivity, it will operate over a very wide frequency range (the current specification is 100 MHz - 22 GHz) and will use frequency bands not primarily allocated to radio astronomy. Because of this, the telescope needs to be located at a site with low levels of radio frequency interference (RFI), which implies a site that is remote and far from human activity. In bidding to host the SKA, South Africa was required to conduct a 12-month RFI survey at its proposed site. Apart from this core site, where more than half of the SKA dishes may potentially be deployed, measurements at remote sites in Southern Africa were also required. To conduct measurements at these sites, three mobile measurement systems were designed and built by the South African SKA Project. The design considerations, implementation and RFI measurements recorded during this campaign are the focus of this dissertation.
- Full Text:
- Date Issued: 2007