A Systematic Visualisation Framework for Radio-Imaging Pipelines
- Authors: Andati, Lexy Acherwa Livoyi
- Date: 2021-04
- Subjects: Radio interferometers , Radio astronomy -- Data processing , Radio astronomy -- Data processing -- Software , Jupyter
- Language: English
- Type: thesis , text , Masters , MSc
- Identifier: http://hdl.handle.net/10962/177338 , vital:42812
- Description: Pipelines for calibration and imaging of radio interferometric data produce many intermediate images and other data products (gain tables, etc.). These often contain valuable information about the quality of the data and the calibration, and can provide the user with valuable insights, if only visualised in the right way. However, the deluge of data that we’re experiencing with modern instruments means that most of these products are never looked at, and only the final images and data products are examined. Furthermore, the variety of imaging algorithms currently available, and the range of their options, means that very different results can be produced from the same set of original data. Proper understanding of this requires a systematic comparison that can be carried out both by individual users locally, and by the community globally. We address both problems by developing a systematic visualisation framework based around Jupyter notebooks, enriched with interactive plots based on the Bokeh and Datashader visualisation libraries. , Thesis (MSc) -- Faculty of Science, Department of Physics and Electronics, 2021
- Full Text:
- Date Issued: 2021-04
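The abstract above describes interactive plots built on Datashader, whose core pipeline is to aggregate many points onto a raster canvas and then shade each bin by its count. A minimal pure-Python sketch of that aggregate-then-shade idea follows; the function names, canvas size, and sample points are invented for illustration and are not taken from the thesis.

```python
# A pure-Python sketch of the aggregate-then-shade idea behind Datashader:
# rasterise many (u, v)-like points onto a small canvas, then map each
# bin's count to a 0-255 display intensity. Names and sizes are illustrative.

def aggregate(points, width, height, extent):
    """Count the points falling into each cell of a width x height grid."""
    (xmin, xmax), (ymin, ymax) = extent
    grid = [[0] * width for _ in range(height)]
    for x, y in points:
        if xmin <= x < xmax and ymin <= y < ymax:
            col = int((x - xmin) / (xmax - xmin) * width)
            row = int((y - ymin) / (ymax - ymin) * height)
            grid[row][col] += 1
    return grid

def shade(grid):
    """Map counts to 0-255 intensities (linear; Datashader also offers log/eq-hist)."""
    peak = max(max(row) for row in grid) or 1
    return [[round(255 * c / peak) for c in row] for row in grid]

points = [(0.1, 0.1), (0.1, 0.12), (0.8, 0.9)]
img = shade(aggregate(points, width=4, height=4, extent=((0.0, 1.0), (0.0, 1.0))))
```

In Datashader itself the equivalent steps are `Canvas.points` followed by `tf.shade`, operating on dataframes of millions of rows rather than Python lists.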
Accelerated implementations of the RIME for DDE calibration and source modelling
- Authors: Van Staden, Joshua
- Date: 2021
- Subjects: Radio astronomy , Radio interferometers , Radio interferometers -- Calibration , Radio astronomy -- Data processing , Radio interferometers -- Data processing , Radio interferometers -- Calibration -- Data processing
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/172422 , vital:42199
- Description: Second- and third-generation calibration methods filter out subtle effects in interferometer data, and therefore yield significantly higher dynamic ranges. The basis of these calibration techniques relies on building a model of the sky and corrupting it with models of the effects acting on the sources. The sensitivities of modern instruments call for more elaborate models to capture the level of detail that is required to achieve accurate calibration. This thesis implements two types of models to be used in second- and third-generation calibration. The first model implemented is shapelets, which can be used to model radio source morphologies directly in uv space. The second model implemented is Zernike polynomials, which can be used to represent the primary beam of the antenna. We implement these models in the CODEX-AFRICANUS package and provide a set of unit tests for each model. Additionally, we compare our implementations against other methods of representing these objects and instrumental effects, namely NIFTY-GRIDDER against shapelets and a FITS-interpolation method against the Zernike polynomials. We find that, to achieve sufficient accuracy, our implementation of the shapelet model has a higher runtime than that of the NIFTY-GRIDDER. However, the NIFTY-GRIDDER cannot simulate a component-based sky model while the shapelet model can. Additionally, the shapelet model is fully parametric, which allows for integration into a parameterised solver. We find that, while having a smaller memory footprint, our Zernike model has a greater computational complexity than that of the FITS-interpolated method. However, we find that the Zernike implementation has floating-point accuracy in its modelling, while the FITS-interpolated model loses some accuracy through the discretisation of the beam.
- Full Text:
- Date Issued: 2021
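The abstract above uses Zernike polynomials to represent the antenna primary beam. Their standard building block is the radial polynomial R_n^m(ρ), which the textbook closed form below evaluates; this is a generic sketch of that formula, not the CODEX-AFRICANUS implementation.

```python
# Hedged sketch: the textbook closed form for the Zernike radial
# polynomial R_n^m(rho), the building block of beam models like the
# one described above. Not taken from CODEX-AFRICANUS.
from math import factorial

def zernike_radial(n, m, rho):
    """Radial Zernike polynomial R_n^m(rho), 0 <= rho <= 1.

    R is identically zero when n - |m| is odd.
    """
    m = abs(m)
    if (n - m) % 2:
        return 0.0
    total = 0.0
    for k in range((n - m) // 2 + 1):
        coef = ((-1) ** k * factorial(n - k)
                / (factorial(k)
                   * factorial((n + m) // 2 - k)
                   * factorial((n - m) // 2 - k)))
        total += coef * rho ** (n - 2 * k)
    return total

# Defocus term: R_2^0(rho) = 2*rho**2 - 1
val = zernike_radial(2, 0, 0.5)  # 2*0.25 - 1 = -0.5
```

A full beam model multiplies each radial term by the matching azimuthal factor (cos or sin of m·θ) and fits the resulting coefficients to the measured beam.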
Machine learning methods for calibrating radio interferometric data
- Authors: Zitha, Simphiwe Nhlanhla
- Date: 2019
- Subjects: Calibration , Radio astronomy -- Data processing , Radio astronomy -- South Africa , Karoo Array Telescope (South Africa) , Radio telescopes -- South Africa , Common Astronomy Software Application (Computer software)
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/97096 , vital:31398
- Description: The applications of machine learning have created an opportunity to deal with complex problems currently encountered in radio astronomy data processing. Calibration is one of the most important data processing steps required to produce high dynamic range images. This process involves the determination of calibration parameters, both instrumental and astronomical, to correct the collected data. Typically, astronomers use a package such as Common Astronomy Software Applications (CASA) to compute the gain solutions based on regular observations of a known calibrator source. In this work we present applications of machine learning to first generation calibration (1GC), using the KAT-7 telescope environmental and pointing sensor data recorded during observations. Applying machine learning to 1GC, as opposed to calculating the gain solutions in CASA, has shown evidence of reducing computation, as well as accurately predicting the 1GC gain solutions representing the behaviour of the antenna during an observation. These methods are computationally less expensive; however, they have not fully learned to generalise in predicting accurate 1GC solutions by looking at environmental and pointing sensors. We call this multi-output regression model ZCal, which is based on random forest, decision tree, extremely randomized tree and K-nearest neighbour algorithms. The prediction error obtained during the testing of our model on testing data is approximately 0.01 < rmse < 0.09 for gain amplitude per antenna, and 0.2 rad < rmse < 0.5 rad for gain phase. This shows that the instrumental parameters used to train our model correlate more strongly with gain amplitude effects than with phase.
- Full Text:
- Date Issued: 2019
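The abstract above frames 1GC calibration as multi-output regression: sensor readings in, per-antenna gain solutions out. The toy k-nearest-neighbour regressor below illustrates that framing in pure Python; the feature names, training numbers, and predictions are invented for illustration, whereas the thesis trains ensemble tree methods and KNN on real KAT-7 sensor data.

```python
# Toy multi-output k-nearest-neighbour regression in the spirit of ZCal:
# predict per-antenna gain (amplitude, phase) from environmental/pointing
# sensor features. All data below is invented for illustration.

def knn_predict(train_X, train_y, x, k=2):
    """Average the target vectors of the k training points nearest to x."""
    nearest = sorted(
        range(len(train_X)),
        key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)),
    )[:k]
    n_out = len(train_y[0])
    return [sum(train_y[i][j] for i in nearest) / k for j in range(n_out)]

# Features: [temperature_C, wind_speed_mps]; targets: [gain_amp, gain_phase_rad]
train_X = [[20.0, 2.0], [21.0, 2.5], [30.0, 8.0]]
train_y = [[1.00, 0.10], [1.02, 0.12], [0.90, 0.40]]
pred = knn_predict(train_X, train_y, [20.5, 2.2], k=2)
```

A production version would replace this with scikit-learn estimators (random forests, extra trees, KNN), which handle many features and antennas and natively support multi-output targets.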