Third generation calibrations for MeerKAT observation of the Saraswati Supercluster
- Authors: Kincaid, Robert Daniel
- Date: 2022-10-14
- Subjects: Square Kilometre Array (Project) , Superclusters , Saraswati Supercluster , Radio astronomy , MeerKAT , Calibration
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/362916 , vital:65374
- Description: The international collaboration of the Square Kilometre Array (SKA), one of the largest and most challenging science projects of the 21st century, will bring a revolution in radio astronomy in terms of sensitivity and resolution. The recent commissioning of several new radio instruments, combined with subsequent developments in calibration and imaging techniques, has dramatically advanced the field over the past few years, enhancing our knowledge of the radio universe. Various SKA pathfinders around the world have been developed (and more are planned for construction) that have laid a firm foundation for the SKA in terms of science, while also giving insight into the technological requirements needed to make the projected data outputs manageable. South Africa has recently built the new MeerKAT telescope, an SKA precursor forming an integral part of the SKA-mid component. The MeerKAT instrument has unprecedented sensitivity that caters for the science goals of the current and future SKA era. It is evident from MeerKAT and other precursors that the data produced by these instruments are significantly challenging to calibrate and image. Calibration-related artefacts intrinsic to bright sources are a major concern since they limit the Dynamic Range (DR) and image fidelity of the resulting images and cause flux suppression of extended sources. Diffuse radio sources in galaxy clusters, in the form of halos, relics and, most recently, Mpc-scale bridges, are particularly good candidates for testing different approaches to calibration because of their diffuse nature combined with wide field of view (FoV) observations. , Thesis (MSc) -- Faculty of Science, Physics and Electronics, 2022
- Full Text:
- Date Issued: 2022-10-14
CubiCal: a fast radio interferometric calibration suite exploiting complex optimisation
- Authors: Kenyon, Jonathan
- Date: 2019
- Subjects: Interferometry , Radio astronomy , Python (Computer program language) , Square Kilometre Array (Project)
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/92341 , vital:30711
- Description: The advent of the Square Kilometre Array and its precursors marks the start of an exciting era for radio interferometry. However, with new instruments producing unprecedented quantities of data, many existing calibration algorithms and implementations will be hard-pressed to keep up. Fortunately, it has recently been shown that the radio interferometric calibration problem can be expressed concisely using the ideas of complex optimisation. The resulting framework exposes properties of the calibration problem which can be exploited to accelerate traditional non-linear least-squares algorithms. We extend the existing work on the topic by considering the more general problem of calibrating a Jones chain: the product of several unknown gain terms. We also derive specialised solvers for performing phase-only, delay and pointing-error calibration. In doing so, we devise a method for determining update rules for arbitrary, real-valued parametrisations of a complex gain. The solvers are implemented in an optimised Python package called CubiCal. CubiCal makes use of Cython to generate fast C and C++ routines for performing computationally demanding tasks, whilst leveraging multiprocessing and shared memory to take advantage of modern, parallel hardware. The package is fully compatible with the measurement set, the most common format for interferometer data, and is well integrated with Montblanc, a third-party package which implements optimised model visibility prediction. CubiCal's calibration routines are applied successfully to both simulated and real data for the field surrounding the source 3C147. These tests include direction-independent and direction-dependent calibration, as well as tests of the specialised solvers. Finally, we conduct extensive performance benchmarks and verify that CubiCal convincingly outperforms its most comparable competitor.
- Full Text:
- Date Issued: 2019
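
The snippet below is a minimal, self-contained sketch of the kind of gain solve that the complex-optimisation view described in the record above leads to; it is not CubiCal's API or implementation. It performs a damped, closed-form per-antenna update (a StefCal-style diagonal-gain solve) for a single time/frequency slot, and the function name, array shapes and parameters are illustrative assumptions only.

```python
import numpy as np

def solve_diag_gains(vis, model, n_iter=50, tol=1e-8):
    """Estimate one complex gain per antenna so that vis[p, q] ≈ g[p] * model[p, q] * conj(g[q]).

    vis, model: (n_ant, n_ant) complex Hermitian arrays of observed and
    predicted visibilities for a single time/frequency slot.
    """
    n_ant = vis.shape[0]
    g = np.ones(n_ant, dtype=complex)               # start from unit gains
    for _ in range(n_iter):
        g_old = g.copy()
        gm = model * np.conj(g_old)[np.newaxis, :]  # gm[p, q] = model[p, q] * conj(g[q])
        num = np.sum(vis * np.conj(gm), axis=1)     # per-antenna least-squares numerator
        den = np.sum(np.abs(gm) ** 2, axis=1)       # ... and denominator
        g = 0.5 * (num / den + g_old)               # damped update to aid convergence
        if np.max(np.abs(g - g_old)) < tol:
            break
    return g

# Hypothetical usage: simulate visibilities from known gains and recover them.
rng = np.random.default_rng(42)
n_ant = 7
m = rng.standard_normal((n_ant, n_ant)) + 1j * rng.standard_normal((n_ant, n_ant))
model = m + m.conj().T                               # Hermitian "sky" visibilities
true_g = np.exp(1j * rng.uniform(-0.5, 0.5, n_ant))  # phase-only true gains
vis = true_g[:, None] * model * np.conj(true_g)[None, :]
est_g = solve_diag_gains(vis, model)
# est_g agrees with true_g up to a single overall phase, the usual calibration ambiguity.
```

A full calibration package additionally handles 2x2 Jones matrices, chains of gain terms, flagged data and weighting; the sketch only illustrates the per-antenna closed-form update that such frameworks exploit for speed.
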
Advanced radio interferometric simulation and data reduction techniques
- Authors: Makhathini, Sphesihle
- Date: 2018
- Subjects: Interferometry , Radio interferometers , Algorithms , Radio telescopes , Square Kilometre Array (Project) , Very Large Array (Observatory : N.M.) , Radio astronomy
- Language: English
- Type: text , Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/57348 , vital:26875
- Description: This work shows how legacy and novel radio interferometry software packages and algorithms can be combined to produce high-quality reductions from modern telescopes, as well as end-to-end simulations for upcoming instruments such as the Square Kilometre Array (SKA) and its pathfinders. We first use a MeqTrees-based simulation framework to quantify how artefacts due to direction-dependent effects accumulate with time, and the consequences of this accumulation when observing the same field multiple times in order to reach the survey depth. Our simulations suggest that a survey like LADUMA (Looking at the Distant Universe with the MeerKAT Array), which aims to achieve a survey depth of 16 µJy/beam in a 72 kHz channel at 1.42 GHz by observing the same field for 1000 hours, will be able to reach its target depth in the presence of these artefacts. We also present stimela, a system-agnostic scripting framework for simulating, processing and imaging radio interferometric data. This framework is then used to write an end-to-end simulation pipeline in order to quantify the resolution and sensitivity of the SKA1-MID telescope (the first phase of the SKA mid-frequency telescope) as a function of frequency, as well as the scale-dependent sensitivity of the telescope. Finally, a stimela-based reduction pipeline is used to process data of the field around the source 3C147, taken with the Karl G. Jansky Very Large Array (VLA). The reconstructed image from this reduction has a typical 1σ noise level of 2.87 µJy/beam and, consequently, a dynamic range of 8×10⁶:1, given the 22.58 Jy/beam flux density of the source 3C147.
- Full Text:
- Date Issued: 2018
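
As a quick sanity check of the dynamic range quoted in the record above, the usual definition DR = peak flux density / image rms noise reproduces the stated figure. The variable names below are illustrative only.

```python
# Dynamic range as peak flux density over image rms noise (assumed definition).
peak_jy_per_beam = 22.58      # flux density of 3C147, from the record above
rms_jy_per_beam = 2.87e-6     # 1-sigma image noise, i.e. 2.87 uJy/beam
dynamic_range = peak_jy_per_beam / rms_jy_per_beam
print(f"DR ≈ {dynamic_range:.2e} : 1")   # ≈ 7.87e+06, consistent with the quoted ~8×10⁶:1
```
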