- Title
- Self-attentive vision in evolutionary robotics
- Creator
- Botha, Bouwer
- Subject
- Evolutionary robotics
- Subject
- Robotics
- Subject
- Neural networks (Computer science)
- Date Issued
- 2024-04
- Date
- 2024-04
- Type
- Master's theses
- Type
- text
- Identifier
- http://hdl.handle.net/10948/63628
- Identifier
- vital:73566
- Description
- The autonomy of a robot refers to its ability to achieve a task in an environment with minimal human supervision. This may require autonomous solutions to be able to perceive their environment to inform their decisions. An inexpensive and highly informative way that robots can perceive the environment is through vision. The autonomy of a robot is reliant on the quality of the robotic controller. These controllers are the software interface between the robot and the environment, determining the actions of the robot based on the perceived environment. Controllers are typically created using manual programming techniques, which become progressively more challenging as the complexity of both the robot and the task increases. An alternative to manual programming is the use of machine learning techniques such as those used in Evolutionary Robotics (ER). ER is an area of research that investigates the automatic creation of controllers. Instead of manually programming a controller, an Evolutionary Algorithm can be used to evolve the controller through repeated interactions with the task environment. Employing the ER approach for camera-based controllers, however, has presented problems for conventional ER methods. Firstly, existing architectures that are capable of automatically processing images have a large number of trainable parameters. These architectures encumber the evolutionary process due to the large search space of possible configurations. Secondly, the evolution of complex controllers needs to be done in simulation, which requires either (a) the construction of a photo-realistic virtual environment with accurate lighting, texturing and models, or (b) a potential reduction in controller capability by simplifying the problem via image preprocessing. Any controller trained in simulation also raises the inherent concern that it may not transfer to the real world. This study proposes a new technique for the evolution of camera-based controllers in ER that aims to address the highlighted problems. The use of self-attention is proposed to facilitate the evolution of compact controllers that evolve specialized sets of task-relevant features from unprocessed images by focusing on important image regions. Furthermore, a new neural network-based simulation approach, Generative Neuro-Augmented Vision (GNAV), is proposed to simplify simulation construction. GNAV makes use of random data collected in a simple virtual environment and the real world. A neural network is trained to overcome the visual discrepancies between these two environments. GNAV enables a controller to be trained in a simple simulated environment that appears similar to the real environment, while requiring minimal human supervision. The capabilities of the new technique were demonstrated using a series of real-world navigation tasks based on camera vision. Controllers utilizing the proposed self-attention mechanism were trained using GNAV and transferred to a real camera-equipped robot. The controllers were shown to be able to perform the same tasks in the real world.
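- The following is a minimal, generic sketch of patch-based self-attention used to rank image regions by importance, included only to illustrate the kind of mechanism the abstract refers to; it is not the architecture evolved in the thesis, and every name, shape and parameter choice below (e.g. `extract_patches`, `top_k_patches`, an 8x8 patch size, the projection dimension) is an assumption made for illustration.

```python
# Illustrative sketch only: generic patch self-attention for ranking image
# regions. All names, shapes and parameters are assumptions, not the thesis's.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def extract_patches(image, patch=8, stride=4):
    """Slice a (H, W) grayscale image into flattened square patches."""
    h, w = image.shape
    patches, centers = [], []
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            patches.append(image[y:y + patch, x:x + patch].ravel())
            centers.append((y + patch // 2, x + patch // 2))
    return np.asarray(patches, dtype=np.float32), centers

def top_k_patches(image, w_q, w_k, k=10, patch=8, stride=4):
    """Score patches with query/key self-attention and keep the k most attended.

    w_q, w_k: (patch*patch, d) projection matrices; in an ER setting these few
    parameters would be part of the evolved controller.
    """
    p, centers = extract_patches(image, patch, stride)
    q, kk = p @ w_q, p @ w_k                            # project patches
    att = softmax(q @ kk.T / np.sqrt(w_q.shape[1]))     # (N, N) attention matrix
    importance = att.sum(axis=0)                        # total attention each patch receives
    best = np.argsort(importance)[::-1][:k]
    return [centers[i] for i in best]                   # patch centres passed on to the controller

# Example with a random image and random projections (as evolution might supply).
rng = np.random.default_rng(0)
img = rng.random((64, 64)).astype(np.float32)
w_q = rng.normal(size=(8 * 8, 4)).astype(np.float32)
w_k = rng.normal(size=(8 * 8, 4)).astype(np.float32)
print(top_k_patches(img, w_q, w_k, k=5))
```

- Because only the two small projection matrices are optimized, such a controller stays compact enough for an evolutionary search while still selecting task-relevant regions of an unprocessed image, which is the general trade-off the abstract describes.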
- Description
- Thesis (MSc) -- Faculty of Science, School of Computer Science, Mathematics, Physics and Statistics, 2024
- Format
- computer
- Format
- online resource
- Format
- application/pdf
- Format
- 1 online resource (210 pages)
- Publisher
- Nelson Mandela University
- Publisher
- Faculty of Science
- Language
- English
- Rights
- Nelson Mandela University
- Rights
- All Rights Reserved
- Rights
- Open Access
File | Description | Size | Format
---|---|---|---
SOURCE1 | Botha, B.pdf | 52 MB | Adobe Acrobat PDF