From Images to Features: Unbiased Morphology Classification via Variational Auto-Encoders and Domain Adaptation

Quanfeng Xu, Shiyin Shen, Rafael S. de Souza, Mi Chen, Renhao Ye, Yumei She, Zhu Chen, Emille E. O. Ishida, Alberto Krone-Martins, Rupesh Durgesh

Research output: Contribution to journal › Article › peer-review


Abstract

We present a novel approach for the dimensionality reduction of galaxy images by leveraging a combination of variational auto-encoders (VAE) and domain adaptation (DA). We demonstrate the effectiveness of this approach using a sample of low-redshift galaxies with detailed morphological type labels from the Galaxy-Zoo DECaLS project. We show that 40-dimensional latent variables can effectively reproduce most morphological features in galaxy images. To further validate the effectiveness of our approach, we utilised a classical random forest (RF) classifier on the 40-dimensional latent variables to make detailed morphology feature classifications. This approach performs similarly to a direct neural network application on galaxy images. We further enhance our model by tuning the VAE network via DA using galaxies in the overlapping footprint of DECaLS and BASS+MzLS, enabling the unbiased application of our model to galaxy images in both surveys. We observed that noise suppression during DA led to even better morphological feature extraction and classification performance. Overall, this combination of VAE and DA can be applied to achieve image dimensionality reduction, defect image identification, and morphology classification in large optical surveys.
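The sketch below illustrates the pipeline structure the abstract describes: a convolutional VAE compresses galaxy images into a 40-dimensional latent space, and a classical random forest is then trained on those latents for morphology classification. It is a minimal, hypothetical reconstruction, not the authors' released code; the image size (3×64×64), network widths, loss weighting, and forest size are illustrative assumptions, and the DA fine-tuning step is omitted.

```python
# Hedged sketch of a VAE-latents + random-forest pipeline (assumed architecture,
# not the paper's implementation). Images are assumed to be 3x64x64 tensors.
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

LATENT_DIM = 40  # latent dimensionality reported in the abstract


class ConvVAE(nn.Module):
    def __init__(self, latent_dim=LATENT_DIM):
        super().__init__()
        # Encoder: 3x64x64 image -> flattened feature map -> (mu, log_var)
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # -> 16x16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # -> 8x8
            nn.Flatten(),
        )
        self.fc_mu = nn.Linear(128 * 8 * 8, latent_dim)
        self.fc_logvar = nn.Linear(128 * 8 * 8, latent_dim)
        # Decoder: latent vector -> reconstructed 3x64x64 image
        self.fc_dec = nn.Linear(latent_dim, 128 * 8 * 8)
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def encode(self, x):
        h = self.encoder(x)
        return self.fc_mu(h), self.fc_logvar(h)

    def reparameterize(self, mu, logvar):
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        recon = self.decoder(self.fc_dec(z).view(-1, 128, 8, 8))
        return recon, mu, logvar


def vae_loss(recon, x, mu, logvar):
    # Reconstruction term plus KL divergence to a standard-normal prior
    recon_loss = nn.functional.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kl


def classify_latents(vae, images, labels):
    # After VAE training, the latent means act as compact image features;
    # a random forest is fit on them for detailed morphology labels.
    vae.eval()
    with torch.no_grad():
        mu, _ = vae.encode(images)  # shape (N, 40)
    rf = RandomForestClassifier(n_estimators=300)
    rf.fit(mu.numpy(), labels)
    return rf
```

In this reading of the abstract, the same trained encoder can be reused on BASS+MzLS images after DA fine-tuning, so the random forest trained on DECaLS latents applies to both surveys without retraining.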
Original language: English
Pages (from-to): 6391–6400
Number of pages: 10
Journal: Monthly Notices of the Royal Astronomical Society
Volume: 526
Issue number: 4
Early online date: 17 Oct 2023
DOIs
Publication status: E-pub ahead of print - 17 Oct 2023

Keywords

  • astro-ph.GA
  • cs.LG

