Guo, X., Fang, G., Feng, H., & Zhang, R. (2024). Multi-Layer Perceptron for Predicting Galaxy Parameters (MLP-GaP): stellar masses and star formation rates. Research in Astronomy and Astrophysics, X(XX), 000–000.
This study develops a machine-learning tool, MLP-GaP, to efficiently and accurately predict galaxy stellar masses (M⋆) and star formation rates (SFRs) from multi-band photometric data, addressing the limitations of traditional spectral energy distribution (SED) fitting techniques in handling large datasets.
The researchers trained and tested MLP-GaP on a mock dataset of 120,000 galaxies generated with the SED-fitting code CIGALE. Each entry in the dataset included a redshift, 9-band magnitudes with their associated errors, a stellar mass, and an SFR. MLP-GaP itself is a 10-layer multi-layer perceptron trained with a segmented training method, the Huber loss function, and the Adam optimizer. Its performance was evaluated on a separate testing dataset by comparing its predictions both to the reference values and to CIGALE's own estimates. MLP-GaP was then applied to a real dataset of 288,809 galaxies to demonstrate its real-world applicability.
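To make the training setup concrete, below is a minimal PyTorch sketch of the kind of network described above. The paper specifies a 10-layer MLP with 19 inputs (redshift plus 9 magnitudes and their 9 errors), two outputs (log M⋆ and log SFR), a Huber loss, and the Adam optimizer; the layer widths, ReLU activations, learning rate, and batch setup here are assumptions, and the paper's segmented training schedule is not reproduced.

import torch
import torch.nn as nn

class MLPGaP(nn.Module):
    """10-layer MLP mapping photometric features to (log M*, log SFR).

    The hidden width of 128 is an assumption; only the depth (10 linear
    layers), input size (19), and output size (2) come from the paper.
    """
    def __init__(self, n_in: int = 19, n_out: int = 2, width: int = 128):
        super().__init__()
        layers = []
        dims = [n_in] + [width] * 9  # 9 hidden layers + output layer = 10
        for d_in, d_out in zip(dims[:-1], dims[1:]):
            layers += [nn.Linear(d_in, d_out), nn.ReLU()]
        layers.append(nn.Linear(width, n_out))
        self.net = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def train_step(model, batch_x, batch_y, optimizer, loss_fn):
    """One optimization step: Huber loss between predictions and targets."""
    optimizer.zero_grad()
    loss = loss_fn(model(batch_x), batch_y)
    loss.backward()
    optimizer.step()
    return loss.item()

model = MLPGaP()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # lr is assumed
loss_fn = nn.HuberLoss(delta=1.0)  # less sensitive to outlier galaxies than MSE

# Dummy batch standing in for (redshift, 9 mags, 9 mag errors) features.
x = torch.randn(64, 19)
y = torch.randn(64, 2)  # targets: log stellar mass, log SFR
print(train_step(model, x, y, optimizer, loss_fn))

In the paper, quality is then judged on the held-out test set by comparing these predicted values against the reference values and against CIGALE's estimates; the Huber loss is a natural choice here because it behaves like a mean-squared error for small residuals while damping the influence of catastrophically mis-predicted galaxies.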
MLP-GaP presents a robust and efficient alternative to traditional SED fitting techniques for predicting galaxy stellar masses and SFRs from photometric data. Its high accuracy, computational efficiency, and consistency with established methods make it particularly well-suited for analyzing the massive datasets expected from future large-scale sky surveys.
This research contributes a practical tool for analyzing large-scale galaxy surveys. MLP-GaP's efficiency and accuracy will enable astronomers to extract information about galaxy properties and evolution from the vast amounts of data generated by upcoming surveys such as Euclid, LSST, and CSST.
While MLP-GaP shows promise, its reliance on mock datasets for training may introduce discrepancies when the model is applied to real galaxies; future research should focus on closing this gap between simulated and observed data.
Source: https://arxiv.org/pdf/2411.00333.pdf