A photometric machine-learning method to infer stellar metallicity


Abstract

Following its formation, a star’s metal content is one of the few factors that can significantly alter its evolution. Measurements of stellar metallicity ([Fe/H]) typically require a spectrum, but spectroscopic surveys are limited to a few × 10⁶ targets; photometric surveys, on the other hand, have detected > 10⁹ stars. I present a new machine-learning method to predict [Fe/H] from photometric colors measured by the Sloan Digital Sky Survey (SDSS). The training set consists of ∼120,000 stars with SDSS photometry and reliable [Fe/H] measurements from the SEGUE Stellar Parameters Pipeline (SSPP). For bright stars (g′ ≤ 18 mag) with 4500 K ≤ Teff ≤ 7000 K, corresponding to those with the most reliable SSPP estimates, I find that the model predicts [Fe/H] values with a root-mean-squared error (RMSE) of ∼0.27 dex. The RMSE from this machine-learning method is similar to the scatter in [Fe/H] measurements from low-resolution spectra.
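The abstract does not specify the regression algorithm, so the sketch below is only illustrative of the setup it describes: photometric SDSS colors as features, SSPP [Fe/H] values as training labels, and RMSE on held-out stars as the evaluation metric. The random-forest estimator, the file name, and the column names are assumptions for the example, not the paper's actual pipeline.

```python
# Illustrative sketch (not the paper's exact method): regress [Fe/H] on SDSS
# colors and report the held-out RMSE, the statistic quoted in the abstract.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Hypothetical training catalog: SDSS photometry cross-matched with SSPP [Fe/H].
catalog = pd.read_csv("sdss_sspp_training_set.csv")

# Adjacent-band colors are one common choice of photometric features.
color_columns = ["u_g", "g_r", "r_i", "i_z"]
X = catalog[color_columns].values
y = catalog["feh"].values  # SSPP [Fe/H] labels, in dex

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Any regressor could stand in here; a random forest is a simple baseline.
model = RandomForestRegressor(n_estimators=300, n_jobs=-1, random_state=0)
model.fit(X_train, y_train)

# Root-mean-squared error on held-out stars, comparable in spirit to the
# ∼0.27 dex figure reported for bright stars in the abstract.
rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"RMSE([Fe/H]) = {rmse:.2f} dex")
```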

Citation (APA)

Miller, A. A. (2015). A photometric machine-learning method to infer stellar metallicity. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8999, pp. 231–236). Springer Verlag. https://doi.org/10.1007/978-3-319-16313-0_17
