A Machine Learning Approach for Dynamical Mass Measurements of Galaxy Clusters
Abstract
We present a modern machine learning approach for cluster dynamical mass measurements that improves on a conventional scaling relation by a factor of two. Different methods are tested against a mock cluster catalog constructed using halos with mass ≥ 10^14 M⊙ h^−1 from MultiDark's publicly available N-body MDPL halo catalog. In the conventional method, we use a standard M(σv) power-law scaling relation to infer cluster mass, M, from the line-of-sight (LOS) galaxy velocity dispersion, σv. The resulting fractional mass error distribution is broad, with width Δε ≈ 0.86 (68% scatter), and has extended high-error tails. The standard scaling relation can be simply enhanced by including higher-order moments of the LOS velocity distribution. Applying the kurtosis as a linear correction term to log(σv) reduces the width of the error distribution to Δε ≈ 0.74 (a 15% improvement). Machine learning can take full advantage of all the information in the velocity distribution. We employ the Support Distribution Machines (SDMs) algorithm, which learns from distributions of data to predict single values. SDMs trained and tested on the distribution of LOS velocities yield Δε ≈ 0.41 (a 52% improvement). Furthermore, the problematic tails of the mass error distribution are effectively eliminated.
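The conventional baseline described in the abstract, fitting a power-law M(σv) relation and measuring the scatter of the fractional mass error ε = (M_pred − M_true)/M_true, can be sketched as follows. This is a minimal illustration on synthetic data with an assumed M ∝ σv³ toy relation and arbitrary scatter; it is not the paper's MDPL mock catalog or its exact fitting procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mock "catalog": log10 cluster masses (in units of 1e14 Msun/h)
# and LOS velocity dispersions following a rough M ~ sigma_v^3 toy relation
# with lognormal scatter. Purely illustrative -- not the MDPL catalog.
log_m_true = rng.uniform(0.0, 1.5, size=500)              # log10(M / 1e14)
log_sigma = (log_m_true + 3.0 * np.log10(700.0)) / 3.0    # log10(sigma_v / km/s)
log_sigma += rng.normal(0.0, 0.05, size=log_m_true.size)  # intrinsic scatter

# Conventional method: fit the power law log M = a + b * log sigma_v
b, a = np.polyfit(log_sigma, log_m_true, 1)
log_m_pred = a + b * log_sigma

# Fractional mass error epsilon; the abstract quotes the 68% width
# of this distribution as the figure of merit.
eps = 10.0 ** (log_m_pred - log_m_true) - 1.0
lo, hi = np.percentile(eps, [16, 84])
print(f"fitted slope b = {b:.2f}, 68% error width = {hi - lo:.2f}")
```

The SDM approach replaces this single summary statistic (σv, optionally corrected by kurtosis) with the full LOS velocity distribution as the regression input, which is where the quoted 52% improvement comes from.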
Astrophysical Journal (forthcoming).