MoËT is a novel model based on Mixture of Experts, consisting of decision tree experts and a generalized linear model gating function. It is more expressive than a standard decision tree and can be applied to real-world supervised learning problems, on which it outperforms other verifiable machine learning models.
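
To make the architecture concrete, below is a minimal sketch of such a model in Python, assuming scikit-learn is available: a multinomial logistic regression plays the role of the generalized linear gating function and routes each input to one shallow decision tree expert. The class name `MoETSketch`, the KMeans-based initialization, and all hyperparameters are illustrative assumptions; this is not the paper's actual training procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier


class MoETSketch:
    """Toy Mixture of Experts: a softmax-regression gate over decision trees."""

    def __init__(self, n_experts=3, max_depth=4, random_state=0):
        self.n_experts = n_experts
        self.max_depth = max_depth
        self.random_state = random_state

    def fit(self, X, y):
        # Initialize expert responsibilities by clustering the input space
        # (an illustrative choice, not a prescribed training procedure).
        regions = KMeans(n_clusters=self.n_experts, n_init=10,
                         random_state=self.random_state).fit_predict(X)
        # Gating function: a generalized linear model (multinomial logistic
        # regression) mapping each input to a distribution over experts.
        self.gate_ = LogisticRegression(max_iter=1000).fit(X, regions)
        # Experts: one shallow decision tree per region of the input space.
        self.experts_ = []
        for k in range(self.n_experts):
            tree = DecisionTreeClassifier(max_depth=self.max_depth,
                                          random_state=self.random_state)
            tree.fit(X[regions == k], y[regions == k])
            self.experts_.append(tree)
        return self

    def predict(self, X):
        # Hard gating: route each input to its single most probable expert,
        # so every prediction is explained by one small tree.
        chosen = self.gate_.predict(X)
        out = np.empty(len(X), dtype=self.experts_[0].classes_.dtype)
        for k, tree in enumerate(self.experts_):
            mask = chosen == k
            if mask.any():
                out[mask] = tree.predict(X[mask])
        return out


if __name__ == "__main__":
    X, y = make_classification(n_samples=500, n_features=8, random_state=0)
    model = MoETSketch(n_experts=3).fit(X, y)
    print("train accuracy:", (model.predict(X) == y).mean())
```

Note that hard routing through the gate keeps each prediction attributable to a single small tree, which is the property that motivates using such models in verifiable settings.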