well, maybe it’s time to plug a pre-print. this is a follow-up to the conference paper published at ICASSP in 2020. tl;dr: more data, multi-task training, and we explain how it works (!). it’s not the best, but it’s good! arxiv.org/abs/2206….
