Code for Word-Class Embeddings (WCEs), a form of supervised embeddings especially suited for multiclass text classification. WCEs are meant to be used as extensions (i.e., by concatenation) to pre-trained embeddings (e.g., GloVe or word2vec) in order to improve the performance of neural classifiers.
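As a rough illustration of the idea (not the repo's actual implementation), the sketch below builds a word-class matrix by aggregating document-term weights per class via a dot product, L2-normalizes each word's row, and concatenates it to a hypothetical pre-trained matrix `U`; all variable names and the correlation choice here are illustrative assumptions.

```python
import numpy as np

def word_class_embeddings(X, Y):
    """Toy word-class embedding: X is a (n_docs, vocab) weight matrix
    (e.g., tf-idf), Y a (n_docs, n_classes) binary label matrix.
    Returns a (vocab, n_classes) matrix of per-class word associations,
    with each word's row L2-normalized."""
    W = X.T @ Y  # aggregate each term's weight mass per class
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    norms[norms == 0] = 1.0  # leave all-zero rows untouched
    return W / norms

# Hypothetical data; in practice X/Y come from your dataset and U is
# loaded from GloVe or word2vec for the same vocabulary.
rng = np.random.default_rng(0)
n_docs, vocab, n_classes, d = 6, 10, 3, 5
X = rng.random((n_docs, vocab))
Y = (rng.random((n_docs, n_classes)) > 0.5).astype(float)
U = rng.standard_normal((vocab, d))  # stand-in pre-trained embeddings

E = word_class_embeddings(X, Y)
extended = np.hstack([U, E])  # WCEs extend the pre-trained matrix
print(extended.shape)  # (10, 8)
```

The concatenated matrix can then initialize the embedding layer of a neural classifier in place of the pre-trained matrix alone.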
Code to reproduce the experiments reported in the paper "Word-Class Embeddings for Multiclass Text Classification" (published in Data Mining and Knowledge Discovery, 2021 -- a preprint is available here). This repo also includes a script to extract the word-class embedding matrix from any dataset so that you can use it in your own model.