Some of the research projects I completed during my graduate studies.
1) Active Learning with Contextual Text Representations
As part of the final project of the NLP course, together with Uri Katz, we investigated the relation between contextual text representations (Average BERT and Sentence BERT) and the effectiveness of the active learning (AL) procedure. Our empirical study revealed a relation between the representation type and AL performance, driven by the representation's ability to separate the classes.
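To make the setup concrete, here is a minimal sketch of the pool-based AL loop with uncertainty sampling that this kind of study builds on; the random embeddings, classifier choice, and query budget below are illustrative stand-ins, not the project's actual configuration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 384))              # stand-in for fixed sentence embeddings
y = (X[:, 0] > 0).astype(int)                 # stand-in binary labels

labeled = list(rng.choice(len(X), size=20, replace=False))  # small seed set
pool = [i for i in range(len(X)) if i not in labeled]       # unlabeled pool

for rnd in range(10):                          # 10 query rounds of 10 points each
    clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    probs = clf.predict_proba(X[pool])
    uncertainty = 1.0 - probs.max(axis=1)      # uncertainty-sampling criterion
    queried = np.argsort(uncertainty)[-10:]    # most uncertain pool points
    for q in sorted(queried, reverse=True):    # pop from the back so indices stay valid
        labeled.append(pool.pop(q))
    print(f"round {rnd}: {len(labeled)} labeled, acc={clf.score(X, y):.3f}")
```

Since the loop runs over fixed embeddings, how well a representation separates the classes directly shapes which points look uncertain, which is the connection the project examined.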
More details can be found in the project's paper and code.
2) Besov Smoothness Analysis of Deep Learning Network Layers with Different Input (Word) Representations (September 2019)
As the final project of the Mathematical Foundations of ML course, together with Uri Katz, we evaluated the "clusterability" of deep learning classifier layers under different input representations (fastText, word2vec, one-hot).
We measured "clusterability" with the "weak-type" Besov smoothness index, which quantifies the geometry of the clustering in the feature space (Elisha & Dekel, 2017). We showed that the Besov smoothness index increases from layer to layer, and that more informative representations yield higher smoothness values in the last layers.
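The weak-type Besov index itself rests on the tree-based wavelet machinery of Elisha & Dekel (2017), which is too involved for a short snippet, so the sketch below swaps in the silhouette score as a simple stand-in clusterability proxy: it extracts each hidden layer's activations from a small scikit-learn MLP and scores how well the classes separate there. The dataset and architecture are illustrative assumptions only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import silhouette_score

# Toy multi-class dataset and a small MLP; both are illustrative choices.
X, y = make_classification(n_samples=500, n_features=20, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
mlp = MLPClassifier(hidden_layer_sizes=(64, 64, 64), max_iter=500,
                    random_state=0).fit(X, y)

# Forward pass by hand to collect each hidden layer's activations
# (coefs_[:-1] / intercepts_[:-1] skip the output layer).
act = X
for layer, (W, b) in enumerate(zip(mlp.coefs_[:-1], mlp.intercepts_[:-1]), start=1):
    act = np.maximum(act @ W + b, 0.0)   # ReLU, MLPClassifier's default activation
    # Silhouette score as a clusterability proxy (higher = better-separated classes);
    # in the project this slot was filled by the weak-type Besov smoothness index.
    print(f"layer {layer}: silhouette = {silhouette_score(act, y):.3f}")
```

Running the per-layer measurement like this is what lets one compare both the layer-to-layer trend and the effect of the input representation on the final layers.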
Further details can be found in the project's presentation and code.