
ModernBERT #2250

Answered by MaartenGr
atelierJVA asked this question in Q&A
Dec 22, 2024 · 2 comments · 3 replies

Thank you for sharing. It is a great foundational/base model! That said, we typically do not use base BERT-like models for creating the embeddings; we fine-tune them instead. This likely means we need to wait until someone fine-tunes it specifically using contrastive learning to create embeddings that are better suited to tasks like clustering.

That said, if you want to use it as is, you can load it in a Transformers pipeline and use it in BERTopic: https://maartengr.github.io/BERTopic/getting_started/embeddings/embeddings.html#hugging-face-transformers


Answer selected by atelierJVA