
# CoRT Pre-trained Backbones

The model requires KorSci-BERT and KorSci-ELECTRA for experiments. Both models were trained on Korean-language scientific research corpora.

KorSci-ELECTRA is free for anyone to download, but KorSci-BERT is not; you have to request download rights for it via the links below.

The pre-trained models were built in a TensorFlow 1.x environment, so you have to adapt the code to run in a TensorFlow 2.x environment.
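As a starting point, a TF1-style checkpoint can at least be inspected from TensorFlow 2 with `tf.train.load_checkpoint`. The following is a minimal sketch, assuming TensorFlow 2.x is installed and the checkpoint sits at the default path from the Directory Hierarchy section below; the BERT variable name in the final comment is hypothetical.

```python
import tensorflow as tf

# Minimal sketch (assumes TensorFlow 2.x and the default directory
# hierarchy described below): read a TF1-style checkpoint from TF2.
reader = tf.train.load_checkpoint("korscibert/model.ckpt-262500")

# List every variable stored in the checkpoint together with its shape,
# which helps when mapping the weights onto TF2/Keras layers.
for name, shape in reader.get_variable_to_shape_map().items():
    print(name, shape)

# Read a single tensor by name. This variable name is hypothetical;
# use the names printed by the loop above.
# embeddings = reader.get_tensor("bert/embeddings/word_embeddings")
```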

## Directory Hierarchy

The KorSci-BERT and KorSci-ELECTRA directories must be renamed to `korscibert` and `korscielectra`, respectively. The following directory hierarchy is the setup expected by the default config.

- korscibert
  - model.ckpt-262500.data-00000-of-00001
  - model.ckpt-262500.index
  - model.ckpt-262500.meta
  - vocab_kisti.txt
  - tokenization_kisti.py
- korscielectra
  - data
    - models
      - korsci_base
        - checkpoint
        - korsci_base.data-00000-of-00001
        - korsci_base.index
        - korsci_base.meta
    - vocab.txt
  - model
    - tokenization.py
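Because the default config depends on these exact paths, a quick sanity check can catch renaming mistakes early. This is a hypothetical helper, not part of the repository, assuming the layout listed above:

```python
from pathlib import Path

# Hypothetical sanity check (not part of this repository): verifies that
# the renamed backbone directories match the default hierarchy above.
EXPECTED_FILES = [
    "korscibert/model.ckpt-262500.index",
    "korscibert/model.ckpt-262500.meta",
    "korscibert/vocab_kisti.txt",
    "korscibert/tokenization_kisti.py",
    "korscielectra/data/models/korsci_base/checkpoint",
    "korscielectra/data/vocab.txt",
    "korscielectra/model/tokenization.py",
]

missing = [path for path in EXPECTED_FILES if not Path(path).exists()]
if missing:
    raise FileNotFoundError(f"Missing pre-trained backbone files: {missing}")
print("Directory hierarchy matches the default config.")
```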