This repository has been archived by the owner on Aug 21, 2024. It is now read-only.

Merge pull request #15 from monologg/release/0.6.0
Release/0.6.0
monologg authored Aug 20, 2024
2 parents 7628bce + 03513a9 commit d1e2a5a
Showing 6 changed files with 15 additions and 16 deletions.
8 changes: 4 additions & 4 deletions .github/workflows/main.yml
@@ -9,10 +9,10 @@ jobs:
steps:
- uses: actions/checkout@v2

- - name: Set up Python 3.7
+ - name: Set up Python 3.8
    uses: actions/setup-python@v2
    with:
-     python-version: 3.7
+     python-version: 3.8

- name: Cache pip
uses: actions/cache@v2
@@ -39,10 +39,10 @@ jobs:
steps:
- uses: actions/checkout@v2

- - name: Set up Python 3.7
+ - name: Set up Python 3.8
    uses: actions/setup-python@v2
    with:
-     python-version: 3.7
+     python-version: 3.8

- name: Cache pip
uses: actions/cache@v2
2 changes: 1 addition & 1 deletion README.md
@@ -103,7 +103,7 @@ Embedding(8002, 768, padding_idx=1)

- A Python library that wraps `tokenization_kobert.py`
- Provides KoBERT and DistilKoBERT in the Huggingface Transformers library format
- - In `v0.5.1`, `transformers v3.0` or higher is installed by default. (Works without issues up to `transformers v4.0`)
+ - From `v0.5.1` onward, `transformers v3.0` or higher is installed by default. (Works without issues up to `transformers v4.0`)

### Install Kobert-Transformers

8 changes: 4 additions & 4 deletions kobert_transformers/load_model.py
@@ -2,24 +2,24 @@


def get_kobert_model():
-    """ Return BertModel for Kobert """
+    """Return BertModel for Kobert"""
model = BertModel.from_pretrained("monologg/kobert")
return model


def get_kobert_lm():
-    """ Return BertForMaskedLM for Kobert """
+    """Return BertForMaskedLM for Kobert"""
model = BertForMaskedLM.from_pretrained("monologg/kobert-lm")
return model


def get_distilkobert_model():
-    """ Return DistilBertModel for DistilKobert """
+    """Return DistilBertModel for DistilKobert"""
model = DistilBertModel.from_pretrained("monologg/distilkobert")
return model


def get_distilkobert_lm():
-    """ Return DistilBertForMaskedLM for DistilKobert """
+    """Return DistilBertForMaskedLM for DistilKobert"""
model = DistilBertForMaskedLM.from_pretrained("monologg/distilkobert")
return model
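Each loader above simply forwards to `from_pretrained`, so every call re-instantiates (and may re-download) the model. A caller that needs the model repeatedly might memoize the loader. A minimal sketch of that pattern, using a hypothetical stand-in loader so it runs without downloading any weights (real code would call `kobert_transformers.load_model.get_kobert_model` instead):

```python
from functools import lru_cache

def load_kobert():
    # Stand-in for kobert_transformers.load_model.get_kobert_model();
    # a plain object is returned here so the sketch runs without
    # fetching pretrained weights from the Hugging Face Hub.
    return object()

@lru_cache(maxsize=None)
def get_kobert_model_cached():
    # The first call invokes the loader; later calls return the
    # same cached instance instead of rebuilding the model.
    return load_kobert()
```

With `lru_cache`, repeated calls return the identical model object, which avoids paying the load cost more than once per process.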
3 changes: 1 addition & 2 deletions kobert_transformers/tokenization_kobert.py
@@ -82,7 +82,6 @@ def __init__(
mask_token="[MASK]",
**kwargs,
):
-
# Build vocab
self.token2idx = dict()
self.idx2token = []
@@ -178,7 +177,7 @@ def _tokenize(self, text):
return new_pieces

def _convert_token_to_id(self, token):
-    """ Converts a token (str/unicode) in an id using the vocab. """
+    """Converts a token (str/unicode) in an id using the vocab."""
return self.token2idx.get(token, self.token2idx[self.unk_token])

def _convert_id_to_token(self, index):
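The `_convert_token_to_id` change is whitespace-only; the lookup itself is a plain dict `get` with an `[UNK]` fallback, so out-of-vocabulary tokens never raise `KeyError`. A self-contained sketch of that logic with a toy vocabulary (the real `KoBertTokenizer` builds `token2idx` from its vocab file):

```python
# Toy vocabulary; the real tokenizer populates token2idx from its vocab file.
unk_token = "[UNK]"
token2idx = {"[UNK]": 0, "[CLS]": 1, "[SEP]": 2, "안녕": 3}

def convert_token_to_id(token):
    # Known tokens map to their vocab id; unknown tokens fall back
    # to the [UNK] id instead of raising KeyError.
    return token2idx.get(token, token2idx[unk_token])
```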
2 changes: 1 addition & 1 deletion kobert_transformers/version.txt
@@ -1 +1 @@
- 0.5.1
+ 0.6.0
8 changes: 4 additions & 4 deletions requirements-dev.txt
@@ -1,7 +1,7 @@
# for clean code :)
- isort==5.7.0
- black==20.8b1
- flake8==3.8.4
+ isort==5.12.0
+ black==23.10.0
+ flake8==6.1.0

# for safe code :)
- pytest==6.2.1
+ pytest==7.4.2
