# Toxic-Comment-Classifier

This is a Toxic Comment Classifier model, which labels text according to the offensive attributes it exhibits (e.g. Insult, Obscene, Severe Toxicity).
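The repository does not show the model code in this excerpt, so here is a minimal sketch of how such a multi-label toxic-comment classifier could look, assuming a TF-IDF plus one-vs-rest logistic regression approach with scikit-learn. The label names and toy training comments are illustrative placeholders, not the repository's actual dataset or model.

```python
# Sketch of a multi-label toxic-comment classifier (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

# Hypothetical label set, mirroring the attributes named in the README.
LABELS = ["insult", "obscene", "severe_toxicity"]

# Toy training data: (comment, set of labels that apply to it).
train = [
    ("you are a complete idiot", {"insult"}),
    ("what obscene filth", {"obscene"}),
    ("you worthless scum, I despise you", {"severe_toxicity", "insult"}),
    ("thanks, this was really helpful", set()),
    ("great explanation, I appreciate it", set()),
]

texts = [comment for comment, _ in train]
mlb = MultiLabelBinarizer(classes=LABELS)
y = mlb.fit_transform([labels for _, labels in train])

# One binary classifier per label over shared TF-IDF features.
clf = make_pipeline(
    TfidfVectorizer(),
    OneVsRestClassifier(LogisticRegression()),
)
clf.fit(texts, y)

def classify(comment: str) -> list[str]:
    """Return the predicted toxicity labels for a single comment."""
    flags = clf.predict([comment])[0]
    return [label for label, flag in zip(LABELS, flags) if flag]
```

A comment can then be classified with `classify("some comment text")`, which returns the subset of labels the model predicts; with real training data each label's classifier would be fit on thousands of annotated comments rather than this toy set.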

Classifying all comments:

*(screenshot: classifying all comments)*

Result after classifying:

*(screenshot: classification results)*