Demo video: `2023-07-09.14-06-19.mp4`
- The UI has been created using the Material-UI React library.
- A toggle feature has been implemented in the question-answer cards to hide and unhide the answers.
- Toast messages are displayed wherever necessary to indicate the completion of a process, for better accessibility.
- A Google quiz link is readily generated for assessment of the topic.
- The text corpus is taken as input from the user, and question-answer pairs are generated from that context using a T5 transformer model from Hugging Face for the task of context-aware question generation (a hedged sketch of this step follows the feature list).
- These question-answer pairs are then converted to declarative sentences using a T5 transformer fine-tuned on the QA2D dataset. For example, `{"Question": "What is your name?", "Answer": "Dhanesh"}` is converted to "My name is Dhanesh". A sketch of this step also follows the feature list.
- The fine-tuned model is pushed to the Hugging Face Hub for further usage.
- To reduce the inference time and the model size, I quantized the fine-tuned transformer using ONNX and the fastT5 library, which reduced the model from around 900 MB to around 400 MB (see the quantization sketch after this list). The models and relevant notebooks are uploaded in `Fine-tuning-notebook/Quantized-model`.
- The answer word is blanked and displayed in the web app with a toggle feature (blanking is shown at the end of the QA2D sketch below).
- Additionally, a Google quiz link containing all these questions is created using the Google Forms API in the backend, which can be shared to readily test the topic (an illustrative Forms API sketch follows the feature list).
I used Weights & Biases to log metrics like epochs, train_acc, valid_acc, BLEU score, etc. Click here to view them. A minimal logging sketch also appears below.
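
Below is a minimal sketch of the question-generation step described above, using the `transformers` library. The checkpoint name and the `"generate question:"` input prefix are placeholders/assumptions, not necessarily the exact model or prompt format this repo uses:

```python
# Hedged sketch: context-aware question generation with a T5 model from Hugging Face.
# "your-qg-t5-checkpoint" and the "generate question:" prefix are placeholders.
from transformers import T5ForConditionalGeneration, T5Tokenizer

qg_name = "your-qg-t5-checkpoint"
qg_tokenizer = T5Tokenizer.from_pretrained(qg_name)
qg_model = T5ForConditionalGeneration.from_pretrained(qg_name)

context = "Dhanesh is a CSE student at IIT Guwahati."
inputs = qg_tokenizer("generate question: " + context, return_tensors="pt", truncation=True)
output_ids = qg_model.generate(**inputs, max_length=64, num_beams=4)
print(qg_tokenizer.decode(output_ids[0], skip_special_tokens=True))
```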
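
Similarly, a hedged sketch of the QA2D step (question + answer to declarative sentence) and of blanking the answer word. The Hub id and the `question: ... answer: ...` input template are assumptions:

```python
# Hedged sketch: convert a QA pair to a declarative sentence with the fine-tuned T5,
# then blank the answer word. "username/t5-qa2d" is a placeholder Hub id and the
# "question: ... answer: ..." format is an assumed input template.
from transformers import T5ForConditionalGeneration, T5Tokenizer

qa2d_name = "username/t5-qa2d"
qa2d_tokenizer = T5Tokenizer.from_pretrained(qa2d_name)
qa2d_model = T5ForConditionalGeneration.from_pretrained(qa2d_name)

question, answer = "What is your name?", "Dhanesh"
inputs = qa2d_tokenizer(f"question: {question} answer: {answer}", return_tensors="pt")
ids = qa2d_model.generate(**inputs, max_length=64)
sentence = qa2d_tokenizer.decode(ids[0], skip_special_tokens=True)  # e.g. "My name is Dhanesh"

fill_up = sentence.replace(answer, "_____")  # blank the answer word for the web app
print(fill_up)  # "My name is _____"
```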
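
A hedged sketch of the ONNX quantization step with the fastT5 library: `export_and_get_onnx_model` exports the encoder/decoder to ONNX, quantizes them, and returns a model that still supports `generate()`. The checkpoint id is again a placeholder:

```python
# Hedged sketch: export the fine-tuned T5 to ONNX and quantize it with fastT5.
# "username/t5-qa2d" is a placeholder Hub id.
from fastT5 import export_and_get_onnx_model
from transformers import AutoTokenizer

name = "username/t5-qa2d"
quantized_model = export_and_get_onnx_model(name)  # exports + quantizes encoder/decoder
tokenizer = AutoTokenizer.from_pretrained(name)

tokens = tokenizer("question: What is your name? answer: Dhanesh", return_tensors="pt")
out = quantized_model.generate(input_ids=tokens["input_ids"],
                               attention_mask=tokens["attention_mask"],
                               max_length=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```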
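
The repo's backend creates the Google Form from Node (the `backend` folder), so the following is only an illustrative Python sketch of the Forms API (v1) flow: authorize with the `credentials.json` described in the setup steps, create a form, and add a question for one generated fill-up. The form title and question text are placeholders:

```python
# Illustrative sketch of the Google Forms API (v1) flow; the actual backend uses Node.
# Assumes the OAuth client file "credentials.json" described in the setup steps below.
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/forms.body"]
flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
creds = flow.run_local_server(port=0)
forms = build("forms", "v1", credentials=creds)

# Create an empty form, then add a short-answer item for one generated fill-up.
form = forms.forms().create(body={"info": {"title": "Text2Fillups quiz"}}).execute()
forms.forms().batchUpdate(
    formId=form["formId"],
    body={"requests": [{
        "createItem": {
            "item": {
                "title": "My name is _____",
                "questionItem": {"question": {"required": True,
                                              "textQuestion": {"paragraph": False}}},
            },
            "location": {"index": 0},
        }
    }]},
).execute()
print("Shareable quiz link:", form["responderUri"])
```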
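
And a minimal sketch of the Weights & Biases logging mentioned above; the project name and metric values are placeholders that would come from the actual training loop:

```python
# Minimal W&B logging sketch; project name and values are placeholders.
import wandb

run = wandb.init(project="text2fillups")
for epoch in range(3):
    train_acc, valid_acc, bleu = 0.0, 0.0, 0.0  # would come from the real training loop
    wandb.log({"epoch": epoch, "train_acc": train_acc, "valid_acc": valid_acc, "bleu": bleu})
run.finish()
```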
- First of all, `git clone` this repository and go to the appropriate folder on your local machine.
- Create an account on https://console.cloud.google.com/ and create a new project.
- In the `API & Services` section, enable the API for `Google Forms`.
- Generate a new `OAuth 2.0 Client ID` and download the JSON file containing `Client_ID` and `Client_Secrets`.
- Save this JSON as `credentials.json` in the `backend` folder.
- Navigate to the folder where this repository is cloned.
- Open the terminal and navigate to the `backend` directory using the command: `cd backend`.
- Install the required dependencies by running: `npm install`.
- Start the backend server by running: `npm start`.
- Install the Python dependencies by running: `pip install -r requirements.txt`.
- Run the Flask application by executing: `python app.py`.
- Open a new terminal window.
- Navigate to the `frontend` directory using the command: `cd frontend`.
- Install the necessary dependencies by running: `npm install`.
- Start the frontend development server by running: `npm start`.
That's it! You are now ready to use Text2Fillups and generate fill-in-the-blank questions from texts. Enjoy!
- Dhanesh, CSE, IIT Guwahati.
Don't forget to ⭐ star this repository to show your support and stay connected for future updates!