The implementation of DeBERTa
[NeurIPS 2021] "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up", Yifan Jiang, Shiyu Chang, Zhangyang Wang
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Implementation of Transformer Model in Tensorflow
[IGARSS'22]: A Transformer-Based Siamese Network for Change Detection
Seq2SeqSharp is a tensor-based, fast, and flexible deep neural network framework written in .NET (C#). Its highlights include automatic differentiation, multiple network types (Transformer, LSTM, BiLSTM, and so on), multi-GPU support, cross-platform support (Windows and Linux on x86, x64, and ARM), and multimodal models for text and images.
A BERT model pretrained on cybersecurity text, encoding cybersecurity domain knowledge.
Multi-module Recurrent Convolutional Neural Network with Transformer Encoder for ECG Arrhythmia Classification
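The general pattern behind such hybrids is a convolutional front-end that downsamples the raw signal before a Transformer encoder models long-range dependencies. A minimal PyTorch sketch of that pattern (a hypothetical architecture for illustration, not the paper's exact model):

```python
import torch
import torch.nn as nn

class ECGClassifier(nn.Module):
    """Hypothetical sketch: a 1D-conv front-end feeding a Transformer encoder,
    illustrating the CNN + encoder pattern (not the paper's exact architecture)."""
    def __init__(self, n_classes: int = 5, d_model: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, d_model, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
        )
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, signal_len) raw single-lead ECG
        feats = self.conv(x).transpose(1, 2)   # (batch, seq_len, d_model)
        enc = self.encoder(feats)
        return self.head(enc.mean(dim=1))      # pool over time, classify beat type

logits = ECGClassifier()(torch.randn(8, 1, 360))   # (8, 5)
```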
This repository contains PyTorch implementations of the models from the paper "MIME: MIMicking Emotions for Empathetic Response Generation".
CASPR is a deep learning framework applying transformer architecture to learn and predict from tabular data at scale.
A heart disease classification project using Transformer encoders in PyTorch.
Temporarily remove unused tokens during training to save RAM and speed up training.
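The underlying idea is to shrink the embedding matrix to only the token ids that actually occur in the training corpus, remapping ids before lookup and restoring the full table afterwards. A minimal PyTorch sketch of that idea (the function name and API are illustrative, not this repo's):

```python
import torch
import torch.nn as nn

def shrink_embedding(embedding: nn.Embedding, used_ids: torch.Tensor):
    """Build a smaller embedding holding only the rows for `used_ids`,
    plus an old-id -> new-id lookup. A sketch of the idea, not the repo's API."""
    new_emb = nn.Embedding(len(used_ids), embedding.embedding_dim)
    new_emb.weight.data.copy_(embedding.weight.data[used_ids])
    remap = {old.item(): new for new, old in enumerate(used_ids)}
    return new_emb, remap

vocab_size, dim = 50_000, 256
full = nn.Embedding(vocab_size, dim)
used = torch.tensor(sorted({1, 5, 42, 1337}))   # token ids that occur in the corpus
small, remap = shrink_embedding(full, used)
batch = torch.tensor([[remap[5], remap[42]]])   # remap ids before lookup
vectors = small(batch)                          # (1, 2, 256)
```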
This project implements Transformer encoder blocks using various positional encoding methods.
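As one example of such a method, the fixed sinusoidal encoding from "Attention Is All You Need" can be added to token embeddings before the encoder stack. A minimal PyTorch sketch (illustrative, not this project's code):

```python
import math
import torch
import torch.nn as nn

class SinusoidalPositionalEncoding(nn.Module):
    """Fixed sine/cosine positional encoding from "Attention Is All You Need"."""
    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)   # (max_len, 1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); add the encoding for the first seq_len positions
        return x + self.pe[: x.size(1)]

d_model = 128
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=8, batch_first=True),
    num_layers=2,
)
pos_enc = SinusoidalPositionalEncoding(d_model)
tokens = torch.randn(4, 32, d_model)   # a batch of already-embedded sequences
out = encoder(pos_enc(tokens))         # (4, 32, 128)
```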
Question knowledge-point prediction and annotation.
Generating English Rock lyrics using BERT
Transformer Encoder with Multiscale Deep Learning for Pain Classification Using Physiological Signals
Official PyTorch implementation of "Roles and Utilization of Attention Heads in Transformer-based Neural Language Models" (ACL 2020)
Code for the ACL 2019 paper "Observing Dialogue in Therapy: Categorizing and Forecasting Behavioral Codes"
Vision Transformer Implementation in TensorFlow
Contextual embedding for text blobs.
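One common recipe for this, shown here as an assumption rather than this repo's documented approach, is to mean-pool a pretrained BERT's last hidden states over the non-padding tokens via the Hugging Face transformers library:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    """Mean-pooled contextual embedding for one text blob (a generic recipe)."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)    # ignore padding positions
    return (hidden * mask).sum(1) / mask.sum(1)      # (1, 768)

vec = embed("Transformer encoders produce contextual embeddings.")
```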