PatentTransformer is our codename for "Augmented Inventing." The ultimate goal of this project is to help inventors conceive better inventions and draft quality patents. We leverage Transformer-based models, such as GPT-2 and BERT, for patent text generation and measurement. Our source code will be released soon.
- Patent Claim Generation by Fine-Tuning OpenAI GPT-2 (under review)
- Measuring Patent Claim Generation by Span Relevancy (To be published in the Proceedings of the Thirteenth International Workshop on Juris-informatics (JURISIN 2019), hosted by JSAI-isAI2019)
- Personalized Patent Claim Generation and Measurement (Best Doctoral Consortium paper at the 32nd International Conference on Legal Knowledge and Information Systems (JURIX 2019). To be published in the CEUR Workshop Proceedings)
- PatentBERT: Patent Classification with Fine-Tuning a pre-trained BERT Model (under review)
- PatentTransformer-2: Controlling Patent Text Generation by Structural Metadata (under review)