A service wrapping DeepPavlov NER ML models for quick entity extraction from the cells of long tabular data.
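For context, a minimal sketch of how such a wrapper might call DeepPavlov's NER models. The config name and call pattern follow DeepPavlov's public API, but batching table cells as independent texts is an illustrative assumption, not this project's actual code:

```python
from deeppavlov import build_model

# Load a pretrained NER config (weights download on first run).
# 'ner_ontonotes_bert' is one of DeepPavlov's stock configs; the
# service may wrap a different one.
ner = build_model("ner_ontonotes_bert", download=True)

# Assumed strategy for long tabular data: treat each cell as an
# independent text and run them through the model as one batch.
cells = ["Barack Obama visited Paris.", "Acme Corp reported Q3 earnings."]
tokens_batch, tags_batch = ner(cells)

for tokens, tags in zip(tokens_batch, tags_batch):
    print(list(zip(tokens, tags)))  # e.g. [('Barack', 'B-PERSON'), ...]
```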
This project demonstrates text summarization using the BART (Bidirectional and Auto-Regressive Transformers) model. BART is a transformer model trained as a denoising autoencoder and is effective for text generation tasks such as summarization.
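A minimal usage sketch with the Hugging Face `transformers` summarization pipeline; the `facebook/bart-large-cnn` checkpoint and the generation parameters are assumptions, and the project may use a different BART variant:

```python
from transformers import pipeline

# BART fine-tuned on CNN/DailyMail is a common public summarization
# checkpoint; shown here as a stand-in for the project's model.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "BART is a sequence-to-sequence transformer pretrained by corrupting "
    "text with a noising function and learning to reconstruct the "
    "original, which makes it well suited to generation tasks."
)

summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```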
BLIP Image Captioning + GPT-2 Happy Model: generate joyful responses to image captions using state-of-the-art NLP and computer vision. Pretrained models and data preprocessing are included for seamless integration, exploring the intersection of deep learning, sentiment analysis, and language generation.
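A sketch of the caption-then-generate flow this project describes. The public `Salesforce/blip-image-captioning-base` checkpoint, the input filename, and the prompt wording are assumptions, and stock `gpt2` stands in for the project's fine-tuned "happy" model:

```python
from PIL import Image
from transformers import (BlipForConditionalGeneration, BlipProcessor,
                          pipeline)

# Step 1: caption the image with BLIP.
processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
blip = BlipForConditionalGeneration.from_pretrained(
    "Salesforce/blip-image-captioning-base"
)

image = Image.open("photo.jpg").convert("RGB")  # hypothetical input file
inputs = processor(images=image, return_tensors="pt")
caption = processor.decode(
    blip.generate(**inputs, max_new_tokens=30)[0], skip_special_tokens=True
)

# Step 2: feed the caption to a GPT-2 text generator. The repo's
# fine-tuned "happy" checkpoint is replaced here by plain "gpt2".
generator = pipeline("text-generation", model="gpt2")
prompt = f"Photo caption: {caption}. A cheerful reaction:"
print(generator(prompt, max_new_tokens=40)[0]["generated_text"])
```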
FluxPipeline is an experimental prototype that provides a framework for working with the FLUX.1-schnell image generation model. It is intended for educational and experimental purposes only.
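For reference, a minimal sketch of driving FLUX.1-schnell through the `diffusers` library's own `FluxPipeline` class; this project's wrapper presumably builds on something similar, and the prompt and output filename are placeholders:

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16
)
pipe.enable_model_cpu_offload()  # trade speed for lower VRAM use

# schnell is distilled for few-step generation without classifier-free
# guidance, hence guidance_scale=0.0 and only 4 inference steps.
image = pipe(
    "a photo of a red bicycle leaning against a brick wall",
    guidance_scale=0.0,
    num_inference_steps=4,
).images[0]
image.save("flux_schnell_out.png")
```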