This is a PyQt-based application for performing Neural Style Transfer on images, videos, and live camera feeds. It leverages TensorFlow Hub for pre-trained models and CUDA for accelerated processing on compatible GPUs.
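Under the hood, styling with a TensorFlow Hub model boils down to loading a pre-trained stylization module and calling it on a content/style image pair. The following is a minimal, self-contained sketch of that technique, assuming the publicly available magenta arbitrary-stylization module and placeholder file names; it is not code from this repository.

```python
# Minimal sketch of TF Hub style transfer; the model URL is the public magenta
# module and the file names are placeholders -- not this repository's code.
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

def load_image(path, max_dim=512):
    """Load an image as a float32 batch in [0, 1], resized to fit within max_dim."""
    img = tf.io.decode_image(tf.io.read_file(path), channels=3,
                             dtype=tf.float32, expand_animations=False)
    scale = max_dim / max(img.shape[:2])
    img = tf.image.resize(img, [int(img.shape[0] * scale), int(img.shape[1] * scale)])
    return img[tf.newaxis, ...]  # add batch dimension: [1, H, W, 3]

hub_module = hub.load("https://tfhub.dev/google/magenta/arbitrary-image-stylization-v1-256/2")

content = load_image("content.jpg")
style = load_image("style.jpg", max_dim=256)  # this module was trained on 256x256 styles
stylized = hub_module(tf.constant(content), tf.constant(style))[0]  # [1, H, W, 3] in [0, 1]
tf.keras.utils.save_img("stylized.jpg", np.squeeze(stylized.numpy()))
```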
## Features

- Image Styling: Apply artistic styles to static images.
- Video Styling: Process and stylize videos frame by frame (see the frame-loop sketch after this list).
- Live Camera Feed: Real-time style transfer for live camera input.
- Model Flexibility: Select models from TensorFlow Hub or from your local system.
- Interactive UI:
  - Select input images and videos via file dialogs.
  - Preview and process results directly in the application.
  - Output display with zoom for images and playback for videos.
- Processing Status: Animated popups indicate processing progress.
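Video and live-camera styling both reduce to the same per-frame loop: read a frame, stylize it, write or display it. Below is a sketch with OpenCV that reuses the `hub_module` and `style` tensor from the snippet above; the function names are hypothetical, not this repository's API.

```python
# Hypothetical per-frame styling loop with OpenCV; reuses hub_module/style
# from the previous sketch. Not this repository's actual implementation.
import cv2
import numpy as np
import tensorflow as tf

def stylize_frame(hub_module, style, frame_bgr):
    """Stylize one BGR frame; style is a [1, h, w, 3] float32 tensor in [0, 1]."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
    out = hub_module(tf.constant(rgb[np.newaxis, ...]), style)[0]
    out = (np.clip(np.squeeze(out.numpy()), 0.0, 1.0) * 255).astype(np.uint8)
    return cv2.cvtColor(out, cv2.COLOR_RGB2BGR)

def stylize_video(hub_module, style, in_path, out_path):
    cap = cv2.VideoCapture(in_path)  # pass 0 instead of a path for a live camera
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # cameras often report 0 fps
    writer = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        styled = stylize_frame(hub_module, style, frame)
        if writer is None:  # size the writer from the first stylized frame
            h, w = styled.shape[:2]
            writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"),
                                     fps, (w, h))
        writer.write(styled)
    cap.release()
    if writer is not None:
        writer.release()
```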
## Requirements

- Python 3.8+
- CUDA Toolkit (optional, for GPU acceleration)
- Required Python libraries (install via `requirements.txt`)
## Installation

- Clone the repository:

  ```bash
  git clone https://github.com/jayeshrdeotalu/Nest-Neural-Style-Transfer.git
  cd Nest-Neural-Style-Transfer
  ```

- Install the dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Ensure CUDA is installed and properly configured for GPU acceleration (optional); a quick way to verify the setup is shown after this list.
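If GPU acceleration matters, it is worth confirming that TensorFlow actually sees a CUDA device before processing. A generic check (plain TensorFlow, not repository code):

```python
import tensorflow as tf
# A non-empty list means TensorFlow found a usable CUDA GPU.
print(tf.config.list_physical_devices("GPU"))
```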
## Usage

- Run the application:

  ```bash
  python app.py
  ```

- Select the operation:
  - Style an Image: Choose an input image and a style model.
  - Style a Video: Select a video and apply the chosen style.
- Process the selected input:
  - Click "Process the Styling" to apply the effect.
  - View the results directly in the application.
- Save outputs:
  - Output is saved in the same folder as the input (see the path sketch below).
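As an illustration of the output convention, here is one way a sibling output path can be derived from the input path; the `_styled` suffix is an assumption for this sketch, not necessarily the app's actual naming scheme.

```python
import os

def output_path(input_path, suffix="_styled"):
    # Place the output next to the input,
    # e.g. /photos/cat.jpg -> /photos/cat_styled.jpg
    root, ext = os.path.splitext(input_path)
    return f"{root}{suffix}{ext}"
```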
## License

This project is licensed under the MIT License.
## Acknowledgments

- TensorFlow Hub: For providing pre-trained models.
- PyQt: For the UI framework.
- CUDA: For GPU acceleration.