Multi Model Server is a tool for serving neural net models for inference
A general-purpose deep learning inference tool for quickly putting deep learning models trained with TensorFlow, PyTorch, or Caffe into production.
TypeQL: the power of programming, in your database
A universal scalable machine learning model deployment solution
A scalable, high-performance serving system for federated learning models
The Qualcomm® AI Hub apps are a collection of state-of-the-art machine learning models optimized for performance (latency, memory etc.) and ready to deploy on Qualcomm® devices.
This code is used to build & run a Docker container for performing predictions against a Spark ML Pipeline.
📚 🔬 PIA - Protein Inference Algorithms
A tool to help adapt code bases to the NullAway type system.
Exposes a serialized machine learning model through an HTTP API (see the HTTP serving sketch after this list).
Pure Java Llama2 inference with optional multi-GPU CUDA implementation
Crema: Credal Models Algorithms
Machine learning implemented in an Android application 🐳
TypeDB: a strongly-typed database
Apache NiFi Processor For Apache MXNet Inference
Enumeration algorithm for Bayesian network inference, implemented in Java (a minimal sketch follows below)
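The enumeration entry above refers to the standard inference-by-enumeration procedure for Bayesian networks: for each value of the query variable, sum the joint probability over all assignments of the hidden variables, then normalise. Below is a minimal, self-contained Java sketch over a made-up two-node network (Rain -> WetGrass); the network structure, probabilities, and class names are illustrative assumptions, not code from the listed repository.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Exact inference by enumeration over a tiny, hard-coded Bayesian network:
//   Rain -> WetGrass.  P(Rain)=0.2, P(WetGrass|Rain)=0.9, P(WetGrass|~Rain)=0.1.
// The network and probabilities are illustrative assumptions.
public class EnumerationInference {

    // Topological order of the variables.
    static final List<String> VARS = List.of("Rain", "WetGrass");

    // Conditional probability P(var = true | assignment of its parents).
    static double prob(String var, Map<String, Boolean> assignment) {
        switch (var) {
            case "Rain":
                return 0.2;
            case "WetGrass":
                return assignment.get("Rain") ? 0.9 : 0.1;
            default:
                throw new IllegalArgumentException("Unknown variable: " + var);
        }
    }

    // Sum of the joint probability over all variables not yet assigned.
    static double enumerateAll(List<String> vars, Map<String, Boolean> assignment) {
        if (vars.isEmpty()) return 1.0;
        String first = vars.get(0);
        List<String> rest = vars.subList(1, vars.size());

        if (assignment.containsKey(first)) {
            double p = assignment.get(first) ? prob(first, assignment)
                                             : 1 - prob(first, assignment);
            return p * enumerateAll(rest, assignment);
        }
        // Hidden variable: sum over both of its values.
        double total = 0;
        for (boolean value : new boolean[]{true, false}) {
            Map<String, Boolean> extended = new HashMap<>(assignment);
            extended.put(first, value);
            double p = value ? prob(first, extended) : 1 - prob(first, extended);
            total += p * enumerateAll(rest, extended);
        }
        return total;
    }

    // P(query = true | evidence), normalised over both query values.
    static double query(String queryVar, Map<String, Boolean> evidence) {
        Map<String, Boolean> withTrue = new HashMap<>(evidence);
        withTrue.put(queryVar, true);
        Map<String, Boolean> withFalse = new HashMap<>(evidence);
        withFalse.put(queryVar, false);
        double pTrue = enumerateAll(VARS, withTrue);
        double pFalse = enumerateAll(VARS, withFalse);
        return pTrue / (pTrue + pFalse);
    }

    public static void main(String[] args) {
        Map<String, Boolean> evidence = new HashMap<>();
        evidence.put("WetGrass", true);
        // P(Rain | WetGrass=true) = 0.2*0.9 / (0.2*0.9 + 0.8*0.1) ≈ 0.692
        System.out.println("P(Rain | WetGrass=true) = " + query("Rain", evidence));
    }
}
```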
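The "serialized machine learning model through an HTTP API" entry describes a common serving pattern: load a trained model once and expose a single prediction endpoint. Here is a hypothetical sketch of that pattern using the JDK's built-in com.sun.net.httpserver; the Model interface, the toy scoring function, and the /predict route are assumptions for illustration, not the API of any listed project.

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Hypothetical stand-in for a deserialized model; a real service would load
// a trained model (e.g. via ObjectInputStream or a framework-specific loader).
interface Model {
    double predict(double[] features);
}

public class ModelServer {
    public static void main(String[] args) throws IOException {
        // Toy model for illustration: scores the sum of the input features.
        Model model = features -> {
            double sum = 0;
            for (double f : features) sum += f;
            return sum;
        };

        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/predict", (HttpExchange exchange) -> {
            // Expect a comma-separated feature vector in the request body,
            // e.g. "1.0,2.5,3.0".
            String body = new String(exchange.getRequestBody().readAllBytes(),
                    StandardCharsets.UTF_8).trim();
            String[] parts = body.split(",");
            double[] features = new double[parts.length];
            for (int i = 0; i < parts.length; i++) {
                features[i] = Double.parseDouble(parts[i].trim());
            }

            String response = Double.toString(model.predict(features));
            byte[] bytes = response.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, bytes.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(bytes);
            }
        });
        server.start();
        System.out.println("Listening on http://localhost:8080/predict");
    }
}
```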