- triton-inference-server/server: The Triton Inference Server provides an optimized cloud and edge inferencing solution.
- SeldonIO/seldon-core: An MLOps framework to package, deploy, monitor and manage thousands of production machine learning models.
- awslabs/multi-model-server: Multi Model Server is a tool for serving neural net models for inference.
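Triton (and, in its v2 line, Seldon Core) expose a REST API based on the KServe V2 / Open Inference Protocol, where an inference request is a JSON body with an `inputs` list of named tensors. As a minimal sketch, the helper and names below (`build_v2_infer_request`, `input__0`, the model itself) are hypothetical illustrations, not part of any server's SDK:

```python
import json

def build_v2_infer_request(input_name, data, datatype="FP32"):
    """Build a KServe V2 (Open Inference Protocol) JSON request body.

    `input_name` and the flat `data` list are caller-supplied; the shape
    is taken to be a 1-D tensor here for simplicity.
    """
    return {
        "inputs": [
            {
                "name": input_name,       # must match the model's input name
                "shape": [len(data)],     # 1-D shape inferred from the data
                "datatype": datatype,     # e.g. FP32, INT64
                "data": data,             # flat list of tensor values
            }
        ]
    }

# Hypothetical example: a 4-element FP32 input for a model served by Triton.
body = build_v2_infer_request("input__0", [1.0, 2.0, 3.0, 4.0])
print(json.dumps(body))
```

Such a body would typically be POSTed to `/v2/models/<model_name>/infer` on a running Triton server; the exact input names and shapes depend on the deployed model's configuration.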