
TorchServe

Bapun Raz

Deploying PyTorch models for inference at scale using TorchServe | Amazon Web Services

AWS is excited to share the experimental release of TorchServe, an open-source model serving library for PyTorch.

AWS developed TorchServe in partnership with Facebook. AWS and Facebook will maintain and continue contributing to TorchServe, along with the broader PyTorch community. With over 83% of the cloud-based PyTorch projects happening on AWS, we are excited to launch TorchServe to address the difficulty of deploying PyTorch models. With TorchServe, you can deploy PyTorch models in either eager or graph mode using TorchScript, serve multiple models simultaneously, version production models for A/B testing, load and unload models dynamically, and monitor detailed logs and customizable metrics.
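The workflow described above can be sketched with TorchServe's own CLI tools. This is a minimal example assuming a DenseNet-161 image classifier; the model name, file names, and version number are illustrative placeholders, not part of the announcement.

```shell
# Package a trained model into a .mar archive with torch-model-archiver
# (the serialized-file and model-file paths are hypothetical examples)
torch-model-archiver --model-name densenet161 \
    --version 1.0 \
    --model-file model.py \
    --serialized-file densenet161.pth \
    --handler image_classifier

# Start the server, pointing it at a directory of model archives
mkdir -p model_store && mv densenet161.mar model_store/
torchserve --start --model-store model_store --models densenet161=densenet161.mar

# Inference API (default port 8080): send an image for prediction
curl http://127.0.0.1:8080/predictions/densenet161 -T kitten.jpg

# Management API (default port 8081): register or unregister models
# at runtime, which enables the dynamic load/unload mentioned above
curl -X DELETE http://127.0.0.1:8081/models/densenet161/1.0

torchserve --stop
```

Passing `--version` to the archiver is what enables serving multiple versions of the same model side by side for A/B testing via the management API.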

pytorch/serve
 