TorchServe

  • Thread starter: Bapun
  • Replies: 0
  • Views: 397

Deploying PyTorch models for inference at scale using TorchServe | Amazon Web Services

AWS is excited to share the experimental release of TorchServe, an open-source model serving library for PyTorch.

AWS developed TorchServe in partnership with Facebook. AWS and Facebook will maintain and continue contributing to TorchServe, along with the broader PyTorch community. With over 83% of the cloud-based PyTorch projects happening on AWS, we are excited to launch TorchServe to address the difficulty of deploying PyTorch models. With TorchServe, you can deploy PyTorch models in either eager or graph mode using TorchScript, serve multiple models simultaneously, version production models for A/B testing, load and unload models dynamically, and monitor detailed logs and customizable metrics.
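The deployment workflow described above can be sketched with TorchServe's command-line tools: package a model into a `.mar` archive, start the server, call the inference API, and use the management API to load and unload models at runtime. The model names and file paths below (`densenet161.pth`, `kitten.jpg`, `resnet18.mar`) are illustrative placeholders, not part of the announcement:

```shell
# Package a trained model into a .mar archive.
# torch-model-archiver ships alongside TorchServe; file names here are placeholders.
torch-model-archiver --model-name densenet161 \
    --version 1.0 \
    --serialized-file densenet161.pth \
    --handler image_classifier \
    --extra-files index_to_name.json \
    --export-path model_store

# Start the server and load the archived model from the model store.
torchserve --start --model-store model_store --models densenet161=densenet161.mar

# Inference API (default port 8080): send an image for prediction.
curl http://127.0.0.1:8080/predictions/densenet161 -T kitten.jpg

# Management API (default port 8081): register a second model dynamically,
# then unregister it -- this is the dynamic load/unload the post mentions.
curl -X POST "http://127.0.0.1:8081/models?url=resnet18.mar"
curl -X DELETE "http://127.0.0.1:8081/models/resnet18"

# Stop the server.
torchserve --stop
```

The `--version` flag on the archiver is what enables keeping multiple versions of a model side by side for A/B testing via the management API.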

GitHub: pytorch/serve