TorchServe

  • Thread starter: Bapun
  • Replies: 0
  • Views: 369

Deploying PyTorch models for inference at scale using TorchServe | Amazon Web Services

AWS is excited to share the experimental release of TorchServe, an open-source model serving library for PyTorch.

AWS developed TorchServe in partnership with Facebook. AWS and Facebook will maintain and continue contributing to TorchServe, along with the broader PyTorch community. With over 83% of cloud-based PyTorch projects running on AWS, we are excited to launch TorchServe to address the difficulty of deploying PyTorch models. With TorchServe, you can deploy PyTorch models in either eager or graph mode using TorchScript, serve multiple models simultaneously, version production models for A/B testing, load and unload models dynamically, and monitor detailed logs and customizable metrics.
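The "eager or graph mode" distinction above comes down to how the model is saved before serving. A minimal sketch of the graph-mode path, compiling an (entirely hypothetical) model to TorchScript so the serving process can load it without the original Python class definition:

```python
import torch
import torch.nn as nn

# Hypothetical toy model; any nn.Module works the same way.
class TinyClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return torch.softmax(self.fc(x), dim=-1)

model = TinyClassifier().eval()

# Graph mode: compile to TorchScript and serialize to a standalone file.
scripted = torch.jit.script(model)
scripted.save("tiny_classifier.pt")

# Sanity check: scripted and eager outputs agree on the same input.
x = torch.randn(1, 4)
print(torch.allclose(model(x), scripted(x)))
```

In eager mode you would instead hand TorchServe the state dict plus the model file defining the class; graph mode trades that flexibility for a self-contained artifact.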

pytorch/serve
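A rough sketch of the serving workflow using the CLI tools shipped in that repo (`torch-model-archiver` and `torchserve`). The model name, file names, and handler choice here are illustrative assumptions, not values from the announcement, and the commands assume TorchServe is installed and a serialized model file exists:

```shell
# Package a serialized model into a .mar archive for the model store.
# "base_handler" is one of TorchServe's built-in handler names; a custom
# handler script can be supplied instead.
torch-model-archiver \
  --model-name tiny_classifier \
  --version 1.0 \
  --serialized-file tiny_classifier.pt \
  --handler base_handler \
  --export-path model_store

# Start the server and load the model from the store.
torchserve --start --model-store model_store \
  --models tiny_classifier=tiny_classifier.mar

# Inference API (default port 8080): send an input, get a prediction.
curl http://localhost:8080/predictions/tiny_classifier -T sample_input.json

# Management API (default port 8081): models can be registered and
# unregistered dynamically, which is how versioned A/B rollouts work.
curl -X DELETE http://localhost:8081/models/tiny_classifier
```

The split between the inference port and the management port is what makes the dynamic load/unload and A/B versioning mentioned above possible without restarting the server.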
 