
446 CHAPTER 15 Deploying to production

What deploying to production means will vary with the use case:

• Perhaps the most natural deployment for the models we developed in part 2 would be to set up a network service providing access to our models. We'll do this in two versions using lightweight Python web frameworks: Flask (http://flask.pocoo.org) and Sanic (https://sanicframework.org). The first is arguably one of the most popular of these frameworks, and the latter is similar in spirit but takes advantage of Python's new async/await support for asynchronous operations for efficiency.

• We can export our model to a well-standardized format that allows us to ship it using optimized model processors, specialized hardware, or cloud services. For PyTorch models, the Open Neural Network Exchange (ONNX) format fills this role.

• We may wish to integrate our models into larger applications. For this it would be handy if we were not limited to Python. Thus we will explore using PyTorch models from C++ with the idea that this also is a stepping-stone to any language.

• Finally, for some things like the image zebraification we saw in chapter 2, it may be nice to run our model on mobile devices. While it is unlikely that you will have a CT module for your mobile, other medical applications like do-it-yourself skin screenings may be more natural, and the user might prefer running on the device versus having their skin sent to a cloud service. Luckily for us, PyTorch has gained mobile support recently, and we will explore that.

As we learn how to implement these use cases, we will use the classifier from chapter 14 as our first example for serving, and then switch to the zebraification model for the other bits of deployment.

15.1 Serving PyTorch models

We’ll begin with what it takes to put our model on a server. Staying true to our hands-on approach, we’ll start with the simplest possible server. Once we have something basic that works, we’ll take a look at its shortfalls and take a stab at resolving them. Finally, we’ll look at what is, at the time of writing, the future. Let’s get something that listens on the network.¹

15.1.1 Our model behind a Flask server

Flask is one of the most widely used Python modules. It can be installed using pip:²

pip install Flask

¹ To play it safe, do not do this on an untrusted network.

² Or pip3 for Python 3. You also might want to run it from a Python virtual environment.
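Once Flask is installed, a minimal server might look like the sketch below. This is not the book's actual listing: the model here is a trivial placeholder function, and the route name /predict and helper run_model are made-up names for illustration.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder standing in for a real PyTorch model; in the chapter this
# role is played by the chapter 14 classifier.
def run_model(payload):
    values = payload.get("values", [])
    return {"prediction": sum(values)}

@app.route("/predict", methods=["POST"])
def predict():
    # Parse the JSON request body and hand it to the (placeholder) model.
    result = run_model(request.get_json(force=True))
    return jsonify(result)

# To serve (only on a trusted network, per footnote 1):
# app.run(host="127.0.0.1", port=8000)
```

A client would then POST a JSON body to /predict and get the model's answer back as JSON; the real version swaps the placeholder for an actual model invocation.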
