
[Figure: Luna model architecture, showing the tail, backbone, and head. The input image (1 channel, 32 × 48 × 48) shrinks as it passes through the backbone while the channel count grows: 8 channels at 16 × 24 × 24, 16 channels at 8 × 12 × 12, 32 channels at 4 × 6 × 6, and 64 channels at 2 × 3 × 3.]

Figure 11.5 The architecture of the LunaModel class, consisting of a batch-normalization tail, a four-block backbone, and a head composed of a linear layer followed by softmax

11.4.1 The core convolutions

Classification models often have a structure that consists of a tail, a backbone (or body), and a head. The tail is the first few layers that process the input to the network. These early layers often have a different structure or organization than the rest of the network, as they must adapt the input to the form expected by the backbone. Here we use a simple batch normalization layer, though often the tail contains convolutional layers as well. Such convolutional layers are often used to aggressively downsample the size of the image; since our image size is already small, we don't need to do that here.
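To make the tail concrete, a batch-normalization-only tail amounts to a single nn.BatchNorm3d layer applied to the one-channel input volume. The following is a minimal sketch, assuming a single input channel; the class and attribute names are illustrative rather than taken from the chapter's listings.

import torch.nn as nn

class BatchNormTail(nn.Module):
    """Illustrative tail: normalize the raw one-channel 3D input."""
    def __init__(self):
        super().__init__()
        # BatchNorm3d(1) normalizes the single input channel across the batch
        self.tail_batchnorm = nn.BatchNorm3d(1)

    def forward(self, input_batch):
        # input_batch has shape (N, 1, D, H, W)
        return self.tail_batchnorm(input_batch)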

Next, the backbone of the network typically contains the bulk of the layers, which are usually arranged in a series of blocks. Each block has the same (or at least a similar) set of layers, though often the size of the expected input and the number of filters changes from block to block. We will use a block that consists of two 3 × 3 convolutions, each followed by an activation, with a max-pooling operation at the end of the block. We can see this in the expanded view of figure 11.5, labeled Block[block1]. Here's what the implementation of the block looks like in code.

Listing 11.6 model.py:67, class LunaBlock

class LunaBlock(nn.Module):
    def __init__(self, in_channels, conv_channels):
        super().__init__()
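The excerpt above stops after the constructor's opening lines. Based on the block description in the text (two 3 × 3 convolutions, each followed by an activation, and a max pool at the end of the block), the full block might look roughly like the sketch below; the layer attribute names and the choice of ReLU as the activation are assumptions, not a verbatim continuation of the listing.

import torch.nn as nn

class LunaBlock(nn.Module):
    def __init__(self, in_channels, conv_channels):
        super().__init__()
        # Two 3x3x3 convolutions with padding=1, so the spatial size is unchanged
        self.conv1 = nn.Conv3d(
            in_channels, conv_channels, kernel_size=3, padding=1, bias=True)
        self.relu1 = nn.ReLU(inplace=True)
        self.conv2 = nn.Conv3d(
            conv_channels, conv_channels, kernel_size=3, padding=1, bias=True)
        self.relu2 = nn.ReLU(inplace=True)
        # 2x2x2 max pooling at the end of the block halves each spatial dimension
        self.maxpool = nn.MaxPool3d(2, 2)

    def forward(self, input_batch):
        block_out = self.relu1(self.conv1(input_batch))
        block_out = self.relu2(self.conv2(block_out))
        return self.maxpool(block_out)

Stacking four such blocks, with the channel count doubling from one block to the next, would reproduce the progression of sizes shown in figure 11.5.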
