Deep Learning with PyTorch
Eli Stevens, Luca Antiga, Thomas Viehmann

Front matter:
- Dedication
- Contents
- Foreword
- Preface
- Acknowledgments
- About this book
- About the cover illustration
Part 1: Core PyTorch

Chapter 1: Introducing deep learning and the PyTorch Library
- The deep learning revolution
- Why PyTorch?
- An overview of how PyTorch supports deep learning projects
- Hardware and software requirements

Chapter 2: Pretrained networks
- A pretrained network that recognizes the subject of an image
- A pretrained model that fakes it until it makes it
- A pretrained network that describes scenes
- Torch Hub

Chapter 3: It starts with a tensor
- The world as floating-point numbers
- Tensors: Multidimensional arrays
- Named tensors
- Tensor element types
- Tensors: Scenic views of storage
- Tensor metadata: Size, offset, and stride
- Moving tensors to the GPU
- NumPy interoperability
- Generalized tensors are tensors, too
- Serializing tensors

Chapter 4: Real-world data representation using tensors
- Working with images
- 3D images: Volumetric data
- Representing tabular data
- Working with time series
- Representing text

Chapter 5: The mechanics of learning
- A timeless lesson in modeling
- Learning is just parameter estimation
- Less loss is what we want
- Down along the gradient
- PyTorch's autograd: Backpropagating all things

Chapter 6: Using a neural network to fit the data
- Artificial neurons
- The PyTorch nn module
- Finally a neural network

Chapter 7: Telling birds from airplanes: Learning from images
- A dataset of tiny images
- Distinguishing birds from airplanes

Chapter 8: Using convolutions to generalize
- The case for convolutions
- Convolutions in action
- Subclassing nn.Module
- Training our convnet
- Model design
Part 2: Learning from images in the real world: Early detection of lung cancer

Chapter 9: Using PyTorch to fight cancer
- Preparing for a large-scale project
- What is a CT scan, exactly?
- The project: An end-to-end detector for lung cancer

Chapter 10: Combining data sources into a unified dataset
- Parsing LUNA's annotation data
- Loading individual CT scans
- Locating a nodule using the patient coordinate system
- A straightforward dataset implementation

Chapter 11: Training a classification model to detect suspected tumors
- A foundational model and training loop
- The main entry point for our application
- Pretraining setup and initialization
- Our first-pass neural network design
- Training and validating the model
- Outputting performance metrics
- Running the training script
- Graphing training metrics with TensorBoard
- Why isn't the model learning to detect nodules?

Chapter 12: Improving training with metrics and augmentation
- High-level plan for improvement
- Good dogs vs. bad guys: False positives and false negatives
- Graphing the positives and negatives
- What does an ideal dataset look like?
- Revisiting the problem of overfitting
- Preventing overfitting with data augmentation

Chapter 13: Using segmentation to find suspected nodules
- Adding a second model to our project
- Semantic segmentation: Per-pixel classification
- Updating the model for segmentation
- Updating the dataset for segmentation
- Updating the training script for segmentation
- Results

Chapter 14: End-to-end nodule analysis, and where to go next
- Towards the finish line
- Independence of the validation set
- Bridging CT segmentation and nodule candidate classification
- Predicting malignancy
- What we see when we diagnose
- What next? Additional sources of inspiration
Part 3: Deployment

Chapter 15: Deploying to production
Index