Deep-Learning-with-PyTorch

INDEX 487

padded convolutions 292
padding 362
padding flag 370
pandas library 41, 78, 377
parallelism 53
parameter estimation 106–109
  choosing linear model 108–109
  data gathering 107–108
  data visualization 108
  example problem 107
parameter groups 427
parameters 120, 145, 160, 188, 196, 225, 397
parameters() method 156, 188, 210
params tensor 124, 126, 129
parser.add_argument 352
patient coordinate system 266–267
  converting between millimeters and voxel addresses 268–270
  CT scan shape and voxel sizes 267–268
  extracting nodules from CT scans 270–271
  overview 265–267
penalization terms 134
permute method 73, 170
pickle library 397
pin_memory option 216
points tensor 46, 57, 64
points_gpu tensor 64
pointwise ops 53
pooling 203–204
pos_list 383, 418
pos_ndx 340
pos_t 382
positive loss 344
positive_mask 376
POST route 447
PR (Precision-Recall) Curves 311
precision 326
  implementing in logMetrics 327–328
  updating logging output to include 332
predict method 209
Predicted Nodules 395, 416
prediction images 393
prediction_a 394
prediction_devtensor 390
prepcache script 376, 440
preprocess function 23
pretext tasks 436
pretrained keyword argument 36
pretrained networks 423
  describing content of images 33–35
  fabricating false images from real images 27–33
    CycleGAN 29–30
    GAN game 28
    generating images 30–33
  recognizing subject of images 17–27
    AlexNet 20–22
    inference 25–27
    obtaining 19–20
    ResNet 22–27
  Torch Hub 35–37
principled augmentation 222
Project Gutenberg 94
PyLIDC library 417–418
pyplot.figure 431
Python, list indexing in 42
PyTorch 6
  functional API 210–212
  how it supports deep learning projects 10–13
  keeping track of parameters and submodules 209–210
  reasons for using 7–9
PyTorch JIT 458–465
  dual nature of PyTorch as interface and backend 460
  expectations 458–460
  scripting gaps of traceability 464–465
  TorchScript 461–464
PyTorch models
  enterprise serving of 476
  exporting 455–458
    ONNX 455–456
    tracing 456–458
  serving 446–454
    Flask server 446–448
    goals of deployment 448–449
    request batching 449–454
PyTorch Serving 476
pytorch_android library 473
pytorch_android_torchvision 473

Q

quantization 475–476
quantized tensors 65
queue_lock 452

R

random sampling 53
random_float function 349
random.random() function 307
randperm function 134
range indexing 46
ratio_int 339–340
recall 324
  implementing in logMetrics 327–328
  updating logging output to include 332
recurrent neural network 34
RedisAI 476
reduced training 475
reduction ops 53
refine_names method 48
regression problems 107
regularization 219–223
  augmentation and 435
  batch normalization 222–223
  dropout 220–222
  weight penalties 219–220
ReLU (rectified linear unit) 147, 224
rename method 48
request batching 449–454
  from request to queue 452–453
  implementation 451–452
  running batches from queue 453–454
RequestProcessor 450, 452
requireOnDisk_bool parameter 260
requires_grad attribute 138
requires_grad=True argument 124
residual networks 224
ResNet 19, 225
  creating network instance 22
  details about structure of 22–25
  inference 25–27
resnet variable 23
resnet101 function 22
resnet18 function 36
ResNetGenerator class 30
