The Most Complete PyTorch Guide for Data Scientists (1)

{"type":"doc","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"進行深度學習時您將需要的所有PyTorch功能。從實驗/研究的角度來看。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"PyTorch"},{"type":"text","text":" 已經成爲現在創建神經網絡的事實上的標準之一,我喜歡它的界面。但是,對於初學者來說,要獲得它有些困難。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"我記得幾年前經過一些廣泛的實驗之後才選擇PyTorch。實話實說,我花了很多時間才撿起來,但我很高興我從Keras搬到 PyTorch。"},{"type":"text","marks":[{"type":"strong"}],"text":" 憑藉其高度可定製性和python語法,"},{"type":"text","text":" PyTorch"},{"type":"text","marks":[{"type":"strong"}],"text":"可以與 "},{"type":"text","text":"他人一起工作,這是我的榮幸,我將其推薦給任何希望通過深度學習進行繁重工作的人。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"因此,在本PyTorch指南中, "},{"type":"text","marks":[{"type":"strong"}],"text":"我將嘗試減輕PyTorch對於初學者的痛苦,並介紹在使用Pytorch"},{"type":"text","text":" 創建任何神經網絡時需要的一些最重要的類和模塊。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"但是,這並不是說它僅針對初學者,因爲 "},{"type":"text","marks":[{"type":"strong"}],"text":"我還將談論"},{"type":"text","text":" 
"},{"type":"text","marks":[{"type":"strong"}],"text":"PyTorch提供的高可定製性,並談論自定義的Layers,Datasets,Dataloaders和Loss函數"},{"type":"text","text":"。"}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"張量"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"張量是PyTorch的基本構建塊,簡單地說,它們是NumPy數組,但在GPU上。在這一部分中,我將列出一些在使用Tensors時可以使用的最常用的操作。這絕不是張量可以執行的詳盡操作列表,但是在進行更令人興奮的部分之前瞭解張量是有幫助的。"}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"1.創建張量"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"我們可以通過多種方式創建PyTorch張量。這包括從NumPy數組轉換爲張量。下面只是一個要點,下面是一些示例,但是您可以  像使用NumPy數組一樣使用張量來做更多的事情。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/e1/e1b264a6cf3ea1194952f9a360749ecb.png","alt":null,"title":null,"style":null,"href":null,"fromPaste":true,"pastePass":true}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/8e/8e10ffaeaa154cd7296dfdb3315fd7a0.png","alt":null,"title":null,"style":null,"href":null,"fromPaste":true,"pastePass":true}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"2.張量運算"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"同樣,您可以對這些張量執行很多操作。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/47/474c4d2882275451e0697c63594d4d42.png","alt":null,"tit
le":null,"style":null,"href":null,"fromPaste":true,"pastePass":true}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/df/df83865162c05f4f3c07aa516f188593.png","alt":null,"title":null,"style":null,"href":null,"fromPaste":true,"pastePass":true}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","marks":[{"type":"strong"}],"text":"注意: "},{"type":"text","text":"什麼是PyTorch變量?在以前的Pytorch版本中,Tensor和Variables曾經是不同的,並且提供了不同的功能,但是現在不贊成使用Variable API ,並且所有用於Tensors的變量 方法都可以使用。因此,如果您不瞭解它們,那很好,因爲它們不是必需的,如果您瞭解它們,則可以將它們忘記。"}]},{"type":"heading","attrs":{"align":null,"level":3},"content":[{"type":"text","text":"nn.模塊"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"但是話又說回來,如果Pytorch沒有提供很多現成的層,而這些層在各種神經網絡體系結構中非常頻繁地使用,則Pytorch不會被廣泛使用。一些例子是:nn.Linear,nn.Conv2d,nn.MaxPool2d,nn.ReLU,  nn.BatchNorm2d,nn.Dropout,nn.Embedding,  ,,  
![](https://static001.geekbang.org/infoq/e2/e2551f8196a0cdc7e6e4311b0b635c18.png)

![](https://static001.geekbang.org/infoq/2b/2bb2ad506c8331a273154851511fe975.png)

Here we define a very simple network that takes an input of size 784 and passes it through two linear layers in a sequential manner. But note that we can define any sort of computation in the forward pass, which makes PyTorch highly customizable for research purposes. For example, in crazy-experimentation mode we might have used the network below, where we arbitrarily attach layers: here we send the output of the second linear layer back through the first one again, after adding the input back to it (a skip connection).

![](https://static001.geekbang.org/infoq/41/41287ad11cdd33195570ba26fb3f62b5.png)

We can also check that the forward pass of the network works. I usually do this by first creating some random input and passing it through the network I have built.

![](https://static001.geekbang.org/infoq/5e/5e86a844a2d5c77760d5d24e35eab53f.png)
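The network in the screenshots is not reproduced as text, so here is a sketch of a two-linear-layer network of the kind described above, together with the random-input smoke test; the hidden size of 256 is my own assumption:

```python
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    """Roughly the network described above: 784 in, two linear layers."""
    def __init__(self):
        super().__init__()
        self.lin1 = nn.Linear(784, 256)
        self.lin2 = nn.Linear(256, 10)

    def forward(self, x):
        # Any computation can go here -- that freedom is the point
        x = torch.relu(self.lin1(x))
        return self.lin2(x)

# Check the forward pass with random input, as suggested above
net = SimpleNet()
x = torch.randn(32, 784)  # a fake batch of 32 samples
out = net(x)
```

If the shapes are wrong anywhere in `forward`, this one-liner check fails immediately, which is much cheaper than discovering it mid-training.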
":"text","text":"關於層的一句話"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"Pytorch非常強大,您實際上可以使用自己創建任何新的實驗層 nn.Module。例如,而不是使用預定義的線性層 nn.Linear。從Pytorch以上,我們可以已經創建了 "},{"type":"text","marks":[{"type":"strong"}],"text":"定製線性層"},{"type":"text","text":"。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/9d/9dfa2a178151f62c0ffd2b74400e65bf.png","alt":null,"title":null,"style":null,"href":null,"fromPaste":true,"pastePass":true}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"您將看到如何在中包裝權重張量。nn.Parameter.這樣做是爲了使張量被視爲模型參數。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"blockquote","content":[{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"參數是 "},{"type":"text","marks":[{"type":"strong"}],"text":"Tensor"},{"type":"text","text":"子類,當與"},{"type":"text","marks":[{"type":"strong"}],"text":"Module"},{"type":"text","text":"-一起使用時具有非常特殊的屬性 -當將它們分配爲模塊屬性時,它們會自動添加到其參數列表中,並將出現在 
"},{"type":"text","marks":[{"type":"strong"}],"text":"parameters()"},{"type":"text","text":"迭代器中。"}]}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"稍後您將看到,model.parameters()迭代器將成爲優化器的輸入。但是稍後會更多。現在,我們現在可以在任何PyTorch網絡中使用此自定義層,就像其他任何層一樣。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"image","attrs":{"src":"https://static001.geekbang.org/infoq/9e/9e723da2cbf8103d929828d5618c2dbf.png","alt":null,"title":null,"style":null,"href":null,"fromPaste":true,"pastePass":true}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"但是話又說回來,如果Pytorch沒有提供很多現成的層,而這些層在各種神經網絡體系結構中非常頻繁地使用,則Pytorch不會被廣泛使用。一些例子是:nn.Linear,nn.Conv2d,nn.MaxPool2d,nn.ReLU,  nn.BatchNorm2d,nn.Dropout,nn.Embedding,nn.GRU/nn.LSTM,nn.Softmax,nn.LogSoftmax,nn.MultiheadAttention,nn.TransformerEncoder,nn.TransformerDecoder"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null},"content":[{"type":"text","text":"以上就是Torch的基礎操作,下一篇文章會爲同學們講解卷積部分的操作。"}]},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}},{"type":"paragraph","attrs":{"indent":0,"number":0,"align":null,"origin":null}}]}