1. The example given by PyTorch
https://github.com/pytorch/examples/blob/master/vae/main.py
The implementation is very simple:
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self):
        super(VAE, self).__init__()
        self.fc1 = nn.Linear(784, 400)   # first layer: inference (encoder)
        self.fc21 = nn.Linear(400, 20)   # outputs the mean
        self.fc22 = nn.Linear(400, 20)   # outputs the log-variance
        self.fc3 = nn.Linear(20, 400)    # generative layer 1
        self.fc4 = nn.Linear(400, 784)   # generative layer 2

    def encode(self, x):
        h1 = F.relu(self.fc1(x))
        return self.fc21(h1), self.fc22(h1)

    def reparameterize(self, mu, logvar):
        std = torch.exp(0.5 * logvar)
        eps = torch.randn_like(std)
        return mu + eps * std

    def decode(self, z):
        h3 = F.relu(self.fc3(z))
        return torch.sigmoid(self.fc4(h3))  # why sigmoid rather than something else deserves some thought

    def forward(self, x):
        mu, logvar = self.encode(x.view(-1, 784))
        z = self.reparameterize(mu, logvar)  # reparameterize using the mean and log-variance
        return self.decode(z), mu, logvar
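As a quick sanity check on the reparameterization trick used above (a sketch of my own, not part of the PyTorch example; the target mean and variance below are arbitrary assumptions): drawing eps ~ N(0, 1) and returning mu + eps * std produces samples distributed as N(mu, std^2), while keeping mu and std as plain differentiable inputs rather than parameters of a random sampler.

```python
import numpy as np

# Hypothetical target distribution N(mu=1.0, var=0.25), so logvar = ln(0.25)
rng = np.random.default_rng(0)
mu, logvar = 1.0, np.log(0.25)

std = np.exp(0.5 * logvar)          # std = 0.5, same formula as in reparameterize()
eps = rng.standard_normal(100_000)  # eps ~ N(0, 1): the only source of randomness
z = mu + eps * std                  # deterministic transform of (mu, std, eps)

# sample mean ≈ 1.0, sample variance ≈ 0.25
print(round(z.mean(), 2), round(z.var(), 2))
```

Because the randomness is isolated in eps, gradients can flow through mu and std during backpropagation, which is the whole point of the trick.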
So I don't understand: what is the point of the VAE implementations given at https://github.com/wiseodd/generative-models? They are also quite hard to read.
2. Variable is deprecated in torch
https://pytorch.org/docs/stable/autograd.html
(1) Deprecated, but still works: Variable(tensor, requires_grad) now returns a Tensor object, not a Variable object.
(2) var.data is the same thing as tensor.data.
(3) Methods such as var.backward(), var.detach(), and var.register_hook() have been moved onto Tensors.
(4) A tensor that tracks gradients can be created like this: autograd_tensor = torch.randn((2, 3, 4), requires_grad=True)
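A minimal sketch tying points (2)–(4) together (assumes torch is installed; the shape is the one from the docs snippet above): gradients now live directly on Tensors, so backward() and detach() are called on the tensor itself with no Variable wrapper.

```python
import torch

# (4) a gradient-tracking tensor, no Variable wrapper needed
autograd_tensor = torch.randn((2, 3, 4), requires_grad=True)

# (3) backward() is a Tensor method now
y = (autograd_tensor ** 2).sum()
y.backward()

# d(sum(x^2))/dx = 2x, stored in the tensor's .grad attribute
print(torch.allclose(autograd_tensor.grad, 2 * autograd_tensor))  # True

# (3) detach() is also a Tensor method; the result no longer tracks gradients
detached = autograd_tensor.detach()
print(detached.requires_grad)  # False
```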
3.