Model architecture to train word embeddings (SkipGram, PyTorch)
import torch.nn as nn


class SkipGram(nn.Module):
    def __init__(self, n_vocab, n_embed):
        super().__init__()
        # embedding layer: maps each token id to a dense n_embed-dimensional vector
        self.embed = nn.Embedding(n_vocab, n_embed)
        # output layer: projects the embedding back onto vocabulary scores
        self.output = nn.Linear(n_embed, n_vocab)
        # log-probabilities over the vocabulary for each input word
        self.log_softmax = nn.LogSoftmax(dim=1)

    def forward(self, x):
        # x: batch of center-word token ids
        x = self.embed(x)
        scores = self.output(x)
        log_ps = self.log_softmax(scores)
        return log_ps
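
A minimal usage sketch of the model above. The vocabulary size, embedding dimension, and input batch here are hypothetical, and pairing the log-softmax output with NLLLoss is an assumption about how the model would be trained, not part of the original gist:

import torch
import torch.nn as nn

# hypothetical hyperparameters
n_vocab, n_embed = 5000, 300
model = SkipGram(n_vocab, n_embed)

# fake batch of center-word token ids
inputs = torch.randint(0, n_vocab, (8,))
log_ps = model(inputs)  # shape: (8, n_vocab)

# log-softmax outputs pair naturally with NLLLoss
criterion = nn.NLLLoss()
targets = torch.randint(0, n_vocab, (8,))  # fake context-word ids
loss = criterion(log_ps, targets)
loss.backward()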