HarshTrivedi / pad_packed_demo.py (forked from Tushar-N/pad_packed_demo.py)
Minimal tutorial on packing (pack_padded_sequence) and unpacking (pad_packed_sequence) sequences in PyTorch.
import torch
from torch import LongTensor
from torch.nn import Embedding, LSTM
from torch.autograd import Variable  # deprecated since PyTorch 0.4; plain tensors now work directly
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence
## We want to run an LSTM over a batch of 3 character sequences: ['long_str', 'tiny', 'medium']
#
# Step 1: Construct the vocabulary
# Step 2: Load the indexed data (a list of instances, where each instance is a list of character indices)
# (A hedged sketch of these steps follows.)
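
# A minimal sketch of Steps 1 and 2 and the pack/unpack round trip they lead
# into. The variable names (seqs, vectorized_seqs, seq_tensor) and the
# embedding/hidden sizes are illustrative assumptions, not fixed by the text above.

seqs = ['long_str', 'tiny', 'medium']

# Step 1: construct the vocabulary; reserve index 0 for padding
vocab = ['<pad>'] + sorted(set(''.join(seqs)))

# Step 2: index the data -- one list of character indices per sequence
vectorized_seqs = [[vocab.index(ch) for ch in seq] for seq in seqs]

# Pad every sequence to the length of the longest one
seq_lengths = LongTensor([len(s) for s in vectorized_seqs])
seq_tensor = torch.zeros(len(vectorized_seqs), seq_lengths.max().item()).long()
for i, (seq, length) in enumerate(zip(vectorized_seqs, seq_lengths)):
    seq_tensor[i, :length] = LongTensor(seq)

# pack_padded_sequence traditionally expects the batch sorted by length, longest first
seq_lengths, perm_idx = seq_lengths.sort(0, descending=True)
seq_tensor = seq_tensor[perm_idx]

embed = Embedding(len(vocab), 4)                            # embedding_dim=4 is arbitrary
lstm = LSTM(input_size=4, hidden_size=5, batch_first=True)  # hidden_size=5 is arbitrary

# Embed, pack, run the LSTM, then unpack back to a padded tensor
embedded = embed(seq_tensor)
packed = pack_padded_sequence(embedded, seq_lengths, batch_first=True)
packed_output, (h, c) = lstm(packed)
output, output_lengths = pad_packed_sequence(packed_output, batch_first=True)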