Prequential Approach to Compression

EE376A (Winter 2019)

Author
Jay Whang

Abstract
Recent advances in deep learning have led to many powerful generative models. Among them, autoregressive models have shown much success, achieving state-of-the-art likelihoods on several benchmark datasets as well as generating high-quality samples. A particularly interesting application of such models is compression, obtained by combining them with arithmetic coding, an entropy coder that relies on the autoregressive conditionals $latex p(x_{t+1}|\mathbf{x}^t)$ for compression. In this work, we present preliminary results on the compression rate of this scheme in the prequential setting, where the autoregressive predictor used by the arithmetic coder is trained during compression and decompression.
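To make the prequential coding cost concrete, here is a minimal sketch (my own illustration, not code from the report) that computes the ideal codelength $latex -\sum_t \log_2 p(x_t|\mathbf{x}^{t-1})$ achieved by an arithmetic coder whose predictor is updated online. A simple Laplace-smoothed count model stands in for the neural autoregressive model; since the decoder performs the same updates after decoding each symbol, no model parameters need to be transmitted.

```python
import math

def prequential_codelength(seq, alphabet_size=2):
    """Total ideal codelength (in bits) of coding `seq` with an adaptive
    predictor updated online. A real arithmetic coder achieves within
    about 2 bits of this total."""
    counts = [1] * alphabet_size  # Laplace smoothing: start each count at 1
    total_bits = 0.0
    for x in seq:
        p = counts[x] / sum(counts)   # predictive probability p(x_t | x^{t-1})
        total_bits += -math.log2(p)   # ideal codelength for this symbol
        counts[x] += 1                # update the model only after coding (prequential)
    return total_bits
```

For example, the all-zeros sequence of length 3 costs $latex \log_2 4 = 2$ bits under this predictor, since the successive predictive probabilities are 1/2, 2/3, and 3/4; a stronger (e.g. neural) predictor would assign higher conditional probabilities and hence shorter codelengths.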

Report can be found at this link

Outreach Activity
Given the technical nature of my project, choosing what to present as part of my outreach effort at Nixon Elementary School was not an easy task. My primary goal was to bring something elementary school students would find exciting and engaging. This ultimately led me to give up on the idea of doing something related to compression; I simply could not find an interesting way to present it.

Instead, based on my personal experience of learning about generative models for the first time, I decided to show what kind of cool things generative models can create. The first example was easy to choose, since I already had a language model trained on novels as part of my project work. Rather than only showing text (or even worse, fitting some 2D Gaussian and showing scatter plots of samples), however, I went on to add images as well — of handwriting (MNIST) and Pikachu.

One important aspect I wanted to display was the ability to use the latent space to interpolate between samples. This ultimately led me to use a GAN, so I could easily map noise to samples. This turned out to be a success, as many students and their parents were intrigued by the animation of smoothly-changing samples I generated. Overall, I’m glad I was able to showcase latent-variable models to the local community, and I hope this will lead to more interested future students.

My booth at the outreach event, showcasing samples from various modalities.
The event was packed with enthusiastic families.
