
Non_Interactive – Software & ML


Author: jbetker

Batch Normalization is a Hack

Posted on July 19, 2020 (updated January 1, 2021) by jbetker

Batch normalization has a simple goal: stabilize the gradients of large computational graphs. In doing so, this technique has enabled the deep learning renaissance that almost every major ML breakthrough in the last 5 years has relied on. The concept is sound: by normalizing the mean and variance of the inputs of nearly every layer…
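
The full post expands on this; as a rough illustration only (not code from the post), here is a minimal sketch of the per-feature normalization a batch-norm layer applies to a batch of activations. The function name, shapes, and parameters are illustrative assumptions.

```python
import torch

# Illustrative sketch (not from the post): the core normalization batch norm
# applies to a batch of activations x with shape (batch, features).
def batch_norm(x, gamma, beta, eps=1e-5):
    mean = x.mean(dim=0, keepdim=True)                  # per-feature mean over the batch
    var = x.var(dim=0, unbiased=False, keepdim=True)    # per-feature variance over the batch
    x_hat = (x - mean) / torch.sqrt(var + eps)          # zero-mean, unit-variance activations
    return gamma * x_hat + beta                         # learned scale and shift

x = torch.randn(32, 64)      # 32 examples, 64 features
gamma = torch.ones(64)
beta = torch.zeros(64)
y = batch_norm(x, gamma, beta)
```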


Diving into Super Resolution

Posted on July 12, 2020 (updated January 1, 2021) by jbetker

After finishing my last project, I wanted to understand generative networks a bit better. In particular, GANs interest me because there doesn’t seem to be much research on them in the language modeling space. To build up my GAN chops, I decided to try to figure out image repair and super-resolution. My reasoning…


Fine-tuning XLNet For Generation Tasks

Posted on March 27, 2020 (updated January 1, 2021) by jbetker

About a month ago, I decided to take the plunge into learning how to fine-tune a language generation model. One use case of language generation that I found particularly compelling was abstractive document summarization. A lot of the papers currently available that deal with abstractive summarization and transformers work by truncating the input text to…


Learning to Learn: My Second Foray Into Machine Learning

Posted on March 21, 2020 (updated January 1, 2021) by jbetker

My desire to understand how the mind works started when I was choosing what I wanted to do in college, in 2000. Back then I was a nerdy kid who was pretty good with computers, but who had developed an insatiable interest in figuring out how the mind ticked. Not knowing a whole lot about…
