  • Encoding You

    Who we are is no longer just physical. From our clicks and scrolls to what we ignore, a new kind of identity is being formed: digital, dynamic, and deeply encoded in data. This post explores how AI is learning to capture, represent, and adapt to that identity.

    (more…)
  • Building a Python Code Generator with GPT-2

    In this article, I’ll walk through my recent project of building a neural network that generates Python code from natural language descriptions. By fine-tuning GPT-2 on real-world Python code from the CodeSearchNet dataset, I was able to create a model that translates comments and documentation into functional code snippets; a minimal sketch of the idea follows below.

    (more…)
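    At its core, the pipeline prompts a causal language model with a natural-language description and lets it complete the code. Here is a minimal sketch using Hugging Face’s transformers library; the base gpt2 checkpoint and the comment-style prompt are stand-ins for illustration, not the post’s actual fine-tuned setup:

    ```python
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")  # a CodeSearchNet fine-tuned checkpoint would go here

    # Prompt with a natural-language description; a fine-tuned model
    # would complete it with a Python function body.
    prompt = "# Return the factorial of n\ndef "
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    ```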
  • Understanding Text-to-Image Generation with VQ-VAE and Transformers

    Text-to-image generation has become one of the most exciting areas in AI research, enabling computers to create visual content from textual descriptions. In this blog post, I’ll break down a fascinating implementation that combines Vector Quantized Variational Autoencoders (VQ-VAE) with Transformer models to generate images from text descriptions; the core quantization step is sketched below.

    (more…)
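    The key trick is the vector-quantization step, which snaps each continuous encoder output to its nearest entry in a learned codebook, producing discrete tokens a Transformer can model. Below is a minimal PyTorch sketch of that step; the codebook size, embedding dimension, and tensor shapes are illustrative assumptions, not the post’s actual configuration:

    ```python
    import torch

    # Learned codebook: 512 entries, each a 64-dimensional vector (sizes assumed).
    codebook = torch.nn.Embedding(num_embeddings=512, embedding_dim=64)

    def quantize(z_e):
        # z_e: encoder outputs of shape (batch, num_vectors, 64)
        flat = z_e.reshape(-1, 64)
        # distance from every encoder vector to every codebook entry
        dists = torch.cdist(flat, codebook.weight)
        indices = dists.argmin(dim=1)                # nearest codebook index
        z_q = codebook(indices).reshape(z_e.shape)   # quantized vectors
        # straight-through estimator: copy gradients from z_q back to z_e
        z_q = z_e + (z_q - z_e).detach()
        return z_q, indices.reshape(z_e.shape[:-1])

    z_e = torch.randn(2, 16, 64)   # stand-in encoder output
    z_q, codes = quantize(z_e)
    print(codes.shape)             # discrete image tokens for the Transformer
    ```

    Once images are expressed as these discrete codes, text-to-image generation reduces to sequence modeling: the Transformer learns to predict image tokens conditioned on the text tokens.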
  • Feedforward Networks Explained

    Understanding Deep Feedforward Networks

    In a deep feedforward network, information moves in one direction: from the input x, through intermediate computations, and finally to the output y. This one-way flow is what distinguishes feedforward networks from recurrent architectures, where information cycles back; a minimal sketch follows below.

    (more…)
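    To make the one-way flow concrete, here is a minimal PyTorch sketch of such a network; the layer sizes are arbitrary choices for illustration:

    ```python
    import torch
    import torch.nn as nn

    # A deep feedforward network: input x flows through hidden layers
    # to the output y, with no cycles or feedback connections.
    model = nn.Sequential(
        nn.Linear(784, 128),   # input x -> hidden layer
        nn.ReLU(),             # elementwise nonlinearity
        nn.Linear(128, 10),    # hidden layer -> output y
    )

    x = torch.randn(1, 784)    # a single input vector
    y = model(x)               # one forward pass, input to output
    print(y.shape)             # torch.Size([1, 10])
    ```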