Saikat's notes on AI
  • Hello world!
  • 🚀LLM
    • The Evolution of Language Models: From Word2Vec to GPT-4
      • [1] Word2Vec - Efficient Estimation of Word Representations in Vector Space
      • [2] Seq2Seq - Sequence to Sequence Learning with Neural Networks
      • [3] Attention Mechanism - Neural Machine Translation by Jointly Learning to Align and Translate
      • [4] Transformers - Attention Is All You Need
      • [5] GPT - Improving Language Understanding by Generative Pre-Training
      • [6] BERT - Pre-training of Deep Bidirectional Transformers for Language Understanding
      • [7] T5 - Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
      • [8] GPT2 - Language Models are Unsupervised Multitask Learners
  • Best LLM Resources on the internet
  • MPT-7B: A Revolutionary Leap in Language Models
  • From Rules to Vectors: How NLP Changed Over Time

Hello world!

I'm curating a collection of AI papers, techniques, and concepts that I find interesting and want to share with fellow enthusiasts. I'm not an expert, but I'm passionate about the field and always eager to learn more. As I come across noteworthy advancements, I add them to these notes and share my takeaways in a friendly, accessible way. I hope you find them helpful, and that together we can learn more about this fascinating field. Thanks for joining me on this journey!
