Pretraining a Language Model with PyTorch
Aug 29, 2025 • 16 min read
Literally coding every function from scratch.
A dive into OpenAI's first open-weight model in five years.
A practical workflow for adapting GPT-2 to emulate a specific influencer's tone: continue pretraining on long-form transcripts, align with supervised fine-tuning on crafted Q&A pairs, then polish with Direct Preference Optimization (DPO) to balance authenticity and safety.
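The final DPO step optimizes the policy to prefer chosen over rejected completions while staying close to a frozen reference model. A minimal sketch of the DPO objective in PyTorch, assuming the per-sequence log-probabilities under the policy and reference models are already computed (function and argument names here are illustrative, not from the article):

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps: torch.Tensor,
             policy_rejected_logps: torch.Tensor,
             ref_chosen_logps: torch.Tensor,
             ref_rejected_logps: torch.Tensor,
             beta: float = 0.1) -> torch.Tensor:
    """DPO loss (Rafailov et al., 2023) over a batch of preference pairs.

    Each argument is a 1-D tensor of summed token log-probabilities for
    one completion per example; beta controls how strongly the policy
    is pulled away from the reference model.
    """
    # Log-ratio of chosen vs. rejected under the policy and the reference.
    pi_logratios = policy_chosen_logps - policy_rejected_logps
    ref_logratios = ref_chosen_logps - ref_rejected_logps
    # Implicit reward margin; push it positive via the logistic loss.
    logits = beta * (pi_logratios - ref_logratios)
    return -F.logsigmoid(logits).mean()
```

When the policy matches the reference exactly, the margin is zero and the loss is log 2; it decreases as the policy assigns relatively more probability to the chosen completions.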