
PME
@itsyourcode
savage coder | analytics for the intelligence age @ http://probably.dev
I recently wrote a whole blog post in my head. It was so good I decided not to ruin it by writing or publishing it
The code was written at layers 22-30 and is stored in the value activations; you just can't read it. I think you owe the LLM an apology.
As though we needed more things to blame floating point arithmetic for
Killing someone in America for exercising their right to free speech is a declaration of war on everything the country stands for
I love writing performance critical code because it legitimizes rule breaking
Abstractions leak when they fail to account for the strongest underlying invariants
Reminder that the Top Gun Anthem is the true theme song of America
If you reject the first thing an LLM suggests, congratulations: you are a member of the agentic minority
the masculine urge to write a plot library for the browser purely implemented as WebGPU shaders
Happy Labor Day weekend to all the founders who forgot about this "holiday" again
I was Apple-levels of skeptical that LLMs could be made to perform multi-step data analysis at or above human accuracy. Recently I have been proving myself wrong
Humans tend to converge to maxima with predictable variance. LLMs exhibit unpredictable variance between surprising maxima and painful minima