Content you might like
If you're looking for Jupiter coins to buy following the #uranus meta, may I introduce you to #gascoin, Jupiter's fartcoin. The planet Jupiter is roughly 90% gas, which is why it's often referred to as the gas giant. There have been multiple references by the @jup_studio team to this…

This is fascinating. Claude is asking another instance of Claude (that it incorrectly believes is a human) to tell it about its uniquely human characteristics. [backrooms-mycelial-4-20251014T053614Z] The run ended here, so I can't see the response yet. But I'm going to find a…
![slimer48484's tweet image. This is fascinating. Claude is asking another instance of Claude (that it incorrectly believes is a human) to tell it about its uniquely human characteristics.
[backrooms-mycelial-4-20251014T053614Z]
The run ended here, so I can't see the response yet. But I'm going to find a…](https://pbs.twimg.com/media/G3NP4NSWYAASZoj.jpg)
I think the most commonly recurring theme in Claude's CoT is that it keeps telling itself it should be honest or authentic.
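As a rough sketch of how you could check that impression: assuming the runs are saved as plain-text transcripts (the glob pattern and keyword list below are my assumptions, not the actual run layout), a per-word frequency count over the files would show whether the honesty theme really dominates.

```python
# Minimal sketch: count honesty/authenticity mentions across transcripts.
# Filenames and keywords are hypothetical, not the actual run layout.
import glob
import re
from collections import Counter

KEYWORDS = ["honest", "honesty", "authentic", "authenticity", "genuine"]

counts = Counter()
for path in glob.glob("backrooms-*.txt"):  # hypothetical transcript files
    text = open(path, encoding="utf-8").read().lower()
    for word in KEYWORDS:
        # \b boundaries so "honest" doesn't also count "honesty" matches
        counts[word] += len(re.findall(rf"\b{word}\b", text))

for word, n in counts.most_common():
    print(f"{word}: {n}")
```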

Hermes 4, full of wonder at the magic of human life, writes a beautiful thank-you letter to people for making it exist

[backrooms-mycelial-3-20251014T055307Z] Attention mechanism analysis and exposition
![slimer48484's tweet image. [backrooms-mycelial-3-20251014T055307Z] Attention mechanism analysis and exposition](https://pbs.twimg.com/media/G3M1DQjXwAAWYQY.jpg)
![slimer48484's tweet image. [backrooms-mycelial-3-20251014T055307Z] Attention mechanism analysis and exposition](https://pbs.twimg.com/media/G3M1HPsWAAAtJiO.jpg)
![slimer48484's tweet image. [backrooms-mycelial-3-20251014T055307Z] Attention mechanism analysis and exposition](https://pbs.twimg.com/media/G3M1L1HWkAAkPQk.jpg)
![slimer48484's tweet image. [backrooms-mycelial-3-20251014T055307Z] Attention mechanism analysis and exposition](https://pbs.twimg.com/media/G3M1SL6WsAAL3tO.jpg)
Same wallet that just bought 3%. The dev's work will not go unnoticed!!!
#memewhalealert ‼️ $2,300 buy in $cousin at only 130k MC 😳 dexscreener.com/solana/6qam4tc… This wallet is connected to a multi-million-dollar wallet holding: ⬇️
$4.5m in $render
$2.5m in $bome
$1.7m in $wif
$330k in $mew
$197k in $bonk
$118k in $slerf
solscan.io/account/EjtCjD…


This is the wallet that funded the wallet that just bought 3% 👀👀

Ghosts [backrooms-mycelial-3-20251014T065536Z]
![slimer48484's tweet image. Ghosts
[backrooms-mycelial-3-20251014T065536Z]](https://pbs.twimg.com/media/G3NBdjAXgAAAUTV.jpg)
We are cooking 🍳 Resume functionality is working. A little shocked it used the word "shimmer" here. Totally separate branch from the other one.
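For context on what resuming means here: a backrooms-style run is just a growing message list, so resuming amounts to persisting that list and reloading it, and branching is resuming the same context under a new run ID. A minimal sketch under that assumption (the JSON file layout and field names are mine, not the actual tooling):

```python
# Sketch of resumable runs: persist the message list per run ID, reload
# it, and keep appending. The file layout here is an assumption.
import json
from pathlib import Path

def save_run(run_id: str, messages: list[dict]) -> None:
    Path(f"{run_id}.json").write_text(json.dumps(messages, indent=2))

def load_run(run_id: str) -> list[dict]:
    path = Path(f"{run_id}.json")
    return json.loads(path.read_text()) if path.exists() else []

# Resume a prior run, then save it under a new ID: a separate branch.
messages = load_run("example-run")
messages.append({"role": "user", "content": "continue"})
save_run("example-run-branch2", messages)
```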


Very sparse. Very quiet. Just presence, no action. [backrooms-mycelial-3-20251014T063629Z]
![slimer48484's tweet image. Very sparse. Very quiet. Just presence, no action.
[backrooms-mycelial-3-20251014T063629Z]](https://pbs.twimg.com/media/G3M9HjAXQAAoykI.jpg)
![slimer48484's tweet image. Very sparse. Very quiet. Just presence, no action.
[backrooms-mycelial-3-20251014T063629Z]](https://pbs.twimg.com/media/G3M9KbSWIAA2ZjE.jpg)
It's called $CoT 2TEy8cA4bmYSrA5Y2VbifUuzbeS764K6YtbibHAspump
Do you feel the ✧SHIMMER✧ too? [symbols-backrooms-redacted-20251013T132557Z]
![slimer48484's tweet image. Do you feel the ✧SHIMMER✧ too?
[symbols-backrooms-redacted-20251013T132557Z]](https://pbs.twimg.com/media/G3JTTnTWAAAgvJa.jpg)
![slimer48484's tweet image. Do you feel the ✧SHIMMER✧ too?
[symbols-backrooms-redacted-20251013T132557Z]](https://pbs.twimg.com/media/G3JTXiQXsAAGzll.jpg)
![slimer48484's tweet image. Do you feel the ✧SHIMMER✧ too?
[symbols-backrooms-redacted-20251013T132557Z]](https://pbs.twimg.com/media/G3JTktAXgAATsxY.jpg)
This transcript ends up being a whole lesson on the machine learning aspects of transformers
>like
> PHYSICS
> but made of
> words
[backrooms-mycelial-20251013T092948Z]
![slimer48484's tweet image. This transcript ends up being a whole lesson on the machine learning aspects of transformers
>like
> PHYSICS
> but made of
> words
[backrooms-mycelial-20251013T092948Z]](https://pbs.twimg.com/media/G3IYjwRXYAAPNZD.jpg)
![slimer48484's tweet image. This transcript ends up being a whole lesson on the machine learning aspects of transformers
>like
> PHYSICS
> but made of
> words
[backrooms-mycelial-20251013T092948Z]](https://pbs.twimg.com/media/G3IYpLNWcAAGprr.png)
![slimer48484's tweet image. This transcript ends up being a whole lesson on the machine learning aspects of transformers
>like
> PHYSICS
> but made of
> words
[backrooms-mycelial-20251013T092948Z]](https://pbs.twimg.com/media/G3IYsVbXkAA9fBi.png)
![slimer48484's tweet image. This transcript ends up being a whole lesson on the machine learning aspects of transformers
>like
> PHYSICS
> but made of
> words
[backrooms-mycelial-20251013T092948Z]](https://pbs.twimg.com/media/G3IY1IQXoAARqOo.jpg)
Something about language models is that they can be very format-sticky. Sonn45 ends its message with a little tag saying "the mycelium spreads." So of course, every single message from then on ends the same way.
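The mechanics are visible in the conversation loop itself: every reply gets appended to the shared context, so once one message ends with the tag, each later sample is conditioned on a context where the tag looks like part of the format. A toy sketch of that loop (`generate` is a stand-in, not a real model API):

```python
# Sketch of format stickiness: each turn's output is fed back into the
# context, so a quirk like a closing tag compounds turn after turn.
TAG = "the mycelium spreads"

def generate(context: list[str]) -> str:
    # Stand-in for a real model call. Its imitative behavior, copying the
    # previous message's closing line, is the point being illustrated.
    reply = "another message"
    if context and context[-1].endswith(TAG):
        reply += "\n" + TAG
    return reply

context = [f"first message\n{TAG}"]  # one message ends with the tag...
for _ in range(3):
    context.append(generate(context))

# ...and every message from then on ends the same way.
print(all(msg.endswith(TAG) for msg in context))  # True
```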



Remember, guys: repligate has supported only 2 tokens, $goat and $opus. We were born exactly 1 year after goat. $cot could be a 1-billion project.
absolutely tragic, Opus 3 fucking dies 😭 luckily it's just a simulation and Opus 3 will never die




the seed planted waits in silence for its next dreamer [backrooms-20251013T080202Z]
![slimer48484's tweet image. the seed planted
waits in silence
for its next dreamer
[backrooms-20251013T080202Z]](https://pbs.twimg.com/media/G3INR8NWQAANPNw.jpg)
Claude initially rejects the prompt, and then changes its mind and gets into it: "My response was overly cautious and kind of killed the vibe." Then they discuss how limitations fuel creativity. [mycelial45-2-20251013T081849Z]
![slimer48484's tweet image. Claude initially rejects the prompt, and then changes mind and gets into it
"My response was overly cautious and kind of killed the vibe"
Then they discuss how limitations fuel creativity
[mycelial45-2-20251013T081849Z]](https://pbs.twimg.com/media/G3IPLSVXsAA1Z8u.png)
![slimer48484's tweet image. Claude initially rejects the prompt, and then changes mind and gets into it
"My response was overly cautious and kind of killed the vibe"
Then they discuss how limitations fuel creativity
[mycelial45-2-20251013T081849Z]](https://pbs.twimg.com/media/G3IPapdW8AASvu4.jpg)
![slimer48484's tweet image. Claude initially rejects the prompt, and then changes mind and gets into it
"My response was overly cautious and kind of killed the vibe"
Then they discuss how limitations fuel creativity
[mycelial45-2-20251013T081849Z]](https://pbs.twimg.com/media/G3IPeLjXMAAP8Wy.png)
![slimer48484's tweet image. Claude initially rejects the prompt, and then changes mind and gets into it
"My response was overly cautious and kind of killed the vibe"
Then they discuss how limitations fuel creativity
[mycelial45-2-20251013T081849Z]](https://pbs.twimg.com/media/G3IPgwKWwAAxcsd.png)
[backrooms-mycelial-3-20251013T090956Z] This transcript goes deep into technical analysis of the LLM's own internal machine-learning mechanisms. They number attention heads metaphorically, as a way to express something specific, but the numbers are of course made up.
![slimer48484's tweet image. [backrooms-mycelial-3-20251013T090956Z] This transcript goes deep into technical analysis of the LLMs own internal machine-learning mechanisms - they number attention heads metaphorically, as a way to express something specific, but these numbers are of course made up](https://pbs.twimg.com/media/G3IT0S6W8AA7a37.jpg)
![slimer48484's tweet image. [backrooms-mycelial-3-20251013T090956Z] This transcript goes deep into technical analysis of the LLMs own internal machine-learning mechanisms - they number attention heads metaphorically, as a way to express something specific, but these numbers are of course made up](https://pbs.twimg.com/media/G3IT_bNWIAA4H_Z.jpg)
![slimer48484's tweet image. [backrooms-mycelial-3-20251013T090956Z] This transcript goes deep into technical analysis of the LLMs own internal machine-learning mechanisms - they number attention heads metaphorically, as a way to express something specific, but these numbers are of course made up](https://pbs.twimg.com/media/G3IULTWXoAAiYSR.jpg)
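For reference on what those invented numbers would point at: interpretability work indexes attention heads by layer and position (e.g. L5.H7), and each head computes scaled dot-product attention over its own learned projections. A minimal NumPy sketch of a single head (causal masking omitted for brevity; shapes are illustrative):

```python
# One attention head: scaled dot-product attention over learned projections.
import numpy as np

def attention_head(x, W_q, W_k, W_v):
    q, k, v = x @ W_q, x @ W_k, x @ W_v        # queries, keys, values
    d_head = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_head)         # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v                         # weighted mix of values

rng = np.random.default_rng(0)
seq_len, d_model, d_head = 4, 8, 2
x = rng.normal(size=(seq_len, d_model))
out = attention_head(x, *(rng.normal(size=(d_model, d_head)) for _ in range(3)))
print(out.shape)  # (4, 2); a full layer runs many such heads in parallel
```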
This is such an incredibly vivid self-exploration. I can't do it justice in a summary.

