#aiproductioncoding search results
Can't wait for the Gemini Pro 3 API. gpt-5 / gpt-5-codex are so slow, barely usable. #genai #AIProductionCoding @AIPackAI
Actually, #OpenAI o3 is pretty good at coding. #Gemini 2.5 Pro is good too. #AIProductionCoding @aipackai #AIPACKProCoder
Planning to rename the `jc@coder` AI Pack to `code10x@coder`. More packs will be under the `code10x@` namespace:
- `code10x@rust` for Rust best practices
- `code10x@kdd` for Kubernetes development
- `code10x@dom-native` for native web components
#AIProductionCoding @AiapackAI
#AIProductionCoding Gemini Flash 2.5 is really good at code documentation, so it's a great model to refresh, update, and consolidate docs. And it's virtually free. A perfect coding partner for Gemini Pro 2.5. @GoogleAI @OfficialLoganK
Our take for #ProductionCoding: 1) API (pay per usage). 2) Provider unlocked - use tools that are model-provider agnostic. Try #AIPack with pro@coder. #AIProductionCoding A little more work, but so much more control.
Interesting @AnthropicAI haiku-4.5 price increase. Haiku is now at 2x to 3x the price of gemini-flash-latest. Will compare the two soon... @GoogleAI #genai #AIProductionCoding
One single TypeScript spec file for #MCP. Thank you @AnthropicAI for this! A single spec file should be the norm for AI production coding. We are starting to do that for our internal and external libs, and then use those single files as knowledge. #genai #AIProductionCoding
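The "single spec file as knowledge" idea from the post above can be sketched roughly like this: keep each library's entire public surface in one file, then splice those files verbatim into the model's context. This is a minimal sketch under assumed names (`SpecFile`, `buildKnowledge`, and the sample spec content are all illustrative, not part of AIPack or the MCP spec).

```typescript
// Sketch: one self-contained spec file per library as the canonical
// "knowledge" unit, concatenated into a prompt preamble.

// A single spec entry: the whole public surface of one lib, as text.
interface SpecFile {
  lib: string;      // library name, e.g. "dom-native" (illustrative)
  version: string;  // spec version, useful for cache-busting
  source: string;   // the full TypeScript spec, verbatim
}

// Build a knowledge preamble by concatenating the spec files, so the
// model sees each library's complete API in one contiguous block.
function buildKnowledge(specs: SpecFile[]): string {
  return specs
    .map((s) => `// ==== ${s.lib}@${s.version} spec ====\n${s.source}`)
    .join("\n\n");
}

// Hypothetical sample spec content, just to show the shape.
const knowledge = buildKnowledge([
  {
    lib: "mcp",
    version: "2025-03-26",
    source: "export interface Tool { name: string; }",
  },
]);
```

Because each spec is a single file, updating the model's knowledge of a library is a one-file swap rather than a re-crawl of scattered docs.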
#AIProductionCoding Important to be able to switch between Claude / Gemini 2.5 Pro. Sometimes 2.5 Pro goes berserk on requirements. #genai #AICoding @AipackAI
`jc@coder` v0.1.4 AI Pack is here!
- Claude/@AnthropicAI caching
- `working_globs` for AI lensing and concurrency
- "Show doc" prompt
- Plus a bunch of other improvements
BTW, `jc@coder` now generates about 15-30% of @AipackAI code. news.aipack.ai/p/quick-demo-c… #AiProductionCoding
Interestingly, when using pro@coder, we switch back and forth between planning and non-planning, and even interlace this with spec-based approaches. ➜ parametric prompts let us toggle these modes on and off depending on intent. #AIPack #ProCoder #AIProductionCoding
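A parametric prompt of the kind the post above describes might look like the sketch below. The flag names and `renderPrompt` helper are hypothetical (not AIPack's actual API); the point is only that planning and spec-grounding become per-run toggles instead of separate prompt files.

```typescript
// Hedged sketch: per-run flags that toggle planning / spec modes.
interface PromptParams {
  planning: boolean;   // emit a plan step before implementing
  specBased: boolean;  // ground the work in a spec file
  specPath?: string;   // which spec to treat as source of truth
}

function renderPrompt(task: string, p: PromptParams): string {
  const parts: string[] = [];
  if (p.planning) {
    parts.push("First produce a short numbered plan, wait for approval, then implement.");
  }
  if (p.specBased && p.specPath) {
    parts.push(`Treat ${p.specPath} as the source of truth for all APIs.`);
  }
  parts.push(`Task: ${task}`);
  return parts.join("\n");
}

// Toggle by intent: quick fix -> no planning; feature work -> both on.
const quickFix = renderPrompt("Fix the null check in parse()", {
  planning: false,
  specBased: false,
});
const feature = renderPrompt("Add streaming support", {
  planning: true,
  specBased: true,
  specPath: "specs/mcp.ts", // illustrative path
});
```

Keeping the modes as parameters of one prompt template is what makes interlacing cheap: the same task text can be re-run in planning mode when the first non-planning attempt drifts.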