#aiproductioncoding search results

Can't wait for Gemini Pro 3 API gpt-5 / gpt-5-codex are so slow, barely usable. #genai #AIProductionCoding @AIPackAI


Man, this gpt-5.1-codex is good and fast, and relatively cheap because it is very efficient with reasoning tokens (for Rust coding and more). #rustlang @OpenAI #AIProductionCoding #AIPack


The passion and dedication are commendable. Thank you and the team. So far Gemini 3 for AI Production Coding has been amazing. #rustlang #AIProductionCoding


So far: ➜ `gpt-5.1` Surprisingly fast and cheap for simple tasks ➜ `gpt-5.1-codex` Seems good and faster than 5-codex ➜ `gpt-5.1-codex-mini` Seems pretty good, even cheaper, and faster than `gemini-flash-latest` so far. @OpenAI @GoogleAI #GoogleGemini #AIProductionCoding


Ok, gpt-5.1 and gpt-5.1-codex seem to be pretty fast. #rustlang #AIProductionCoding


Actually, #OpenAI o3 is pretty good at coding. #Gemini 2.5 Pro is good too. #AIProductionCoding @aipackai #AIPACKProCoder


#AIProductionCoding Gemini Flash 2.5 is really good at code documentation. So, it's a great model to refresh, update, consolidate docs. And it's virtually free. A perfect coding partner with Gemini Pro 2.5. @GoogleAI @OfficialLoganK


Planning to rename the `jc@coder` AI Pack to `code10x@coder`. More packs will be under the `code10x@` namespace: - `code10x@rust` for Rust best practices - `code10x@kdd` for Kubernetes development - `code10x@dom-native` for native web components. #AIProductionCoding @AipackAI


Our take for #ProductionCoding is to go: 1) API (per usage) 2) Provider unlocked - use tools that are model-provider agnostic. Try #AIPack with pro@coder. #AIProductionCoding A little more work, but so much more control.


#AIProductionCoding Important to be able to switch between Claude / Gemini 2.5 Pro. Sometimes, 2.5 Pro goes berserk on requirements. #genai #AICoding @AipackAI


One single TypeScript spec file for #MCP. Thank you @AnthropicAI for this! A single spec file should be the norm for AI Production Coding. We are starting to do that for our internal and external libs, and then use those single files as knowledge. #genai #AIProductionCoding
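To illustrate the single-spec-file idea (this is not the actual MCP schema; every name below is hypothetical), such a file collects a protocol's whole public surface as plain TypeScript types in one place, so it can be pasted into a model's context as compact knowledge:

```typescript
// Hypothetical sketch of a "single spec file": all public types and
// helpers for a small JSON-RPC-style protocol in one file, so the
// whole file can serve as knowledge for a coding model.
// These names are illustrative only, not the real MCP spec.

export interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number | string;
  method: string;
  params?: Record<string, unknown>;
}

export interface JsonRpcResponse {
  jsonrpc: "2.0";
  id: number | string;
  result?: unknown;
  error?: { code: number; message: string };
}

// A small constructor so the spec file doubles as a usable module.
export function makeRequest(
  id: number | string,
  method: string,
  params?: Record<string, unknown>,
): JsonRpcRequest {
  return { jsonrpc: "2.0", id, method, ...(params ? { params } : {}) };
}
```

The point of the single file is that it fits in one prompt: no cross-file imports for the model to chase.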


`jc@coder` v0.1.4 AI PACK is here! - Claude/@AnthropicAI caching - Working_globs for AI lensing and concurrency - "Show doc" prompt - Plus a bunch of other improvements BTW, `jc@coder` now generates about 15-30% of @AipackAI code. news.aipack.ai/p/quick-demo-c… #AiProductionCoding


Interesting @AnthropicAI haiku-4.5 price increase. Now haiku is at 2x to 3x the price of gemini-flash-latest. Will compare the two soon... @GoogleAI #genai #AIProductionCoding
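A quick way to sanity-check a "2x to 3x" claim like this is a per-million-token cost helper. The prices below are placeholder numbers for illustration only, not actual @AnthropicAI or @GoogleAI pricing; check the providers' pricing pages for real figures:

```typescript
// Cost of one request given token counts and per-million-token prices.
function costUSD(
  inputTokens: number,
  outputTokens: number,
  inputPerMTok: number,
  outputPerMTok: number,
): number {
  return (inputTokens / 1e6) * inputPerMTok + (outputTokens / 1e6) * outputPerMTok;
}

// Ratio between two models for the same workload.
function priceRatio(a: number, b: number): number {
  return a / b;
}

// Example workload: 200k input + 20k output tokens, placeholder prices.
const haikuLike = costUSD(200_000, 20_000, 1.0, 5.0);
const flashLike = costUSD(200_000, 20_000, 0.3, 2.5);
console.log(priceRatio(haikuLike, flashLike).toFixed(2)); // prints 2.73 with these placeholder numbers
```

Note that the effective ratio depends on the input/output mix of your workload, not just the headline per-token prices.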


Interestingly, when using pro@coder we switch back and forth between planning and non-planning. We even interlace this with spec-based approaches. ➜ parametric prompts that allow us to toggle these modes on and off depending on the intent #AIPack #ProCoder #AIProductionCoding
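The actual pro@coder parametric-prompt syntax isn't shown in the post; as a rough sketch of the idea (all names below are hypothetical), toggling planning and spec-based modes can be modeled as parameters on a prompt builder:

```typescript
// Hypothetical sketch: one prompt template with mode toggles, so the
// same task can run plan-first, direct, or spec-driven on demand.
interface PromptMode {
  planning: boolean;   // plan-first vs. straight implementation
  specFile?: string;   // optional spec file used as source of truth
}

function buildPrompt(task: string, mode: PromptMode): string {
  const parts: string[] = [];
  if (mode.planning) {
    parts.push("First produce a step-by-step plan and wait for approval.");
  } else {
    parts.push("Implement the change directly; no plan needed.");
  }
  if (mode.specFile) {
    parts.push(`Treat ${mode.specFile} as the source of truth for the API.`);
  }
  parts.push(`Task: ${task}`);
  return parts.join("\n");
}
```

Keeping the toggles as data rather than separate prompt files is what makes interlacing the modes cheap: the intent picks the parameters, not a different template.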



