
RunAnywhere

@RunAnywhereAI

RunAnywhere: Cut your latency by half and cost to ZERO. Backed by @yoheinakajima

RunAnywhere reposted

@trylogical You can try out the @RunAnywhereAI SDK for running your optimized models on production-ready infra. Mac support in native Swift is in beta.
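For context, a minimal sketch of what on-device generation behind an SDK like this could look like in Swift. The `TextGenerator` protocol, the `LocalModel` type, and the model path below are illustrative assumptions, not the RunAnywhere SDK's actual API.

```swift
import Foundation

// Illustrative sketch only: these names are placeholders, not the real
// RunAnywhere SDK API.
protocol TextGenerator {
    func generate(_ prompt: String) -> String
}

struct LocalModel: TextGenerator {
    let modelPath: String   // e.g. a quantized model file bundled with the app

    func generate(_ prompt: String) -> String {
        // A real on-device runtime would run the model here; this stub only
        // marks where that call sits in app code.
        return "[\(modelPath)] response to: \(prompt)"
    }
}

// Load once at startup; after that, each prompt carries zero marginal cloud cost.
let model: TextGenerator = LocalModel(modelPath: "models/phi-3-mini-q4.gguf")
print(model.generate("Summarize today's meeting notes"))
```

The point of the protocol is that app code calls the same interface whether inference runs locally or in the cloud.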


RunAnywhere reposted

This dynamic will persist until inference moves on-device and the marginal cost of a prompt is zero. Only then will the AI app ecosystem bloom.

My AI investment thesis is that every AI application startup is likely to be crushed by the rapid expansion of the foundational model providers. App functionality will be added to the foundational models' offerings, because the big players aren't slow incumbents (it is wrong to…



RunAnywhere reposted

ollama for mobile, picking up attention



RunAnywhere reposted

@RunAnywhereAI makes it easy for you to deploy at scale, whether your user has a $50 Oppo or a $1,200 iPhone 17 Pro Max.


RunAnywhere reposted

On-device isn’t hypothetical. It’s shipping. 100k+ tokens processed locally across 550 devices with @RunAnywhereAI


RunAnywhere reposted

RUN ANYWHERE! Let's goo @sanchitmonga22

ollama for mobile, picking up attention



RunAnywhere reposted

if you're already starting to fine-tune open source models for smaller tasks, using something like the @runanywhereai sdk can let you offload some of the inference to user devices, lowering cloud inference costs and latency, but there are probably more fun/unique applications i'm…

ollama for mobile, picking up attention

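To make the offloading idea in the repost above concrete, here is a minimal sketch of local-first routing with a cloud fallback. `InferenceRouter`, `InferenceTarget`, and the 2048-token threshold are hypothetical, not part of the actual SDK.

```swift
import Foundation

// Illustrative only: a local-first router with a cloud fallback. The types,
// names, and the token threshold are assumptions, not the RunAnywhere SDK's
// real API.
enum InferenceTarget { case device, cloud }

struct InferenceRouter {
    let maxLocalPromptTokens: Int   // assumed context budget for the on-device model

    func route(promptTokens: Int, deviceModelLoaded: Bool) -> InferenceTarget {
        // Serve small prompts on the device when the model is resident;
        // every request handled here avoids cloud inference cost and a network round trip.
        if deviceModelLoaded && promptTokens <= maxLocalPromptTokens {
            return .device
        }
        return .cloud
    }
}

let router = InferenceRouter(maxLocalPromptTokens: 2048)
print(router.route(promptTokens: 512, deviceModelLoaded: true))    // device
print(router.route(promptTokens: 8_000, deviceModelLoaded: true))  // cloud
```

Every request the router keeps on the device avoids both the cloud inference bill and a network round trip, which is the cost and latency win the repost describes.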


RunAnywhere reposted

ollama for mobile, picking up attention


RunAnywhere reposted

bro just install the @RunAnywhereAI SDK, you’ll get a locally-run relationship


RunAnywhere reposted

if your AI girlfriend is not a LOCALLY running fine-tuned model, she’s a prostitute.


RunAnywhere reposted

Accelerated computing running on the edge will enable this


RunAnywhere reposted

Don’t need to be accepted into @a16z @speedrun to demo what you’re building


RunAnywhere reposted

First world problems

