That said, if you want to run an LLM locally, even on a MacBook, Ollama is a great option. It makes it very easy to get an LLM running on your own machine and supports many models, including Mistral and Llama 2. Just run `ollama run llama2` and you're up. ollama.ai

