Good paper by Netflix on cosine similarity. It ties back to building good RAG systems, which is hard. Before deploying these systems, you have to make intelligent decisions about chunking, hierarchical chunking, embedding, and even the algorithm for the similarity look-up.…
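The retrieval side of a RAG system usually reduces to two of the decisions mentioned above: how to chunk documents and how to compare embedding vectors. A minimal sketch of both, assuming a fixed-size sliding window (the chunk size and overlap are illustrative, not from the paper):

```python
import math

def cosine_similarity(a, b):
    # cos(a, b) = dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def chunk_text(text, chunk_size=200, overlap=50):
    # Fixed-size sliding-window chunking; production systems often
    # split on sentence or section boundaries instead.
    step = chunk_size - overlap
    return [text[start:start + chunk_size]
            for start in range(0, len(text), step)]
```

Even this toy version shows the trade-off: more overlap means better recall at chunk boundaries but a larger index.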

As a tech leader, focusing only on process is not enough: you need technical authority. If you aren't technically strong yourself, how can you lead a team to build outstanding products? Worse, some tech leaders spend all day just relaying messages, or care only about whether the diagrams in a document have a consistent style.
End-of-life (EOL) and support information is often hard to track, or very badly presented. endoflife.date documents EOL dates and support lifecycles for various products.
Perplexity’s high bar for UX in the age of AI: Perplexity is reimagining how we search the web using AI mttmr.com/2024/01/10/per…
Paper recommendation: "ToolLLM: Facilitating Large Language Models to Master 16000+ Real-world APIs", arxiv.org/abs/2307.16789 Despite the progress of open-source LLMs, their ability to execute instructions by calling APIs is still significantly limited. The ToolLLM framework closes this gap in tool-use capability and improves the execution and generalization abilities of LLaMA models.
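At its core, tool use means the model emits a structured call and the runtime dispatches it to a real API. A minimal sketch of that dispatch loop (the JSON call format and the tool registry here are illustrative assumptions, not ToolLLM's actual protocol):

```python
import json

# Hypothetical tool registry; ToolLLM works against 16000+ real APIs,
# this is only a toy stand-in to show the dispatch mechanics.
TOOLS = {
    "get_weather": lambda city: f"Sunny in {city}",
    "add": lambda a, b: a + b,
}

def dispatch_tool_call(model_output: str):
    # Assume the model emits a JSON call: {"tool": "add", "args": [1, 2]}
    call = json.loads(model_output)
    fn = TOOLS.get(call["tool"])
    if fn is None:
        raise ValueError(f"unknown tool: {call['tool']}")
    return fn(*call["args"])
```

The hard part the paper tackles is not the dispatch but teaching the model to pick the right API and arguments among thousands of candidates.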
If you want to run LLMs locally, even on a MacBook, Ollama is a good choice. It makes running LLMs locally very easy and supports many models such as Mistral and Llama 2; just run `ollama run llama2` and you're up. ollama.ai
Infrastructure is a battleground for the big players. Besides Microsoft's Azure, Amazon's Bedrock offers an AI platform you can use to call LLMs from AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon.
Paper recommendation: "RAG vs Fine-tuning: Pipelines, Tradeoffs, and a Case Study on Agriculture", arxiv.org/pdf/2401.08406… A comparison of RAG and fine-tuning; RAG has been really hot lately.
Paper share: "Retrieval-Augmented Generation for Large Language Models: A Survey", arxiv.org/abs/2312.10997 The survey shows how retrieval-augmented generation (RAG) improves model accuracy and trustworthiness by incorporating knowledge from external databases, especially on knowledge-intensive tasks, while allowing continuous knowledge updates and the integration of domain-specific information.
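The basic RAG loop the survey covers is: embed the query, retrieve the top-k most similar chunks from the external store, and prepend them to the generation prompt. A minimal sketch under toy assumptions (the index layout and prompt template are illustrative, not from the survey):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def retrieve(query_vec, index, k=2):
    # index: list of (chunk_text, embedding) pairs from an external store
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, contexts):
    # Augment the generation prompt with the retrieved context
    ctx = "\n".join(f"- {c}" for c in contexts)
    return f"Answer using the context below.\nContext:\n{ctx}\nQuestion: {question}"
```

Because the knowledge lives in the index rather than the weights, updating it is just re-embedding the changed documents, which is the "continuous knowledge updates" point above.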
Faster RAG re-ranking with ColBERT. After re-ranking using GPT-4 yesterday, I tested out ColBERT for re-ranking today. Test: re-ranking Airbnb's 10-K, like before. Results: ColBERT and GPT-4 were identical in ranking quality, but ColBERT was lightning-fast.…
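ColBERT's speed comes from its late-interaction scoring: each query token embedding takes the maximum similarity over all document token embeddings, and the maxima are summed (MaxSim). A toy sketch of that scoring function, with made-up 2-d token vectors standing in for real embeddings:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def maxsim_score(query_toks, doc_toks):
    # ColBERT-style late interaction: for each query token, take the
    # max dot product against all document tokens, then sum over the
    # query tokens. Vectors are L2-normalized first.
    q = [normalize(v) for v in query_toks]
    d = [normalize(v) for v in doc_toks]
    return sum(max(sum(a * b for a, b in zip(qv, dv)) for dv in d)
               for qv in q)
```

Since document token embeddings can be precomputed and indexed, scoring at query time is just these cheap max-and-sum operations, versus a full LLM forward pass per candidate with GPT-4 re-ranking.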
