You could handle 10,000 concurrent users on a single server, depending on what the backend needs to do. Heavy computation? Obviously not. But if this is for a website, even with logins and medium-weight work like database queries and table lookups, you don't need all that. Just use…
Yes, this will work for many use cases, especially if the traffic pattern is predictable. My breakdown leans toward general-purpose, elastic systems where traffic can spike, teams are distributed, and infra needs to scale out cleanly.
If you're using PostgreSQL, you don't need Redis for caching queries. You can use shared_buffers.
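For instance, a postgresql.conf sketch along those lines (the values are illustrative assumptions for a small box, not tuned recommendations):

```ini
# postgresql.conf -- illustrative, assuming a server with ~4 GB RAM
shared_buffers = 1GB         # Postgres's own page cache; ~25% of RAM is a common starting point
effective_cache_size = 3GB   # planner hint: how much the OS page cache will also hold
```

Note shared_buffers caches data pages, not query results; for hot rows that fit in it, repeated reads never touch disk, which covers a lot of what people reach for Redis to do.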
🤣 You just haven't seen a single server handling 20,000 users and coping fine yet.
I'll be interning for a month at a company that makes all these infrastructure decisions.
Who reads screenshots with tiny fonts? Write it out as text, like a normal person would.
Cool bro, will look into this once my SaaS reaches 10,000 concurrent users.
AWS infrastructure with ELB and horizontal scaling: the number of connections on the first day of ticket redemption for the Estudiantes/Gimnasia soccer derby.
10k concurrent users: cache = SQLite in memory, DB = SQLite (reads only), any backend, monolith with 1 GB RAM and 2 cores. Happy coding.
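A minimal sketch of the SQLite-in-memory idea above, using Python's stdlib `sqlite3` (the `app.db` filename and `users` table are hypothetical; here the on-disk side is simulated so the snippet is self-contained):

```python
import sqlite3

# Stand-in for the on-disk, read-only database (hypothetical schema).
disk = sqlite3.connect(":memory:")
disk.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
disk.execute("INSERT INTO users (name) VALUES ('alice')")
disk.commit()

# Copy the whole DB into a second in-memory connection at startup;
# afterwards every read is served from RAM, no Redis, no network hop.
mem = sqlite3.connect(":memory:")
disk.backup(mem)
disk.close()

row = mem.execute("SELECT name FROM users WHERE id = 1").fetchone()
print(row[0])  # alice
```

With a real file you would open `sqlite3.connect("app.db")` for the `disk` side; since the workload is reads only, there is no cache-invalidation problem to solve.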