In the Twitter example, the upload bandwidth is calculated as 12 Gbps.
The lesson estimates the bandwidth in Gbps (gigabits per second) by dividing the incoming and outgoing data by the number of seconds in a day.
This assumes that traffic is uniformly distributed over 24 hours, which is never the case in reality.
Shouldn’t we consider peak-time tweets and calculate the bandwidth accordingly?
The same applies to server estimation.
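To make the question concrete, here is a small sketch of the calculation. The 130 TB daily-ingress figure and the 2x peak-to-average ratio are illustrative assumptions, not numbers from the lesson; with a peak factor of 1.0 the function reproduces the uniform-24-hour average the lesson uses.

```python
def bandwidth_gbps(daily_bytes: float, peak_factor: float = 1.0) -> float:
    """Convert a daily data volume (bytes) to bandwidth in Gbps.

    peak_factor scales the uniform-traffic average by an assumed
    peak-to-average ratio; 1.0 matches the lesson's assumption
    that traffic is spread evenly over 24 hours.
    """
    SECONDS_PER_DAY = 24 * 60 * 60  # 86,400
    bits_per_day = daily_bytes * 8  # bytes -> bits
    return bits_per_day / SECONDS_PER_DAY / 1e9 * peak_factor

# Hypothetical figure: ~130 TB of daily uploads gives ~12 Gbps on average.
avg = bandwidth_gbps(130e12)        # ~12 Gbps, uniform-traffic assumption
peak = bandwidth_gbps(130e12, 2.0)  # ~24 Gbps at an assumed 2x peak ratio
print(f"average: {avg:.1f} Gbps, peak: {peak:.1f} Gbps")
```

A peak-aware estimate would provision servers and links for the `peak` number rather than the average, which is exactly the gap the question points at.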
Course: Grokking Modern System Design Interview for Engineers & Managers
Lesson: Examples of Resource Estimation