Is the bandwidth calculation correct?

In the Twitter example, the upload bandwidth is calculated as 12 Gbps: "Estimate the bandwidth in Gbps (gigabits per second) by dividing the incoming and outgoing data by the number of seconds in a day."

This assumes that the traffic is uniformly distributed over 24 hours, which is never the case in reality.
Shouldn't we consider peak-time tweets and calculate the bandwidth accordingly?
The same applies to the server estimation.
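The averaging step in question can be sketched as follows. The traffic volume and peak factor below are illustrative assumptions, not the course's actual figures (roughly 130 TB/day happens to yield the quoted ~12 Gbps average):

```python
# Back-of-the-envelope bandwidth estimate.
# Daily volume and peak factor are illustrative, not the course's numbers.

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def bandwidth_gbps(daily_bytes: float) -> float:
    """Average bandwidth in Gbps, assuming traffic is uniform over 24 hours."""
    return daily_bytes * 8 / SECONDS_PER_DAY / 1e9

# Hypothetical: ~130 TB of uploads per day.
avg = bandwidth_gbps(130e12)

# Crude peak adjustment: assume peak traffic is 2x the daily average.
PEAK_FACTOR = 2
peak = avg * PEAK_FACTOR

print(f"average: {avg:.1f} Gbps, peak (x{PEAK_FACTOR}): {peak:.1f} Gbps")
```

A peak factor like this is the simplest way to account for non-uniform traffic without knowing the actual request distribution.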

Course: Grokking Modern System Design Interview for Engineers & Managers - Learn Interactively
Lesson: Examples of Resource Estimation - Grokking Modern System Design Interview for Engineers & Managers

Hi Renjith,

Thank you for the feedback. Your observation about the Twitter bandwidth estimation example is correct. Please note that these are only ballpark estimates and cannot reflect Twitter's actual requirements. If we wanted the peak load, we could plug in some example numbers for it, but those would not be much better than the current average spread over the day; to do better, we would need the actual request and response distributions. At the design level, we don't need to go into that much detail.

Regarding the server calculation, we have the text: "We assume that there is a specific second in the day when all the requests of all the users arrive at the service simultaneously." That is, we have assumed that the entire load arrives within a single second.
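That worst-case assumption can be sketched as below. The user count, requests per user, and per-server capacity are all hypothetical placeholders, not the lesson's figures:

```python
# Server estimate under the stated worst-case assumption:
# every request from every daily user arrives in the same second.
# All numbers below are illustrative, not the course's.

def servers_needed(daily_active_users: int, requests_per_user: int,
                   rps_per_server: int) -> int:
    # Peak RPS if the whole day's requests land in one second.
    peak_rps = daily_active_users * requests_per_user
    # Ceiling division: round up to a whole number of servers.
    return -(-peak_rps // rps_per_server)

# Hypothetical: 500M daily users, 1 request each, 8,000 RPS per server.
print(servers_needed(500_000_000, 1, 8_000))  # → 62500
```

Collapsing a day's load into one second is deliberately pessimistic; it bounds the peak from above without needing any traffic distribution data.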
