The client pulls data from the server whenever it needs it, and it keeps polling over and over to fetch updated data.
An important thing to note here is that every request to the server, and the response to it, consumes bandwidth. Every hit on the server costs the business money and adds more load on the server.
What if there is no updated data available on the server every time the client sends a request?
The client doesn’t know that, so naturally it keeps sending requests to the server over and over. This is not ideal and wastes resources. Excessive polling by clients has the potential to bring down the server.
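To see how much of this polling is pure waste, here is a toy sketch (not from the original text): a simulated server whose data changes only once every five polls, and a client that polls on every tick. All names here are hypothetical and purely illustrative.

```python
def make_server():
    """Toy in-memory 'server': its data only changes every 5th poll."""
    version = 0

    def get_data(tick):
        nonlocal version
        if tick % 5 == 0:
            version += 1  # fresh data appears only occasionally
        return version

    return get_data


def poll(server, polls):
    """Client polls repeatedly; count responses that carried nothing new."""
    last = None
    wasted = 0
    for tick in range(polls):
        data = server(tick)
        if data == last:
            wasted += 1  # bandwidth and server load spent for no new data
        last = data
    return wasted


server = make_server()
print(poll(server, 20))  # → 16: only 4 of 20 polls returned updated data
```

In this sketch, 16 out of 20 requests return data the client already has. Each of those requests still travels the network, hits the server, and costs bandwidth, which is exactly the waste described above.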
Does this only apply when the client is fetching data dynamically? What if the user keeps triggering an event that sends an HTTP GET request to the server? Wouldn’t that also waste resources?