Let's say we have 80 TB of data and a P2P network of 10 million nodes, each with 4 GB of RAM (storage capacity). How will this big data be hosted in the network?
4 GB is the main memory (RAM); a node's disk storage is typically far higher. Also, for data-intensive and compute-intensive workloads in a P2P architecture, dedicated servers are often set up to meet the hardware requirements; crypto-mining rigs are one example of this.
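It also helps to run the numbers: spread evenly, 80 TB across 10 million nodes is only a few megabytes per node. A minimal sketch, assuming a replication factor of 3 (a common choice in distributed storage systems, not stated in the question):

```python
# Back-of-envelope: sharding 80 TB across 10 million P2P nodes.
# REPLICATION = 3 is an assumed replication factor, not from the question.

TOTAL_DATA_TB = 80
NODES = 10_000_000
REPLICATION = 3

total_bytes = TOTAL_DATA_TB * 10**12        # 80 TB in bytes (decimal TB)
stored_bytes = total_bytes * REPLICATION    # bytes stored network-wide
per_node_bytes = stored_bytes / NODES       # average load per node

print(f"Per-node storage: {per_node_bytes / 10**6:.1f} MB")
# Even with 3x replication, each node holds about 24 MB on average --
# tiny next to typical node disk capacity, and it would fit in 4 GB of RAM too.
```

So capacity is not the bottleneck here; the real engineering work in such a network is chunk placement, lookup (e.g. via a DHT), and handling node churn.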