Much of the increased energy use will come from data centres and from fixed and mobile networks. Consumption by desktop equipment will fall with the shift to mobile devices, but not by enough to offset the increase in mobile network energy consumption.
Rod had two measures for energy use: per bit and per useful bit (the latter excluding overheads). Historically, the energy efficiency of telecommunications has improved by about 15% per year.
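To see what a 15% annual improvement compounds to, here is a minimal sketch (my own illustration, not a calculation from the talk):

```python
# Compounding a 15% per-year efficiency gain: relative energy per bit
# after n years, starting from a normalised value of 1.0.

def energy_per_bit(initial, years, annual_gain=0.15):
    """Energy per bit after compounding the annual efficiency gain."""
    return initial * (1 - annual_gain) ** years

print(energy_per_bit(1.0, 10))  # ~0.197: roughly a 5x improvement per decade
```

So if the trend holds, a decade of incremental improvement cuts energy per bit by about a factor of five.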
Rod pointed out that about half the energy consumed in a data centre goes to cooling the equipment. Also, 75% of the data travelling in the data centre is internal, 8% goes to other data centres and only 17% to end users.
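If about half the energy goes to cooling and other overheads, that corresponds to a Power Usage Effectiveness (PUE) of roughly 2. A minimal sketch, using illustrative figures of my own:

```python
# Rough Power Usage Effectiveness (PUE) calculation.
# PUE = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean no cooling or other overhead at all.

def pue(it_energy_kwh, overhead_energy_kwh):
    """Total facility energy divided by IT equipment energy."""
    return (it_energy_kwh + overhead_energy_kwh) / it_energy_kwh

# Illustrative only: 1000 kWh for servers, 1000 kWh for cooling/overheads.
print(pue(1000, 1000))  # 2.0
```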
In my course "ICT Sustainability", run at ANU and on-line around the world, the students learn how to estimate and reduce the energy use of data centres and other equipment.
Professor Tucker pointed out that the energy consumption of high density hard drives is expected to decrease markedly by 2020, less so for solid state devices. Rod pointed out that at some time in the future cloud service providers will need to transfer data to off-line storage, such as disk drives which are switched off, or optical tape. Users will then need to wait tens of seconds, or minutes, for their data. This was something I learned about as an IT professional in the 1980s and is something today's professionals will need to learn.
Professor Tucker pointed out that wireless communications use much more energy than wire or fibre. Increasing the bandwidth of wireless greatly increases power consumption. One way to reduce the power use while increasing the bandwidth is to use more lower power base stations closer to the user. At the extreme there may be one base station in each home. This unshared equipment can also be switched to a low power mode at night (which is harder with shared equipment).
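As a back-of-envelope illustration of the night-time saving (the power figures here are my own assumptions, not numbers from the talk), many small base stations which sleep at night can use less energy per day than one always-on macro base station:

```python
# Toy comparison: one always-on macro base station vs many femtocells
# which drop to a low power mode overnight. All figures are assumed.

MACRO_WATTS = 1000        # assumed macro base station draw, always on
FEMTO_WATTS = 10          # assumed per-femtocell draw when active
FEMTO_SLEEP_WATTS = 1     # assumed femtocell low power mode draw
N_FEMTOCELLS = 50         # femtocells assumed to replace one macro cell
AWAKE_HOURS, SLEEP_HOURS = 16, 8

macro_kwh = MACRO_WATTS * 24 / 1000
femto_kwh = N_FEMTOCELLS * (FEMTO_WATTS * AWAKE_HOURS +
                            FEMTO_SLEEP_WATTS * SLEEP_HOURS) / 1000

print(macro_kwh)  # 24.0 kWh per day
print(femto_kwh)  # 8.4 kWh per day
```

The point is not the particular numbers, but that unshared equipment can exploit idle periods in a way a shared tower serving many users cannot.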
Today's fibre networks require about one microjoule per bit. With higher use the energy per bit will decrease, as the fibre cable itself doesn't use much extra energy. However, Professor Tucker estimates that today's real networks use about a thousand times as much energy as they theoretically could. This shows scope for reductions.
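To put one microjoule per bit in everyday terms, here is the arithmetic for transferring one gigabyte (my own worked example):

```python
# Energy to move 1 GB over a network at ~1 microjoule per bit.

MICROJOULE_PER_BIT = 1e-6          # joules per bit, figure from the talk
GIGABYTE_BITS = 1e9 * 8            # 1 GB = 8 billion bits

joules = GIGABYTE_BITS * MICROJOULE_PER_BIT
watt_hours = joules / 3600         # 1 Wh = 3600 J

print(joules)       # ~8000 J
print(watt_hours)   # ~2.2 Wh per gigabyte
```

A thousandfold reduction, as theoretically possible, would bring that down to a few joules per gigabyte.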
I suggest it would also be interesting to look at efficiency at the application level. Recently I was evaluating an educational App for teaching English. I copied a small portion of text from the App and pasted it into my web editor. I found a small amount of visible text came with a large amount of formatting, creating about a 400% overhead. Web pages compress well, but even so this is an overhead which could be reduced. The CSS standard allows for formatting to be defined once and then applied, but this tends not to happen in many applications.
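By "400% overhead" I mean the markup weighed about four times as much as the visible text it carried. A small sketch of that measure, with made-up byte counts:

```python
# Markup overhead: bytes of formatting carried per byte of visible text,
# expressed as a percentage. Byte counts below are illustrative only.

def markup_overhead(total_bytes, visible_text_bytes):
    """Percentage overhead of markup relative to the visible text."""
    return (total_bytes - visible_text_bytes) / visible_text_bytes * 100

# e.g. 500 bytes of styled HTML carrying only 100 bytes of visible text:
print(markup_overhead(500, 100))  # 400.0 (% overhead)
```

Defining a style once in CSS and applying it by class name, rather than repeating inline formatting on every element, is the standard way to shrink this figure.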
Professor Tucker showed the diurnal Internet traffic load. North America shows a peak in the middle of the day. In Italy there are two peaks, one in the morning and a smaller one in the afternoon (and a dip for lunch). However, the network equipment can't power down to match this variation (I suggest perhaps we need "off peak" computing charges).
Professor Tucker pointed out that Facebook stores multiple copies of new photos at data centers around the world to reduce response times. Rod estimated each photo uses 10 Watt Hours. Less used photos are stored at just one data centre, which would reduce energy use (at the cost of slower access). However, I suggest a better way to reduce consumption would be to reduce the resolution of the photos. The original photo could be stored at high resolution at one center and the copies at a lower resolution suitable for the typical smart phone. The smaller photos will use one tenth to one hundredth the storage.
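The storage saving comes from pixel count scaling with the square of the linear resolution. A quick sketch of my suggestion:

```python
# Storage fraction after downscaling a photo: pixel count (and roughly
# storage) scales with the square of the linear dimensions.

def storage_fraction(linear_scale):
    """Fraction of original storage after scaling each dimension."""
    return linear_scale ** 2

print(storage_fraction(1 / 3))  # about 0.11 (roughly one tenth)
print(storage_fraction(0.1))    # about 0.01 (one hundredth)
```

So shrinking the distributed copies to a third or a tenth of the original dimensions gives the one-tenth to one-hundredth storage saving suggested above, while the full resolution original stays at one centre.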
Rod pointed out that very large amounts of data will take more energy to send via the Internet, than if copied to a removable device and physically transported.
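To illustrate Rod's point with a toy break-even calculation (all the figures here are my own assumptions, not from the talk): at around one microjoule per bit, a few terabytes sent over the network can exceed a plausible energy budget for physically shipping a drive:

```python
# Toy "sneakernet" break-even: network transfer vs shipping a drive.
# Both figures below are assumptions for illustration only.

NETWORK_J_PER_BIT = 1e-6   # assumed network energy per bit
TRANSPORT_J = 5e6          # assumed energy to ship one drive (~5 MJ)

def network_joules(terabytes):
    """Energy to send the given volume over the network."""
    return terabytes * 1e12 * 8 * NETWORK_J_PER_BIT

for tb in (0.1, 1, 10):
    cheaper_to_ship = network_joules(tb) > TRANSPORT_J
    print(tb, "TB:", network_joules(tb), "J, ship instead:", cheaper_to_ship)
```

Under these assumptions the crossover is well under a terabyte; the exact point depends entirely on the transport energy assumed.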
Rod also pointed out that there is the potential to use the Internet to reduce other energy use through measures such as telecommuting instead of air travel. However, this was made less credible as he traveled from Melbourne to Canberra to tell us to use teleconferencing. ;-)
Rod claimed that using a cloud application, specifically Google Docs, uses more energy than a local application (Vishwanath et al., 2015).
I asked if the cost of energy was sufficiently large to influence users' behaviour and so reduce energy use. He suggested that an energy star rating scheme for cloud services, like the one that applies to white-goods (and computers), would have an effect. But he commented that this would be far more difficult than for a simple appliance. It seems to me that this could still be a useful area for research.
Vishwanath, A., Jalali, F., Hinton, K., Alpcan, T., Ayre, R. W., & Tucker, R. S. (2015). Energy consumption comparison of interactive cloud-based and local applications. IEEE Journal on Selected Areas in Communications, 33(4), 616-626. DOI: 10.1109/ICCW.2015.7247606