Link Budget, Joules Budget and User Capacity III: Trade-Offs & Limits of IoT Link Budget and Battery Life
Less than two years ago, I had a client who challenged me to improve the downlink sensitivity of an LTE receiver by 10 dB while maintaining its small form factor and battery life. At first, I was puzzled by her intention and tried hard to convince her that the mobile network as a whole may not benefit much from such an isolated improvement: there are many other factors to consider, e.g., battery life, downlink-uplink coupling, device cost, inter-cell interference and user capacity. One year later, I was again amazed by the rapid rise of so many Internet of Things (IoT) systems, each pretty much claiming a ~170 dB link budget and a ~10-year battery life, in addition to ultra-low device cost, enormous user capacity supporting tens of thousands of devices per base station per channel, and, did I mention this, in a very noisy unlicensed ISM band. goo.gl/pJb0KF. Think about this: a household alkaline AA battery can supply at most 3900 mWh / 8760 h = 0.45 mW of average power over one year. On the other hand, a regular RF power amplifier can easily have a maximum power consumption of more than 1 Watt. (For PA power efficiency, see goo.gl/5rBk2a) Jeez, all this sounds too good to be true to me ... There is a catch, isn't there? What are the bottom lines for IoT link budget and battery life?
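To make that mismatch concrete, here is a back-of-the-envelope sketch in Python. The 3900 mWh alkaline-AA capacity and the ~1 W PA consumption are the figures quoted above; the implied transmit duty cycle is simple arithmetic on top of them, not a number from any specific product.

```python
# Back-of-the-envelope check: average power available from one alkaline AA
# cell over a year versus the ~1 W peak draw of a typical RF power amplifier.
# Capacity and PA figures are the ones quoted in the text above.

AA_CAPACITY_MWH = 3900.0     # alkaline AA energy, as quoted
HOURS_PER_YEAR = 8760.0
PA_POWER_MW = 1000.0         # >1 W peak PA consumption, as quoted

avg_power_mw = AA_CAPACITY_MWH / HOURS_PER_YEAR
max_duty_cycle = avg_power_mw / PA_POWER_MW

print(f"Average power over one year: {avg_power_mw:.2f} mW")   # ~0.45 mW
print(f"Implied maximum PA duty cycle: {max_duty_cycle:.5f}")   # ~0.00045, i.e. ~1.6 s of full-power TX per hour
```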
By definition, the link budget is the ratio between the signal power at the transmitter antenna input and the signal power at the receiver antenna output. It can be calculated through the Friis transmission equation by accounting for all the gains and losses across the whole transmitter and receiver chain. From a Friis-equation and signal-processing perspective, the link budget (in dB) improves logarithmically with increasing signal spreading gain and with reducing signal bandwidth, in addition to increasing transmitted signal power. The catch, however, is that as you increase the spreading gain or coding gain in the time domain, or reduce the signal bandwidth in the frequency domain, the device's battery life is reduced roughly linearly, because the transmitter has to stay on longer to deliver the same number of bits. In other words, if you want to improve the cell coverage of an IoT system, the battery life of the served IoT devices will be shortened, unless you use a larger battery.
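A minimal sketch of that trade-off, assuming the transmitter dominates the device's energy budget and everything else (transmit power, data volume per report) is held fixed; the spreading-gain values below are arbitrary illustration points:

```python
import math

# Every doubling of the spreading gain (or halving of the bandwidth) buys
# ~3 dB of link budget, but stretches the on-air time per bit, and hence the
# transmit energy per bit, by the same factor of two.

def link_budget_gain_db(spreading_gain: float) -> float:
    """dB improvement relative to a spreading gain of 1."""
    return 10.0 * math.log10(spreading_gain)

def relative_battery_life(spreading_gain: float) -> float:
    """Battery life relative to spreading gain 1, assuming transmit energy
    dominates and on-air time scales linearly with the spreading gain."""
    return 1.0 / spreading_gain

for sg in (1, 8, 64, 512):
    print(f"spreading gain {sg:4d}: "
          f"+{link_budget_gain_db(sg):5.1f} dB link budget, "
          f"battery life x{relative_battery_life(sg):.4f}")
```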
Now the question becomes what the bottom lines, or limits, are for the link-budget and battery-life trade-offs. As we know, many bottom lines of communication system design are well modeled and determined by information theory. From an information-theory standpoint, the minimum energy-per-bit to noise-power-spectral-density ratio, Eb/N0 = Es/(r·N0) with r the number of information bits per symbol, is bounded by the Shannon capacity equation to be greater than -1.59 dB, whatever coding and modulation schemes are used. This means that for a reliable transmission, e.g., making a connection or sending a Yes or No, between a transmitter and a receiver, the received signal energy per bit shall be greater than the Shannon minimum energy requirement, which is about 1.59 dB below the noise floor. Accordingly, if we consider an IoT system with a bandwidth of 1 kHz, a spreading gain of 64, a 4-antenna receiver and 10 dB co-channel interference, the Shannon maximum link budget is about 177 dB. For details and other assumptions, see the spreadsheet linked below. From the example presented in the spreadsheet, a 10-year battery life is possible if an IoT device is assumed to send only 200 symbols every hour, nothing else, and is equipped with at least 7 AA Li-FeS2 batteries, each having 30% more energy than a regular alkaline AA battery. I should also mention that, although this use case can serve as a benchmark, it is not very useful or realistic for many practical applications.
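To show how numbers of that order can hang together, here is a hedged sketch of the link-budget and battery arithmetic. The bandwidth, spreading gain, antenna count, interference level and battery assumptions are taken from the text; the transmit power (23 dBm) and receiver noise figure (5 dB) are my own placeholder assumptions standing in for values in the linked spreadsheet, and the array-gain and interference bookkeeping is one plausible way to count them.

```python
import math

# Rough reconstruction of a ~177 dB Shannon-limit link budget.
# TX_POWER_DBM and RX_NOISE_FIGURE_DB are assumptions, not values from the text.

THERMAL_NOISE_DBM_HZ = -174.0        # kT at ~290 K
BANDWIDTH_HZ = 1_000.0               # from the text
SPREADING_GAIN = 64                  # from the text
RX_ANTENNAS = 4                      # from the text
INTERFERENCE_OVER_NOISE_DB = 10.0    # from the text
TX_POWER_DBM = 23.0                  # assumption (typical IoT uplink PA)
RX_NOISE_FIGURE_DB = 5.0             # assumption

shannon_ebn0_db = 10 * math.log10(math.log(2))        # ≈ -1.59 dB
processing_gain_db = 10 * math.log10(SPREADING_GAIN)  # ≈ 18.1 dB
array_gain_db = 10 * math.log10(RX_ANTENNAS)          # ≈ 6.0 dB
# Interference 10 dB above the noise raises the effective floor by 10*log10(1 + 10).
noise_rise_db = 10 * math.log10(1 + 10 ** (INTERFERENCE_OVER_NOISE_DB / 10))

sensitivity_dbm = (THERMAL_NOISE_DBM_HZ
                   + 10 * math.log10(BANDWIDTH_HZ)
                   + RX_NOISE_FIGURE_DB
                   + noise_rise_db
                   + shannon_ebn0_db
                   - processing_gain_db)

link_budget_db = TX_POWER_DBM + array_gain_db - sensitivity_dbm
print(f"Shannon-limit link budget: {link_budget_db:.1f} dB")          # ~177 dB

# Battery side: 7 Li-FeS2 AA cells, each ~30% more energy than a 3900 mWh
# alkaline cell, drained evenly over 10 years.
battery_mwh = 7 * 1.3 * 3900
avg_power_mw = battery_mwh / (10 * 8760)
print(f"Average power budget over 10 years: {avg_power_mw:.2f} mW")   # ~0.41 mW
```

Whether the 200-symbols-per-hour traffic actually fits inside that ~0.4 mW average budget depends on the symbol duration, PA efficiency and sleep current assumed in the spreadsheet, which is exactly why the spreadsheet is linked below.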
IoT minimum Energy Requirement