How to Calculate LoRaWAN Link Budget

Your gateway shows -136 dBm receive power. Your LoRa module’s datasheet claims -137 dBm sensitivity at SF12. That’s a 1 dB margin, so technically it should work, right?
It won’t. Not reliably. The gap between “technically works” and “works in production” is where most LoRaWAN deployments fail.
The link budget calculation isn’t complicated math. It’s arithmetic: addition and subtraction of gains and losses expressed in decibels. What makes it tricky is knowing which numbers to plug in. That 8 dBi antenna doesn’t deliver 8 dBi toward your street-level sensor. That “typical” path loss exponent doesn’t apply to your specific urban canyon. And that fade margin you’re ignoring? It’s the difference between 90% and 99.9% reliability.
This guide walks through the complete link budget calculation for LoRaWAN, with particular attention to the two areas where engineers most often get burned: antenna gain realities and fade margin selection.
The Core Link Budget Equation
The link budget is simply an accounting of every gain and loss between your transmitter and receiver:
Received Power (dBm) = Tx Power + Tx Antenna Gain − Tx Cable Loss − Path Loss + Rx Antenna Gain − Rx Cable Loss
What actually matters for deployment planning is whether that received power exceeds your receiver’s sensitivity threshold by enough margin to handle real-world variability:
Link Margin = Received Power − Receiver Sensitivity
Or, rearranged for the question you’re usually asking (“How much path loss can I tolerate?”):
Maximum Allowable Path Loss = Tx Power + Tx Antenna Gain − Tx Cable Loss + Rx Antenna Gain − Rx Cable Loss − Receiver Sensitivity − Fade Margin
Every term in decibels. Gains add, losses subtract. The chain concept is straightforward, but the accuracy of your result depends entirely on the accuracy of each input.
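The whole accounting fits in a few lines of Python. This is a minimal sketch; the function and parameter names are illustrative, not from any library:

```python
def max_allowable_path_loss(tx_power_dbm, tx_ant_gain_dbi, tx_cable_loss_db,
                            rx_ant_gain_dbi, rx_cable_loss_db,
                            rx_sensitivity_dbm, fade_margin_db):
    """Maximum tolerable path loss in dB: gains add, losses subtract."""
    return (tx_power_dbm + tx_ant_gain_dbi - tx_cable_loss_db
            + rx_ant_gain_dbi - rx_cable_loss_db
            - rx_sensitivity_dbm - fade_margin_db)

# +14 dBm node, modest antennas, SF12 sensitivity, 15 dB fade margin
print(f"{max_allowable_path_loss(14, 2, 0, 6, 0.6, -137, 15):.1f} dB")  # 143.4 dB
```

Note the double negative: subtracting a −137 dBm sensitivity adds 137 dB to the budget, which is why SF12 links tolerate so much path loss.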
Transmit Side Parameters
Transmit Power
Regulatory limits cap your conducted transmit power before you even consider hardware capabilities:
- EU868: 14 dBm ERP (equivalent to ~16 dBm EIRP) for most channels; some sub-bands allow up to 27 dBm EIRP
- US915: 30 dBm conducted power (+36 dBm EIRP with 6 dBi antenna)
A critical distinction: conducted power is what leaves the radio IC. EIRP (Effective Isotropic Radiated Power) includes antenna gain. European regulations typically specify ERP or EIRP limits, meaning high-gain antennas require reducing conducted power. US regulations specify conducted power, giving you more flexibility.
Common LoRa transceivers like the SX1276 output up to +20 dBm conducted. Higher-power modules exist for US deployments, but most battery-powered nodes run at +14 dBm or lower to preserve battery life.
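A quick sketch of the EU868 compliance check (the 2.15 dB term is the standard dipole-to-isotropic conversion between ERP and EIRP; the helper name is illustrative):

```python
def eirp_dbm(conducted_dbm, antenna_gain_dbi, cable_loss_db=0.0):
    """EIRP (dBm) = conducted power + antenna gain - cable loss."""
    return conducted_dbm + antenna_gain_dbi - cable_loss_db

# EU868 limit: 14 dBm ERP = 16.15 dBm EIRP (ERP + 2.15 dB)
limit = 14 + 2.15
print(eirp_dbm(14, 3) <= limit)  # False: +14 dBm conducted with a 3 dBi antenna is over the limit
print(eirp_dbm(13, 3) <= limit)  # True: backing off to +13 dBm restores compliance
```

This is exactly the trade the European rules force: every dB of antenna gain above ~2 dBi comes out of conducted power.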
Antenna Gain and Cable Losses
Transmit antenna gain varies dramatically by form factor:
- Chip antennas: −2 to 0 dBi (often negative gain due to poor matching)
- PCB trace antennas: 0 to 2 dBi
- Whip/stub antennas: 2 to 3 dBi
- External omnis: 3 to 8 dBi
For end nodes with direct antenna connections, cable loss is typically negligible. For gateways with remote antennas, cable loss matters. At 868/915 MHz, expect roughly:
- RG-58: 0.2 dB/m
- LMR-240: 0.12 dB/m
- LMR-400: 0.07 dB/m
Each connector adds 0.1–0.5 dB. A 5-meter run of RG-58 with two N-connectors costs you about 1.5 dB: power you paid for but never radiated.
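A small helper makes the feedline accounting explicit. The 0.25 dB per connector is an assumed midpoint of the 0.1–0.5 dB range above:

```python
# Approximate attenuation at 868/915 MHz, dB per meter
CABLE_LOSS_DB_PER_M = {"RG-58": 0.2, "LMR-240": 0.12, "LMR-400": 0.07}

def feedline_loss_db(cable, length_m, n_connectors, connector_loss_db=0.25):
    """Total feedline loss: cable attenuation plus per-connector loss."""
    return CABLE_LOSS_DB_PER_M[cable] * length_m + n_connectors * connector_loss_db

print(f"{feedline_loss_db('RG-58', 5, 2):.1f} dB")  # 1.5 dB
```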
Path Loss: Where Theory Meets Reality
Free Space Path Loss
The theoretical minimum, assuming no obstructions and perfect line-of-sight:
FSPL (dB) = 20·log₁₀(d) + 20·log₁₀(f) − 147.55
Where d is distance in meters and f is frequency in Hz. At 868 MHz and 1 km distance, FSPL is approximately 91.2 dB.
Free space path loss assumes your signal travels through a vacuum with nothing in the way. It’s useful as a lower bound, but if you’re planning urban or indoor deployments based on FSPL, you’ll be disappointed.
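As a sanity check, the formula is easy to evaluate directly:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss: 20*log10(d) + 20*log10(f) - 147.55 (d in m, f in Hz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

print(f"{fspl_db(1000, 868e6):.1f} dB")  # 91.2 dB at 868 MHz, 1 km
```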
Real-World Propagation Models
The log-distance path loss model captures environmental effects:
Path Loss (dB) = FSPL(d₀) + 10·n·log₁₀(d/d₀)
The path loss exponent n captures how quickly signal attenuates with distance:
| Environment | Path Loss Exponent (n) |
|---|---|
| Free space | 2.0 |
| Suburban | 2.5–3.0 |
| Urban | 2.7–3.5 |
| Dense urban | 3.0–4.0 |
| Indoor (same floor) | 3.0–4.0 |
| Indoor (multi-floor) | 4.0–6.0 |
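The model itself is a one-liner on top of FSPL. Here is a sketch using a 1 m reference distance, a common close-in choice (the function name is illustrative):

```python
import math

def path_loss_db(distance_m, freq_hz, n, d0_m=1.0):
    """Log-distance model: FSPL at reference distance d0, exponent n beyond it."""
    fspl_d0 = 20 * math.log10(d0_m) + 20 * math.log10(freq_hz) - 147.55
    return fspl_d0 + 10 * n * math.log10(distance_m / d0_m)

# Same 1 km link at 868 MHz: free space vs. a dense-urban exponent
print(f"{path_loss_db(1000, 868e6, n=2.0):.1f} dB")  # 91.2 dB (free space)
print(f"{path_loss_db(1000, 868e6, n=3.5):.1f} dB")  # 136.2 dB, 45 dB worse
```

That 45 dB gap between n = 2.0 and n = 3.5 at the same distance is why FSPL-based planning fails in cities.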
LoRaWAN’s sub-GHz frequencies help here. Lower frequencies diffract around obstacles and penetrate building materials better than 2.4 GHz alternatives. But don’t mistake “better” for “good.” A concrete wall still costs you 10–15 dB.
For terrain-aware modeling, tools like CloudRF or EDX SignalPro incorporate elevation data, building footprints, and clutter models. For flat-terrain planning or initial estimates, the log-distance model with an appropriate exponent is usually sufficient.
Antenna Gain: What Datasheets Don’t Tell You
This is where most link budget errors originate. That 8 dBi omni on your gateway doesn’t deliver 8 dBi in all directions. That would violate physics.
Radiation Pattern Realities
High-gain omnidirectional antennas achieve their gain by compressing the radiation pattern vertically. An 8 dBi omni concentrates energy toward the horizon, with significantly reduced gain above and below. If your gateway antenna is mounted at 30 meters and your sensor is 200 meters away at ground level, the signal path is 8.5° below horizontal, potentially outside the main lobe.
Example: A gateway using an 8 dBi omni mounted at height may deliver only 3–4 dBi effective gain toward street-level nodes, because the node falls in a lower-gain region of the antenna pattern.
For deployments covering nearby nodes at significant elevation differences, a lower-gain antenna (4–6 dBi) with a broader vertical beamwidth often outperforms a higher-gain option.
Polarization and Ground Plane Effects
Polarization mismatch between transmit and receive antennas costs you 3 dB for circular vs. linear, and theoretically infinite (practically 20+ dB) for cross-polarized linear antennas. Most LoRaWAN deployments use vertically polarized antennas on both ends, but tilt a handheld device 90° and you’ve created a significant mismatch.
Quarter-wave antennas require a ground plane to function correctly. End nodes with small PCB ground planes or plastic enclosures often see 3–6 dB degradation from datasheet specifications. If antenna performance is critical, validate with a network analyzer or field testing. Don’t trust the datasheet.
Fade Margin: The Insurance Policy
Fade margin is the buffer between your calculated received power and receiver sensitivity. It accounts for everything your model didn’t capture: multipath fading, weather variation, seasonal vegetation, temporary obstructions, and the inevitable deviation between models and reality.
Reliability vs. Margin
In fading environments, reliability improves predictably with margin: each additional 10 dB of fade margin cuts outage probability by roughly a factor of ten:
| Fade Margin | Approximate Availability |
|---|---|
| 10 dB | ~90% |
| 15 dB | ~95% |
| 20 dB | ~99% |
| 25 dB | ~99.7% |
| 30 dB | ~99.9% |
These numbers assume Rayleigh fading conditions. Your actual environment may be better (strong line-of-sight) or worse (severe multipath in industrial settings).
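The Rayleigh assumption has a convenient closed form: outage probability ≈ 1 − exp(−10^(−M/10)) for margin M in dB. A quick sketch reproduces the table above to within a couple of points:

```python
import math

def rayleigh_availability(fade_margin_db):
    """Approximate availability under Rayleigh fading: exp(-10^(-M/10))."""
    return math.exp(-10 ** (-fade_margin_db / 10))

for margin in (10, 15, 20, 25, 30):
    print(f"{margin} dB -> {rayleigh_availability(margin):.1%}")
```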
Selecting Appropriate Margins
Static outdoor deployments (fixed sensors, good gateway placement): 10–15 dB. You’ve done a site survey, the environment is stable, and occasional packet loss is acceptable.
Variable environments (sensors in vegetation, urban street level): 15–20 dB. Seasonal changes and moving obstructions will cause link quality variation.
Mobile applications or critical infrastructure: 20–30 dB. You can’t guarantee orientation, location, or timing, and failures have consequences.
The cost of over-margining: If you specify 30 dB margin everywhere, you’ll force higher spreading factors than necessary. That means longer airtime, higher power consumption, and reduced network capacity. A battery-powered node operating at SF12 instead of SF9 will consume roughly 8× more energy per packet. Start with 15 dB for typical static deployments and adjust based on field data.
Receiver Sensitivity and Spreading Factor Trade-offs
LoRa’s variable spreading factor creates a sensitivity range unmatched by other LPWAN technologies. For the SX1276 at 125 kHz bandwidth:
| Spreading Factor | Receiver Sensitivity | Relative Airtime |
|---|---|---|
| SF7 | −123 dBm | 1× |
| SF8 | −126 dBm | 2× |
| SF9 | −129 dBm | 4× |
| SF10 | −132 dBm | 8× |
| SF11 | −134 dBm | 16× |
| SF12 | −137 dBm | 32× |
Each spreading factor step buys roughly 2–3 dB of sensitivity at the cost of doubling airtime. The LoRaWAN Adaptive Data Rate (ADR) algorithm will optimize this automatically for established links, but your initial planning should consider worst-case (SF12) for determining coverage boundaries.
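If you know the expected received power, picking the lowest spreading factor that still preserves your fade margin is a simple lookup; this is a heavily simplified sketch of what ADR does, not the actual algorithm (sensitivity values from the table above):

```python
# SX1276 sensitivities at 125 kHz bandwidth (dBm)
SENSITIVITY_DBM = {7: -123, 8: -126, 9: -129, 10: -132, 11: -134, 12: -137}

def min_spreading_factor(received_power_dbm, fade_margin_db=15):
    """Lowest SF whose sensitivity still leaves the required margin, or None."""
    for sf in sorted(SENSITIVITY_DBM):
        if received_power_dbm - SENSITIVITY_DBM[sf] >= fade_margin_db:
            return sf
    return None

print(min_spreading_factor(-112))  # 9: first SF where -112 - sensitivity >= 15 dB
```

Every step you avoid climbing halves airtime and roughly halves per-packet energy, which is why the over-margining cost discussed above matters.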
Worked Example: Urban Sensor Deployment
Scenario: Asset tracking sensors in a European city, EU868 frequency band.
Known parameters:
- Tx Power: 14 dBm (conducted, regulatory maximum for most EU868 channels)
- Tx Antenna: 2 dBi integrated whip on sensor enclosure
- Tx Cable Loss: 0 dB (direct connection)
- Rx Antenna: 6 dBi gateway omni (derated from 8 dBi spec for elevation angle)
- Rx Cable Loss: 3m LMR-400 = 0.21 dB, plus 2 connectors at 0.2 dB each = 0.6 dB total
- Rx Sensitivity: −137 dBm (SF12 @ 125 kHz)
- Fade Margin: 15 dB (static outdoor deployment)
Calculate maximum allowable path loss:
Path Loss_max = 14 + 2 − 0 + 6 − 0.6 − (−137) − 15 = 143.4 dB
Convert to distance using log-distance model:
Assuming an urban environment with n = 3.0 and a 1 m free-space reference distance (FSPL at 1 m and 868 MHz ≈ 31.2 dB):
143.4 = 31.2 + 10(3.0)·log₁₀(d/1)
Solving: d ≈ 5.5 km
Reality check: Urban deployments with street-level sensors and building clutter rarely achieve theoretical maximums. Expect 1–2 km practical range for reliable coverage. The calculation tells you the physics allows roughly 5.5 km. Site surveys tell you whether your specific environment will deliver anything close to it.
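The worked numbers can be checked in a few lines; this sketch inverts the log-distance model using a 1 m free-space reference distance:

```python
import math

# Worked example inputs: EU868, SF12, urban n = 3.0
pl_max = 14 + 2 - 0 + 6 - 0.6 - (-137) - 15        # 143.4 dB allowable path loss
fspl_1m = 20 * math.log10(868e6) - 147.55          # ~31.2 dB FSPL at 1 m reference
d_max_m = 10 ** ((pl_max - fspl_1m) / (10 * 3.0))  # invert the log-distance model
print(f"{d_max_m / 1000:.1f} km")                  # ~5.5 km
```

Re-running with n = 3.5 instead of 3.0 is a worthwhile exercise: the theoretical range collapses toward the 1–2 km you should actually plan for.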
Using Link Budget Calculators Effectively
Calculators accelerate iteration but shouldn’t replace understanding.
When to use tools:
- Comparing multiple scenarios quickly
- Generating professional documentation for stakeholders
- Sanity-checking your manual calculations
When to calculate manually:
- Validating that a tool’s assumptions match your deployment
- Understanding which parameters most affect your margin
- Unusual configurations not well-represented in standard tools
Reliable tools:
- Semtech LoRa Calculator: Authoritative for sensitivity values and airtime calculations
- The Things Network Coverage Mapper: Real-world measurements that validate (or challenge) your calculations
- CloudRF/Radio Mobile: Terrain-aware propagation modeling for complex deployments
No tool substitutes for a site survey on critical deployments. Walk the coverage area with a test node. The difference between modeled and measured performance often surprises.
Building Confidence Into Your Calculations
Link budget calculation is arithmetic. Getting it right is engineering judgment: choosing realistic inputs, appropriate margins, and validating assumptions against reality.
The two most common failure modes are optimistic antenna gain assumptions and inadequate fade margin. A 3 dB error in antenna gain cuts your range by 30% in free space, more in urban environments. Skimping on fade margin means your 99% link becomes an 80% link when conditions change.
Start your calculations with conservative assumptions. Plan for SF12 coverage, then let ADR optimize upward. Budget 15 dB fade margin, then tighten if site surveys support it. Derate antenna gain from datasheet specifications unless you’ve validated performance in your specific mounting configuration.
Your link budget is a prediction. Field measurements are data. When they disagree, trust the data, and update your model for the next deployment.
Hubble’s satellites bring LoRaWAN coverage to locations where terrestrial link budgets simply don’t close. See how it works →