A simple and comprehensive technique to determine the probability that a cascade of erbium-doped fiber amplifiers (EDFAs) may be driven into unacceptable regimes of bit error rate (BER) and/or gain levels is presented. This technique allows network designers to determine the tolerances by which the signal power levels may deviate from their predesigned average values while still giving acceptable gain variances and BERs at the receiver. We show that even in the signal power range well above the receiver sensitivities (-38 dBm/ch), where the gain spread is not significant, the corresponding spread in BER due to the random arrival of packets might result in unacceptable performance. We show that, for typical levels of operation, the BER temporarily (for about 3 μs) deviates to below 10^-9 (10^-15) with a probability of 10^-3 (10^-2), for 100 (64) channels. We show that the gain spread for a single EDFA can be negligible for a range of signal and pump powers at a given average gain.
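The statistical effect described above can be illustrated with a toy Monte Carlo sketch (not the paper's technique): channels carry packets that arrive at random, so the number of simultaneously active channels fluctuates, shifting the per-channel gain of a saturated amplifier and hence the receiver Q factor and BER. All parameters below (`p_on`, `q_nominal`, the square-root gain-sharing model) are illustrative assumptions, not values from the paper.

```python
import math
import random

def q_to_ber(q):
    """BER for a Gaussian-noise channel: 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))

def estimate_outage(n_channels=100, p_on=0.5, q_nominal=6.5,
                    ber_limit=1e-9, trials=100_000, seed=1):
    """Monte Carlo estimate of the probability that random packet
    arrival pushes the BER past ber_limit.

    Toy model: each channel is independently 'on' with probability
    p_on; in a saturated amplifier the available gain is shared among
    the active channels, so Q is assumed to scale as
    sqrt(nominal_load / active) -- an illustrative assumption only.
    """
    rng = random.Random(seed)
    nominal_load = n_channels * p_on
    outages = 0
    for _ in range(trials):
        # Number of channels carrying a packet in this time slot
        active = sum(rng.random() < p_on for _ in range(n_channels))
        # More active channels -> less gain (and Q) per channel
        q = q_nominal * math.sqrt(nominal_load / max(active, 1))
        if q_to_ber(q) > ber_limit:
            outages += 1
    return outages / trials
```

Even though the average load gives a comfortable BER margin, the binomial spread in the number of active channels produces a small but non-negligible probability of momentarily crossing the BER limit, which is the qualitative effect the abstract quantifies.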