
1 Introduction

The Nile is the longest river in the world and, at the same time, has one of the most water-limited basins. Without the Nile, major portions of Sudan and Egypt would run out of water. Of the water entering Lake Nasser at the Aswan dam, 85 % originates from the Ethiopian Highlands (Sutcliffe and Parks 1999). Consequently, there is growing anxiety about changes in discharge and sediment load due to planned dams and climate- and landscape-induced changes upstream.

Several studies have used past rainfall and discharge records as an effective means of studying the effect of climate on hydrology (Conway 2000; Kim et al. 2008; Yilma and Demarce 1995). Tesemma et al. (2010) investigated past trends of precipitation and discharge in the Blue Nile basin. The results show that there was no significant trend at the 5 % probability level in the basin-wide annual, dry season, and short and long rainy season rainfall over the past 40 years. These results are in agreement with Conway (2000).

Tesemma et al. (2010) reported that, despite no trend in rainfall, discharge changed significantly over a 40-year period at Bahir Dar and Kessie, which represent the upper one third of the Blue Nile basin in Ethiopia, and at El Diem at the border between Sudan and Ethiopia. Specifically, the annual discharge increased by about 25 % over the 40-year period for the upper Blue Nile, while it remained the same for the whole Blue Nile basin at El Diem. The increase of discharge in the upper Blue Nile basin is unexpected since annual rainfall remained the same and potential evaporation does not vary greatly from year to year. One would expect, as was the case at El Diem, that the annual discharge, which is the difference between precipitation and evaporation, should stay the same for a given amount of annual rainfall. The reasons for the difference in discharge are discussed later. In addition to the annual trends, all three stations show a significant increase in discharge in the long wet season (June through September). As a percentage of the 40-year seasonal mean, these increments were 26 % at Bahir Dar, 27 % at Kessie, and 10 % at El Diem. The results also show a significant increase in the short rainy season discharge (March to May) of 33 % at Bahir Dar and 51 % at Kessie. This was likely caused by the start of operation of the Chara Chara weir at the outlet of Lake Tana in 1996, which increased the flow during the dry season. No significant change was observed at El Diem over the short rainy season. During the dry season (October to February), the discharge at both Bahir Dar and Kessie did not change significantly, but there was a significant 10 % decreasing trend at El Diem.

The only long-term erosion studies for which sediment concentration data are available for an extended period of time are for the small Soil Conservation Research Program watersheds in the upper reaches of the large basins; these are not suitable for trend analysis in the Blue Nile basin. Nyssen et al. (2004) found that, historically, erosion and sedimentation were directly related to rainfall amounts: when the climate was wet, gullies formed and the rivers incised; when the amount of rainfall declined, the gullies filled up. The findings of Professors Ahmed and Bashir indicate that these historic trends apply to recent times as well, since the Roseires dam at the border with Sudan filled up more during low-flow years than during wet years. Concentrations at the end of the rainy season were greater for the dry years than for the wet years (Steenhuis et al. 2012).

Most studies of future changes in hydrology employ future rainfall rates predicted with general circulation models (GCMs). The changing landscape is not included in these hydrological considerations. To assess the effect of landscape changes on discharge and erosion in the Blue Nile basin, this study investigated how the landscape has affected discharge and erosion rates in the past. A mathematical model was applied to a 40-year period from 1964 to 2003. Changes in the best-fit parameters over time are assumed to be indicative of changes in the landscape.

2 Rainfall-Runoff-Erosion Simulation

Rainfall-runoff-erosion models can show whether the relationship between rainfall, discharge, and sediment concentration has changed in time and which underlying landscape parameters are responsible for this change (Mishra et al. 2004). This is different from statistical tests that examine trends in rainfall and discharge independently of each other.

The runoff and erosion model used here is the Parameter Efficient Distributed (PED) model that was validated for the Blue Nile basin by Steenhuis et al. (2009), Tesemma et al. (2010), and Tilahun et al. (2013a, b). In the PED model, various portions of the watershed become hydrologically active when a threshold moisture content is exceeded. The three regions distinguished in the model are the bottomlands that can potentially saturate, the degraded hillslopes, and the permeable hillslopes. Each region is a lumped average of all such areas in the watershed. In the model, the permeable hillslopes contribute rapid subsurface flow (called interflow) and baseflow. For each of the three regions, a Thornthwaite–Mather-type water balance is calculated. Surface runoff and erosion are generated when the soil is saturated and are assumed to reach the outlet within the time step. Percolation is calculated as any rainfall in excess of field capacity on the permeable hillside soil. Zero-order and first-order reservoirs determine the amount of water reaching the outlet. The equations are given in Steenhuis et al. (2009), Fig. 8.1 (Tilahun et al. 2013a), and Tilahun et al. (2013b). In Fig. 8.1, P is precipitation; Ep is potential evaporation; A is the area fraction of each zone, with subscripts 1 for the saturated areas, 2 for the degraded areas, and 3 for the infiltration areas; Smax is the maximum water storage capacity of the three areas; BSmax is the maximum storage of the linear baseflow reservoir; t½ (= 0.69/α) is the time in days it takes to reduce the volume of the baseflow reservoir by a factor of two under no-recharge conditions; and τ* is the duration of the period after a single rainstorm until interflow ceases.
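To make this structure concrete, the following is a minimal sketch in Python (not the authors' code) of a Thornthwaite–Mather-type balance for one lumped zone and of the first-order baseflow reservoir, assuming a 10-day time step and the parameter names of Fig. 8.1; the dry-period evaporation function and the routing of excess recharge to interflow are simplifications.

```python
import math

def zone_step(S, P, Ep, S_max):
    """Thornthwaite-Mather-type balance for one lumped zone (all amounts in mm).
    Returns the updated storage and the excess water leaving the zone
    (surface runoff for zones 1 and 2, recharge for zone 3)."""
    if P > Ep:                                # wet period: storage fills, surplus spills
        S = S + (P - Ep)
        excess = max(0.0, S - S_max)
        S = min(S, S_max)
    else:                                     # dry period: evaporation slows as the soil dries
        S = S * math.exp((P - Ep) / S_max)
        excess = 0.0
    return S, excess

def baseflow_step(BS, recharge, BS_max, t_half, dt=10.0):
    """First-order (linear) baseflow reservoir; t_half = 0.69/alpha in days.
    Recharge above BS_max would instead feed the zero-order interflow reservoir."""
    alpha = 0.69 / t_half
    BS = min(BS + recharge, BS_max)
    qb = BS * (1.0 - math.exp(-alpha * dt))   # outflow over the 10-day time step
    return BS - qb, qb
```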

Fig. 8.1
Schematic of the hydrology model. (Tilahun et al. 2013a)

Sediment concentrations are obtained as a function of surface runoff per unit area and a coefficient that decreases linearly from the transport limit at the start of the rainy monsoon phase to the source limit after about 500 mm of rainfall. Based on the work of Hairsine and Rose (1992) and Ciesiolka et al. (1995), Tilahun et al. (2013a, b) expressed the sediment concentration, Cr (g L−1), in runoff from runoff source areas as (Eq. 8.1):

$$ C_{r}=\left[ a_{s}+H\left( a_{t}-a_{s} \right) \right]q_{r}^{\,n} $$
(8.1)

where a t is a variable derived from stream power that relates to the sediment concentration in the water when there is equilibrium between the deposition and entrainment of sediment, and a s is related to the sediment concentration in the stream when entrainment of soil from the source area is limiting. H is defined as the fraction of the runoff-producing area with active rill formation, q r is the runoff rate, and n is an exponent. Assuming that the interflow and baseflow are sediment free, the sediment load per unit watershed area, Y (g m−2 day−1), from both the saturated and degraded runoff source areas can be obtained as a flux per unit area (Eq. 8.2):

$$ Y=A_{1}q_{r1}\left[ a_{s1}+H\left( a_{t1}-a_{s1} \right) \right]q_{r1}^{\,n}+A_{2}q_{r2}\left[ a_{s2}+H\left( a_{t2}-a_{s2} \right) \right]q_{r2}^{\,n} $$
(8.2)

where q r1 and q r2 are the runoff rates expressed in depth units for contributing areas A 1 (fractional saturated area) and A 2 (fractional degraded area), respectively. Theoretically, for turbulent flow over a wide field, n is equal to 0.4 (Ciesiolka et al. 1995; Tilahun et al. 2013a, b; Yu et al. 1997). The concentration of sediment in the stream can then be obtained by dividing the sediment load Y (Eq. 8.2) by the total watershed discharge, as shown in Eq. 8.3:

$$ C=\frac{A_{1}q_{r1}^{\,1.4}\left[ a_{s1}+H\left( a_{t1}-a_{s1} \right) \right]+A_{2}q_{r2}^{\,1.4}\left[ a_{s2}+H\left( a_{t2}-a_{s2} \right) \right]}{A_{1}q_{r1}+A_{2}q_{r2}+A_{3}\left( q_{b}+q_{i} \right)} $$
(8.3)

where q b (mm day−1) is the baseflow, q i (mm day−1) is the interflow per unit area of the non-degraded hillside, and A 3 is the fraction of the area from which water recharges the subsurface (baseflow) reservoir.
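As an illustration of how Eqs. 8.1–8.3 fit together, the sketch below (a simplified Python implementation, not the calibrated model) computes the flow-weighted sediment concentration at the outlet from the runoff rates of the two source areas; all coefficient values would come from calibration (Table 8.1).

```python
def sediment_conc_runoff(q_r, a_s, a_t, H, n=0.4):
    """Eq. 8.1: sediment concentration (g/L) in runoff from a source area;
    q_r is the runoff rate in depth units (mm/day)."""
    return (a_s + H * (a_t - a_s)) * q_r ** n

def stream_concentration(A1, qr1, A2, qr2, A3, qb, qi,
                         a_s1, a_t1, a_s2, a_t2, H, n=0.4):
    """Eqs. 8.2-8.3: flow-weighted concentration (g/L) at the outlet,
    assuming interflow and baseflow are sediment free."""
    load = (A1 * qr1 * sediment_conc_runoff(qr1, a_s1, a_t1, H, n) +
            A2 * qr2 * sediment_conc_runoff(qr2, a_s2, a_t2, H, n))   # Eq. 8.2
    discharge = A1 * qr1 + A2 * qr2 + A3 * (qb + qi)                   # denominator of Eq. 8.3
    return load / discharge
```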

2.1 Input Data

The two types of input data needed for the PED model are climate and landscape data. Climate input data were derived from 10-day rainfall amounts of ten selected rain-gauging stations using the Thiessen polygon method (Kim et al. 2008). Potential evaporation was based on long-term average potential evaporation data over the basin and was set equal to 3.5 mm day−1 for the long rainy season (June to September) and 5 mm day−1 for the dry season (October to May) (Steenhuis et al. 2009). The landscape input parameters, both the relative areas and the maximum amount of water available for evaporation, were estimated for the saturated, degraded, and recharge areas in the basin. In addition to these six “surface parameters,” the three baseflow parameters are the first-order baseflow reservoir constant, a zero-order interflow rate constant that determines the duration of the linearly decreasing interflow, and the maximum water content of the baseflow reservoir. The landscape parameter values cannot be determined a priori and need to be obtained by calibration. The sediment input parameters, the source and transport limits and the H function, were calibrated similarly to those of Tilahun et al. (2013a), with the exception that it was assumed that during the latter part of the rainy monsoon phase, some sediment in the streams would be picked up from the banks.
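For clarity, the Thiessen polygon method amounts to weighting each gauge by the fraction of the basin area lying in its polygon; the sketch below uses hypothetical gauge names and weights rather than the ten stations actually used.

```python
def basin_rainfall(station_rain, weights):
    """Areal rainfall (mm) as the area-weighted average of gauge readings.
    weights are the Thiessen polygon area fractions and must sum to one."""
    assert abs(sum(weights.values()) - 1.0) < 1e-6
    return sum(station_rain[s] * w for s, w in weights.items())

# Example with made-up numbers: three gauges covering 50 %, 30 %, and 20 % of the basin
print(basin_rainfall({"A": 40.0, "B": 55.0, "C": 30.0},
                     {"A": 0.5, "B": 0.3, "C": 0.2}))   # -> 42.5 mm
```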

Since this study’s interest was in the overall trend in the Nile basin, a relationship was established between precipitation and discharge, and between sediment concentrations and predicted surface runoff and discharge, at El Diem at the Ethiopian-Sudan border. The data available at El Diem for this study were 10-day precipitation for 1964–1969, 1993, and 1998–2003. Ten-day discharge measurements were available for 1964–1969, 1993, and 2003, and monthly discharge for 1998–2003. Sediment measurements were available for only 2 years, 1993 and 2003.

3 Results

Calibration of the model parameters (Tables 8.1 and 8.2) was based on the assumption that the subsurface flow parameters (interflow and baseflow), as well as the storages of the landscape components, remained the same over time. This study assumed that, over the 40-year time span, only parameters affected by erosion would change. Since erosion makes the soil shallower, the fraction of degraded soils that produce surface runoff was increased from the 1960s to the 2000s. For estimating sediment concentration, this study assumed that the erosivity (both the transport and the source limit) remained the same over time. Thus, the variations in concentration are attributed to an increase in degraded areas over the simulation period.

Table 8.1 Model input parameters fixed in time for surface flow components, baseflow and interflow
Table 8.2 Model input parameters variable in time for fractional areas of degraded hillsides and permeable hillsides

The calibrated parameter values that are fixed throughout the rainy phase are shown in Table 8.1. In accordance with earlier findings for small watersheds (Engda et al. 2011; Steenhuis et al. 2009; Tilahun et al. 2013a, b), this study assumed that a total of 15 % of the Blue Nile basin produced surface runoff once the area became saturated around the middle to end of July. Since the Blue Nile basin intercepts interflow that is missing from the water balance of the smaller basins, the interflow lasts approximately 5 months (Table 8.1), much longer than in the smaller basins. The half-life of the aquifer was 40 days, within the range found for the smaller basins. The calibrated transport-limiting and source-limiting coefficients are also in the range found earlier.

As noted earlier, only the fraction of degraded hillsides was adjusted so that the predicted discharge better fit the observed 10-day discharge values for the periods 1964–1969, 1993, and 2003. Since the three fractions should add up to one, an increase in degraded land resulted in a decrease in permeable hillsides (Table 8.2).

3.1 Discharge

In the 1964–1969 period, the observed and predicted discharge values corresponded most closely when the hillsides (recharging the interflow and groundwater) made up 75 % of the landscape with a soil water storage of 250 mm (between wilting point and field capacity). Surface runoff was produced from the degraded areas (exposed surface or bedrock) making up 10 % of the landscape and from saturated areas comprising 15 % of the area (Tables 8.1 and 8.2; Fig. 8.2, solid line). After the dry season, the exposed bedrock needed to fill up to a storage of 10 mm before it became hydrologically active, whereas the saturated areas required 200 mm (Table 8.1); these storages were invariant in time. The Nash–Sutcliffe efficiency was an acceptable 0.88 for discharge over a 10-day period (Table 8.3). In Table 8.3, the last three rows indicate the goodness of fit when the degraded areas in the second row are used in the PED model for the years specified in the first column.
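For reference, the goodness-of-fit statistics reported in Table 8.3 follow the standard definitions, sketched below for 10-day observed and simulated series (illustrative only, not the authors' code).

```python
def nash_sutcliffe(obs, sim):
    """NS = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 is a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def rmse(obs, sim):
    """Root mean square error in the units of the data (here mm per 10 days)."""
    return (sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs)) ** 0.5
```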

Fig. 8.2
Observed (closed spheres) and predicted (solid line, with 10 % of the watershed consisting of degraded hillslopes) discharge for the Blue Nile at El Diem at the Ethiopian-Sudan border for the period 1963–1969. The dotted line is the discharge assuming that the degraded areas constitute 22 % of the landscape

Table 8.3 NS (Nash–Sutcliffe) values and RM (root-mean-square error) in mm day−1 for discharge at El Diem (bolded numbers are the best fit)

For 1993, it was assumed that the fraction of degraded area in the basin had increased to 18 % (Table 8.2), with otherwise the same input parameters. The predicted discharge fitted the observed 10-day values with a Nash–Sutcliffe efficiency of 0.94. Note that the average precipitation for 1993 came from a different source than that used by Tesemma et al. (2010), and this study used a constant potential evaporation of 4 mm day−1 to obtain a mass balance. The best fit for 2003 was obtained by increasing the exposed bedrock coverage to 22 % (from 10 % in the 1964–1969 period) and decreasing the permeable hillsides by 12 % to 63 % (Table 8.2), while all other parameters were kept the same (Table 8.1, Fig. 8.3b) as in 1993 (Fig. 8.3a). The Nash–Sutcliffe model efficiency of 0.98 was again remarkably high for the PED model with nine input variables, and similarly small root mean square errors were obtained (Table 8.3). The high Nash–Sutcliffe efficiencies for runoff are an indication that, although the model has only a few parameters, it effectively captures the hydrological processes by which various portions of the watershed become hydrologically active after the dry season.

Fig. 8.3
a Observed (closed spheres) and predicted (solid line) discharge for the Blue Nile at El Diem at the Ethiopian-Sudan border for three fractions of degraded hillsides for 1993: 10 % (dotted line), 18 % (solid line), and 22 % (dashed line). b Observed (closed spheres) and predicted (solid line) discharge for the Blue Nile at El Diem at the Ethiopian-Sudan border for three fractions of degraded hillsides for 2003: 10 % (dashed line), 18 % (dotted line), and 22 % (solid line)

To further confirm that the model parameters predicting discharge actually changed between the mid-1960s and the 2000s, the calibrated parameters of the three periods were interchanged. The results (Table 8.3) show that the accuracy of the simulation decreased when the degraded areas of the other periods were used as input. This was most distinct for the discharge simulations for 2003. The 10 % degraded area resulted in a Nash–Sutcliffe efficiency of 0.92 and a root mean square error of 3.1 mm/10-day (Table 8.3). This improved to 0.97 and 1.7 mm/10-day, respectively, for the 18 % degraded fractional area, and for the optimum 22 % degraded area the Nash–Sutcliffe efficiency was 0.98 and the root mean square error was 1.2 mm/10-day (Table 8.3, bottom row). Moreover, comparing observed and predicted discharge in Fig. 8.2 shows that the peaks are overpredicted when the 22 % degraded area is used (instead of the optimum 10 %), while the peak in Fig. 8.3b is underpredicted when the 10 % degraded area is used instead of the 22 %. The increased runoff with increasing degraded areas is also confirmed by the average discharge calculated for the years 1963–1969, 1993, and 1997–2003, which is 290 mm/year for a 10 % degraded area and 317 mm/year for a 22 % degraded area; thus, runoff increased by almost 10 %. The results in Fig. 8.3a are not as conclusive for the peak flows but show that the initial amounts of runoff increased when the degraded areas increased, which is better simulated by the optimum solution. Finally, the simulations (Figs. 8.2 and 8.3a, b) show that baseflow and interflow decrease when the degraded areas increase, because a smaller fraction of the watershed fills the subsurface reservoirs.

Fig. 8.4
a Observed (closed spheres) and predicted (solid line) sediment concentrations for the Blue Nile at El Diem at the Ethiopian-Sudan border for two fractions of degraded hillsides for 1993: 18 % (solid line) and 22 % (dashed line). b Observed (closed spheres) and predicted (solid line) sediment concentrations for the Blue Nile at El Diem at the Ethiopian-Sudan border for two fractions of degraded hillsides for 2003: 18 % (dashed line) and 22 % (solid line)

Although this model is based on a conceptual framework, it can be seen as a mathematical relationship that relates the spatially averaged 10-day rainfall to the 10-day watershed discharge. This relationship clearly changes over the 40-year period (Figs. 8.2 and 8.3a, b), indicating that the runoff mechanisms are shifting due to landscape characteristics, since the precipitation variation is already accounted for in the relationship.

The discharge model results reflect the soil erosion that occurred during the period from the early 1960s to 2000. Hillsides eroded during this period stored less rainfall, so that in 2003 they produced surface runoff from water that in 1963 would have been a source of interflow. This in turn caused a greater portion of the watershed to become hydrologically active at an earlier stage, releasing more of the rainfall sooner and resulting in earlier flows and greater peak flows.

These simulation results are in line with the statistical results at the El Diem site, which show an increasing trend of runoff during the long rainy season, decreasing dry season runoff, and no significant change in annual flow. The lack of change in annual flow at El Diem is likely due to the large forest tracts in the south of the basin, whereas a significant increase in wet season flow in the upper Blue Nile was found by Tesemma et al. (2010) in their statistical analysis.

3.2 Sediment Concentrations

Sediment concentrations were simulated using Eqs. 8.1–8.3. The input data consisted of the surface runoff, baseflow, and interflow calculated with the hydrology model (Fig. 8.1), together with the H function (Tilahun et al. 2013a) and the transport and source limits (Table 8.1). The erosion-related parameters were not changed between 1993 and 2003; the best fit of the predicted line to the observed sediment concentrations was obtained for 1993. For estimating sediment concentrations in the Blue Nile at El Diem for 2003, the same parameters were used with the increased surface runoff, due to the larger area of degraded hillsides, obtained from the hydrology model.

The simulation results are shown in Fig. 8.4a, b and the statistics in Table 8.4. The Nash–Sutcliffe efficiencies are very close to one for the averaged 10-day concentrations. For both years, interchanging the hydrology output of the PED model resulted in a poorer fit. For example, using the 18 % degraded area for simulating 2003 gave a Nash–Sutcliffe efficiency of 0.90 for the sediment concentration, while using the correct 22 % degraded area increased the Nash–Sutcliffe efficiency to 0.94 (last line of Table 8.4). The root mean square errors were also markedly improved.

Table 8.4 NS (Nash–Sutcliffe) values and RM (root-mean-square error) values for sediment concentration in g L−1 during the rainy monsoon phase at El Diem (bolded numbers are the best fit)

While the total discharges were only minimally affected for the years 1993 and 2003, as can be seen in Fig. 8.4a, b, where the dashed lines can hardly be distinguished from the solid lines, the sediment concentrations were affected much more strongly. The peak concentration of 5 g L−1 in 1993 (Fig. 8.4a) was well simulated with the 18 % degraded area (solid line), but using the 22 % degraded area in the PED model overpredicted the peak concentration at almost 7 g L−1, an overestimate of 2 g L−1. In Fig. 8.4b, the peak concentration of 7 g L−1 was well simulated using the predicted surface runoff and subsurface flows for the 22 % degraded area, whereas the 18 % degraded area underpredicted the peak by more than 1 g L−1. In Table 8.4, the last two rows indicate the goodness of fit when the degraded areas in the second row are used in the PED model for the years specified in the first column.

4 Discussion

Since the PED model simulated the discharge and sediment concentrations well, the model parameters could be optimized so that the effect of an increasingly degrading landscape could be detected in the output signal of both the discharge and the sediment concentration of the Blue Nile basin in Ethiopia. It is obvious from these results that the landscape is becoming increasingly degraded and that this has a significant effect on the discharge and an even greater effect on the sediment concentration. To better understand the effect of the increased degradation on the discharge of the Blue Nile, the PED model was run over the periods from 1964 to 1969 and from 1993 to 2003 with the fractions of degraded areas of 1964 (0.10), 1993 (0.18), and 2003 (0.22) (Fig. 8.5a). Doubling the degraded areas from 10 to 22 % increased the annual discharge at El Diem from approximately 30 to 33 cm per year. This 10 % increase in discharge (similar to the finding of Gebremicael 2013) between the 10 % degraded area in 1964 and the 22 % degraded area in 2003 is highly important for Egypt. For instance, it is widely alleged that the current Nile flow is more than the 85 billion m3 (BCM) originally allocated in the 1929 treaty dividing the Nile flow between Egypt and Sudan. Although pure speculation, it may very well be that the original estimate of 85 BCM in 1929 was accurate and that the currently alleged increase in Nile flow is caused by the degrading landscape in Ethiopia. If this scenario can be shown to be a reality, Ethiopia could argue for the use of the additional Nile flow due to the degradation for its planned irrigation works, even though Ethiopia was not a party to the 1929 treaty and does not abide by it.

To show that the increasingly degrading landscape has a significant effect on sediment concentration, cumulative flows from 1998 to 2003 were again used (Fig. 8.5a). The cumulative sediment load was calculated for this period assuming that the land had not degraded since the 1960s, when the degraded area was 10 %, for the intermediate degradation of 1993 (18 %), and for the severe degradation of 2003 (22 %). It was assumed that erosion per unit land surface was the same for the three land conditions, which is realistic. A less likely assumption was that no conservation measures significantly changing land degradation were carried out over this period. Thus, the threefold increase (Fig. 8.5b) from approximately 2 t ha−1 year−1 for the 10 % degraded area to 6 t ha−1 year−1 for the 22 % degraded area is the maximum that can be expected. The 5–6 t ha−1 year−1 is equal to the reported sediment loss at El Diem (Steenhuis et al. 2012), but the 2 t ha−1 year−1 at El Diem in the 1960s (assuming the same rainfall as in the 1990s) is unlikely. Many soil and water conservation practices have been installed during the past 50 years that would have lowered the present-day erosivity. This implies that the erosivity in the 1960s would have been greater, resulting in more soil loss at that time than indicated in Fig. 8.5b.
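As a back-of-envelope unit check (with purely illustrative numbers, not the simulated series), a flow-weighted concentration in g L−1 multiplied by a runoff depth in mm gives a load in g m−2, because 1 mm of water over 1 m2 is 1 L, and 1 g m−2 equals 0.01 t ha−1:

```python
def annual_load_t_per_ha(conc_g_per_L, runoff_mm):
    """Sediment load (t/ha) from a flow-weighted mean concentration (g/L)
    and an annual runoff depth (mm): g/L * mm = g/m2, and 1 g/m2 = 0.01 t/ha."""
    return conc_g_per_L * runoff_mm * 0.01

# e.g., an assumed flow-weighted mean of 2 g/L with ~300 mm of annual discharge
# gives ~6 t/ha/yr, the order of magnitude shown in Fig. 8.5b
print(annual_load_t_per_ha(2.0, 300.0))
```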

Moreover, unlike the conclusions on discharge, the findings on sediment concentration are clearly limited by the restricted availability of sediment data at El Diem. Only two good years of sediment data were available, and those were 10 years apart. These 2 years of sediment data indicated a significant increase in sediment concentration over a 10-year period, which was well simulated by assuming that the degraded areas increased. However, extrapolating this over a longer period of time, as was done in Fig. 8.5b, introduces uncertainty. Thus, more research needs to be done before the sediment load results representing the 1960s can be accepted.

Fig. 8.5
a Cumulative runoff for the Blue Nile at the border with Sudan for the period 1998–2003 assuming an increasing fraction of the landscape becoming degraded. b Sediment losses for the Blue Nile at the border with Sudan for the period 1998–2003 assuming an increasing fraction of the landscape becoming degraded

5 Concluding Remarks and Challenges

The Blue Nile in Ethiopia, which drops more than 1,000 m from Lake Tana (1,786 m a.s.l.) to the Ethiopian-Sudan border over a distance of approximately 800 km, has more than enough stream power to carry all the sediment delivered from the agricultural lands. Although there may be sediment deposition before the runoff reaches the stream, it was hypothesized that once the sediment is in the stream, it is carried downstream across the Ethiopian border. At the border with Sudan, sediment concentrations as high as 12.3 g L−1 have been reported (Seleshi et al. 2011). This has both positive and negative implications. When sediment delivery from the agricultural land decreases, the concentration in the river decreases as well and less sediment will deposit in the reservoirs. However, if more sediment is delivered because the landscape degrades further or forests are converted to agricultural land, this additional sediment will also be delivered downstream and will fill up the planned reservoirs at a more rapid rate once they are constructed.

Thus, the challenge is to decrease landscape erosion. Currently, many soil and water conservation practices are being installed to decrease erosion by farmers, who volunteer their labor on behalf of the government. This effort could be made more effective if the effects of these practices were monitored and only the most effective ones were then installed. In this regard, the study by Tilahun et al. (2013b), which shows that (gully) erosion from the bottomlands, especially at the end of the rainy season, is greater than from the uplands, is important. In addition, the model indicates that the degraded areas contribute greatly to the sediment load. Therefore, concentrating efforts on decreasing gully formation in the bottomlands by planting permanent vegetation in the degraded areas will yield the greatest benefit. Finally, there is some evidence in the experimental observations of Tilahun (2012) that tillage greatly accelerates soil loss and that no-till might be an effective soil and water conservation practice, provided that the weeds and diseases associated with this practice can be suppressed (McHugh et al. 2006).