Introduction

Atmospheric deposition of acidifying sulfur and nitrogen compounds derived from burning fossil fuels, farming, and other industrial activities is a critical environmental stress. In large areas of Europe and North America, acid deposition has led to acidification of soils and surface waters, with adverse effects on the abundance of freshwater biota (Driscoll et al. 1980) and forest growth (Tomlinson 2003).

Because the industrialization of East Asia is progressing rapidly, the effects of atmospheric deposition on terrestrial and freshwater ecosystems in Japan are of public, political, and scientific concern. The population of about 2 billion people in this area, representing 30% of the world’s population, has been increasing rapidly during the last several decades. The nitrogen load in East Asia is estimated to have increased by a factor of 3.8 from 1961 to 2000 (Shindo et al. 2005), and SO2, NOx, and NH3 emissions are predicted to increase by 46%, 95%, and 100%, respectively, from 2001 to 2030 (Klimont et al. 2001).

In Japan, however, acidification of soils and surface waters due to acid deposition has not yet been observed, owing to the dominance of acid-tolerant catchments (Ohte et al. 2001). Analysis of the solute compositions of 225 streams in forested catchments (Kobayashi 1960) showed that average alkalinity was 550 μmolc l−1 and that most streams had alkalinity levels higher than 200 μmolc l−1 (with a few exceptions that had been influenced by wastewater from an ore mine or spring water). The latter value (200 μmolc l−1) has been used as a criterion for classifying surface waters as either “acid-sensitive” or “acid-tolerant” (Stoddard et al. 1999). In their simulation study, Ikeda and Miyanaga (2001) found that the rates of chemical weathering of primary minerals in forested Japanese catchments were high enough to neutralize acid deposition. Tokuchi and Ohte (1998) compared the proton budgets in forested catchments in Japan with those in Europe and North America and found that the major source of H+ in Japanese forests was internal proton load rather than atmospheric deposition. Shibata et al. (2001) reported that the acid-neutralizing capacities of forest soils in Japan were high enough to allow vegetation to take up more cations, and thus produce more H+, than vegetation in forests in Europe and North America. Many monitoring studies conducted during the 1990s showed that stream waters in Japan were well buffered and there were no signs of a long-term trend of pH decline (e.g., Baba and Okazaki 1998; Komai et al. 2001).

Although there is no clear evidence that atmospheric deposition acidifies surface waters in Japan, recent studies showed that several streams in forested catchments in central Japan have suffered acidification. Kurita and Ueda (2006) found that stream water pH had decreased by 0.3–0.8 units during the last 30 years in some forested catchments on granite and rhyolite bedrocks. Yamada et al. (2007) found that the stream water NO3 concentration increased from the late 1980s to the early 2000s and that stream water pH began to decrease in the mid-1990s in the Lake Ijira catchment in central Japan.

The Lake Ijira catchment is one of the forested catchments in Japan where surface water acidification due to atmospheric deposition is most likely to occur. Wet depositions of hydrogen, sulfur, and nitrogen there were among the highest observed in a nationwide monitoring program for wet deposition at 53 sites conducted by the Ministry of the Environment of Japan (Yamada et al. 2007). The Lake Ijira catchment is located 40 km north-northwest of the Chukyo industrial zone, which has caused some of the most severe air pollution in Japan, and the air pollution emitted from this area is a major source of atmospheric deposition in the catchment (Kitada et al. 2000). In addition, the Lake Ijira catchment has been classified as acid-sensitive. The soils are derived from old, base-poor rocks and have received minimal deposition of the volcanic ash that produces noncrystalline minerals (allophane and imogolite) with a strong pH-buffering ability (Shoji et al. 1993), and the mean lake water alkalinity of 151 μmolc l−1 (Yamada et al. 2007) was lower than the 200 μmolc l−1 criterion for susceptibility to acid deposition (Stoddard et al. 1999).

The purpose of this study was to elucidate the processes of pH decline in stream water in the Lake Ijira catchment observed by Yamada et al. (2007) by synthesizing data on atmospheric deposition, soil chemistry, stream water chemistry, and forest growth.

Materials and methods

Study site

Lake Ijira is a reservoir (lat. 35°34′N, long. 136°42′E; Fig. 1) in central Gifu prefecture, Honshu, Japan. The catchment has two subcatchments, the Kamagadani catchment (363 ha) and the Kobora catchment (108 ha). The monitoring of stream water chemistry for this study was conducted in the Kamagadani catchment. The bedrock consists of chert (90%) and mudstone (10%) of Middle Jurassic to Early Cretaceous age. The dominant soil type is Dystrochrepts. The dominant vegetation is Japanese cypress (Chamaecyparis obtusa, 45%), followed by broadleaf trees (31%), Japanese red pine (Pinus densiflora, 15%), and Japanese cedar (Cryptomeria japonica, 8%). Japanese cypress and Japanese cedar stands are plantation forests, which were aged mainly from 15 to 25 years and 30 to 45 years, respectively, in 1998, the middle of the monitoring period (1988–2003). The red pine and broadleaf stands are secondary forests. Major tree species in broadleaf secondary forests are Clethra barbinervis, Quercus serrata, Ilex pedunculosa, Quercus variabilis, Carpinus tschonoskii, Acer mono, and Quercus glauca. Forest management (fertilization, clear-cutting, thinning, weed control, liming, and ditching) that affects NO3 leaching was not conducted during the monitoring period (Kakumu et al. 2005).

Fig. 1 Map of the study area, showing the monitoring and sampling locations

Climate

The mean annual temperature in the catchment is 14.9°C, and the mean annual precipitation is 2,503 mm. The precipitation regime is characterized by relatively wet springs and summers (>200 mm mo−1 from April to September) and relatively dry winters (about 100 mm mo−1 from December to February). There were two climatic anomalies in this area during the monitoring period (Geng et al. 2000). In 1993, many regions in East Asia experienced an extremely cool and wet summer, which severely damaged rice production in Japan (Hosoe 2004). In the summer of 1994, East Asia experienced extremely high temperatures and severe droughts. The monthly mean temperatures recorded in July 1993 (23.8°C) and July 1994 (29.4°C) were the lowest and highest, respectively, observed in this area since 1950; the 30-year mean temperature in July was 26.2°C (Climate Statistics Database 2005).

To assess the potential water deficiency stress on forests, we calculated the drought index in summer (from June to August) according to the method of Spiecker (1995) using monthly meteorological data recorded at Gifu meteorological observatory, which is located 20 km south of the study site (Climate Statistics Database 2005):

$$ \text{drought index} = \text{precipitation (mm)} - \text{potential evapotranspiration (mm)}. $$
(1)

The potential evapotranspiration was calculated according to the method of Thornthwaite and Mather (1955). A positive drought index indicates sufficient supply of precipitation for tree growth, and a negative value indicates drought stress.
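
For illustration, Eq. 1 can be evaluated as in the following sketch, which pairs the drought index with a simplified Thornthwaite potential evapotranspiration; the fixed 12-hour day length (so the seasonal day-length correction of Thornthwaite and Mather 1955 is only approximated) and the function names are illustrative assumptions, not part of the original method.

```python
# Sketch of the summer drought index (Eq. 1) with a simplified Thornthwaite
# potential evapotranspiration; day length is fixed at 12 h by default.

def thornthwaite_pet(monthly_temp_c, days_in_month, daylight_hours=12.0):
    """Monthly potential evapotranspiration (mm) from 12 monthly mean temperatures (deg C)."""
    # Annual heat index I and the empirical exponent a of the Thornthwaite formula
    heat_index = sum((t / 5.0) ** 1.514 for t in monthly_temp_c if t > 0)
    a = (6.75e-7 * heat_index**3 - 7.71e-5 * heat_index**2
         + 1.792e-2 * heat_index + 0.49239)
    pet = []
    for t, days in zip(monthly_temp_c, days_in_month):
        if t <= 0:
            pet.append(0.0)
            continue
        monthly = 16.0 * (10.0 * t / heat_index) ** a        # mm for a 30-day month with 12-h days
        monthly *= (daylight_hours / 12.0) * (days / 30.0)   # day-length and month-length correction
        pet.append(monthly)
    return pet

def summer_drought_index(monthly_precip_mm, monthly_pet_mm):
    """Drought index (Eq. 1): precipitation minus PET, summed over June-August."""
    jja = slice(5, 8)  # zero-based indices of June, July, and August
    return sum(monthly_precip_mm[jja]) - sum(monthly_pet_mm[jja])
```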

Wet deposition

Wet deposition was collected biweekly from October 1988 to March 1998 at the outlet of the Kamagadani catchment (110 m a.s.l.; Fig. 1). A filtering-type bulk sampler (Tamaki et al. 2000) was used to prevent collected samples from evaporating and deteriorating by reaction with gases and aerosols in the atmosphere. Each sample bottle was covered with aluminum foil to keep it dark and suppress algal growth during storage in the field. After an interruption in monitoring from April 1998 to May 1999, a wet-only sampler was installed on the middle of the slope of the Kamagadani catchment (140 m a.s.l.; Fig. 1). This sampling has occurred weekly since June 1999.

Dry deposition

Dry deposition was estimated by the inferential method. Concentrations of gaseous SO2, HNO3 , and NH3 and particulate SO4 2−, NO3 , and NH4 + in the atmosphere were determined biweekly from April 2003 to March 2004 using a four-stage filter-pack method (EANET 2003). Rates of dry deposition were calculated according to the methods of Matsuda et al. (2001) for the gases and the methods of Takahashi and Wakamatsu (2004) for the particulate components. Dry deposition of H+ was estimated as follows:

$$ \left( {{\text{H}}^ + } \right)_{\text{dry}} = 2\left( {{\text{SO}}_{ 2} } \right)_{\text{dry}} + \left( {{\text{HNO}}_{ 3} } \right)_{\text{dry}} + \left( {{\text{NH}}_{ 3} } \right)_{\text{dry}} , $$
(2)

where (•) represents the deposition rate, in kmol ha−1 year−1.
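
A minimal sketch of Eq. 2 is given below; the function name and the assumption that the gaseous deposition rates are already expressed in kmol ha−1 year−1 are illustrative.

```python
# Sketch of Eq. 2: dry H+ deposition inferred from gaseous dry depositions,
# all in kmol ha-1 year-1.

def dry_h_deposition(so2_dry, hno3_dry, nh3_dry):
    """Dry deposition of H+ (kmol ha-1 year-1) from SO2, HNO3, and NH3 deposition rates."""
    return 2.0 * so2_dry + hno3_dry + nh3_dry
```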

Stream water discharge and chemistry

Discharge, water temperature, pH, electrical conductivity (EC), and alkalinity were measured at the outlet of the Kamagadani catchment two or three times per month from October 1988 to March 1998, and have been measured quarterly since April 1998. These measurements were interrupted from November 1992 to March 1993. Discharge was calculated based on water depth and flow velocity. The water depth was measured at 100-cm intervals across the river flow, and the flow velocity was measured at the midpoint of each 100-cm segment using a flow meter (CM-10S, Toho Dentan, Tokyo, Japan). A stream water sample for the measurement of pH, EC, and alkalinity was collected at the center of the stream’s flow. Stream water samples for the measurement of major ion concentrations were collected quarterly.
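
The velocity-area discharge calculation described above can be sketched as follows; the representation of each 100-cm segment by a single depth and mid-segment velocity, and the function name, are illustrative simplifications.

```python
# Sketch of the velocity-area method: discharge as the sum over 1-m-wide
# segments of depth x width x velocity.

def stream_discharge(depths_m, velocities_m_s, segment_width_m=1.0):
    """Discharge (m3 s-1) from per-segment depths (m) and velocities (m s-1)."""
    if len(depths_m) != len(velocities_m_s):
        raise ValueError("expected one depth and one velocity per segment")
    return sum(d * segment_width_m * v
               for d, v in zip(depths_m, velocities_m_s))

# e.g., stream_discharge([0.12, 0.25, 0.18], [0.31, 0.44, 0.28]) is about 0.20 m3 s-1
```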

Analysis of solution samples

Samples of wet deposition and stream water were transported to the laboratory within 1 h after collection and were analyzed for pH and EC immediately after arrival. The pH was measured using a glass electrode (HM-40V, DKK-TOA, Tokyo, Japan), and the EC was measured using a conductivity meter (AOL-40, DKK-TOA). The alkalinity of stream water was determined by titrating samples to an endpoint of pH 4.8, with pH values measured using the pH meter. Wet deposition samples, stream water samples, and extract solutions for dry deposition were passed through a membrane filter (pore size, 0.45 μm) and stored in a refrigerator (4°C) until chemical analysis. The concentrations of inorganic ions (Cl, NO3 , SO4 2−, Na+, K+, Ca2+, Mg2+, and NH4 +) were determined using ion chromatography (IC7000, Yokogawa, Tokyo, Japan; HIC-SP Super, Shimadzu, Kyoto, Japan), and concentrations of H+ ions were calculated from the pH values.

Soil chemical properties

Five plots were established for soil sampling around Lake Ijira (Fig. 1). The square plots ranged from 5 to 10 m on a side. Five soil samples were collected from A horizons from the corners and center of each plot each year in early November from 1988 to 1997 and in 2000 and 2004. At each sampling, a soil sample was collected from an undisturbed area beside the previous samplings, and the position of the sampling was marked with a plastic stake. The collected soil samples were air-dried and sieved through a 2-mm mesh before chemical analysis.

Soil pH was measured with a glass electrode (HM-40V, DKK-TOA) in the supernatant suspension of a 1:2.5 mixture of air-dried soil and deionized water. Exchangeable Ca2+ was extracted with 1 mol l−1 CH3COONH4 (pH 7.0) solution, and the extract was analyzed using atomic absorption spectrometry (Z-6100, Hitachi, Tokyo, Japan). Exchangeable Al was extracted with 1 mol l−1 KCl, and the extract was titrated with 0.1 mol l−1 NaOH and then with 0.1 mol l−1 HCl after adding 4% NaF solution (McLean 1965).

Forest growth

In Japanese cedar and Japanese cypress stands that were thinned in December 2004 (Fig. 1), tree disk samples were taken at breast height (1.3 m) from 50 felled trees in each stand. The average ages of the sampled stands were 41 years for Japanese cedar and 42 years for Japanese cypress. Ring widths were measured to the nearest 0.025 mm in four directions. To assess forest growth, the basal area increment (BAI) of trees was calculated as:

$$ {\text{BAI}} = \pi \left( {R_{n}^{2} - R_{n - 1}^{2} } \right), $$
(3)

where R n is the stem radius at the end of year n of tree-ring formation (Le Blanc 1990; Duchesne et al. 2002).
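
A sketch of Eq. 3 applied to a ring-width chronology is given below; the input of cumulative radii (averaged ring widths summed from pith to bark) and the function name are illustrative.

```python
# Sketch of Eq. 3: BAI for year n is pi * (R_n**2 - R_{n-1}**2), with R_n the
# cumulative ring radius (mm) at the end of year n.
import math

def basal_area_increments(cumulative_radii_mm):
    """Basal area increment (mm2) for each year after the first."""
    return [math.pi * (curr**2 - prev**2)
            for prev, curr in zip(cumulative_radii_mm[:-1], cumulative_radii_mm[1:])]
```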

Quality control

The quality of all the measurement processes was controlled according to the procedures specified in the technical manuals that guide monitoring efforts in Japan (Committee for Acid Deposition Measures 2004). Rain water, stream water, and soil data used for this analysis generally had no significant error component based on the monitoring criteria. For all rain water and stream water data, ion balances were assessed and the measured EC was compared with the calculated EC based on measured solute compositions (EANET 2000). Variability of soil properties was evaluated using the statistical procedures specified by EANET (2006) to avoid significant error. To ensure the quality of the analytical data, several interlaboratory comparison projects have been conducted for the measurement of acid deposition since 1991, for stream water since 2000, and for soil since 1999 (e.g., EANET 2007).
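
The routine ion-balance and conductivity-agreement checks can be sketched as follows; the limiting equivalent conductances and the acceptance limits used here are illustrative values, not the thresholds specified by EANET. In such a check, H+ enters the cation sum as 10^(6−pH) μeq l−1, calculated from the measured pH.

```python
# Sketch of ion-balance and EC-agreement checks for solution samples; limiting
# equivalent conductances (25 degC, S cm2 eq-1) and limits are illustrative.
LAMBDA = {
    "H": 349.7, "Na": 50.1, "K": 73.5, "Ca": 59.5, "Mg": 53.1, "NH4": 73.5,
    "Cl": 76.3, "NO3": 71.4, "SO4": 80.0, "HCO3": 44.5,
}
CATIONS = ("H", "Na", "K", "Ca", "Mg", "NH4")
ANIONS = ("Cl", "NO3", "SO4", "HCO3")

def ion_balance_percent(conc_ueq_l):
    """Relative difference (%) between total cation and anion equivalents (ueq L-1)."""
    cations = sum(conc_ueq_l.get(i, 0.0) for i in CATIONS)
    anions = sum(conc_ueq_l.get(i, 0.0) for i in ANIONS)
    return 100.0 * (cations - anions) / (cations + anions)

def calculated_ec_us_cm(conc_ueq_l):
    """EC (uS cm-1) estimated from the solute composition."""
    return sum(LAMBDA[i] * conc_ueq_l.get(i, 0.0) for i in LAMBDA) / 1000.0

def passes_checks(conc_ueq_l, measured_ec_us_cm, balance_limit=15.0, ec_limit=20.0):
    """True if both the ion balance and the EC agreement fall within the (illustrative) limits."""
    ec_diff = 100.0 * abs(calculated_ec_us_cm(conc_ueq_l) - measured_ec_us_cm) / measured_ec_us_cm
    return abs(ion_balance_percent(conc_ueq_l)) <= balance_limit and ec_diff <= ec_limit
```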

Statistical analyses

Student’s t-test was used to compare two groups (e.g., alkalinity during different periods), to compare two linear regression lines (e.g., the relationships between the SO4 2− concentration and stream discharge in different periods), and to test the significance of correlations between two variables. To detect long-term trends, we used the seasonal Mann-Kendall test for stream water quality parameters (pH, alkalinity, and ion concentrations in stream water) and the Mann-Kendall test for yearly measured data (drought index, soil pH, BAI, and annual mean concentrations in wet deposition), as described by Gilbert (1987).
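
The trend tests can be sketched as follows; this minimal implementation omits the tie correction to the variance described by Gilbert (1987), and the function names are illustrative.

```python
# Sketch of the Mann-Kendall and seasonal Mann-Kendall trend tests
# (no tie correction to the variance).
import math

def mann_kendall_s(series):
    """Mann-Kendall S statistic and its variance for one series."""
    x = [v for v in series if v is not None]
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    return s, var_s

def seasonal_mann_kendall_z(series_by_season):
    """Z score of the seasonal test: S and Var(S) are summed over the seasons."""
    s_total, var_total = 0.0, 0.0
    for season_series in series_by_season:
        s, var_s = mann_kendall_s(season_series)
        s_total += s
        var_total += var_s
    if var_total == 0 or s_total == 0:
        return 0.0
    correction = -1.0 if s_total > 0 else 1.0  # continuity correction
    return (s_total + correction) / math.sqrt(var_total)

# |Z| > 1.96 indicates a trend significant at p < 0.05 (two-sided).
```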

Results

Climate

At the study site, summers have been becoming drier since 1950 (Fig. 2a). The drought index showed a significant negative trend (p < 0.01), reflecting both increasing temperature (p < 0.01) and decreasing precipitation (p < 0.01; data not shown). However, the drought index remained positive until the early 1990s (Fig. 2a), suggesting that precipitation had been sufficient to maintain forest growth. The summer of 1994 was the driest summer since 1950. Total precipitation in July 1994 was only 28 mm, which was less than one-tenth of the long-term average monthly precipitation in July (347 mm) and the lowest total precipitation in July since 1950 (Climate Statistics Database 2005). Total precipitation of 203 mm in August 1994 was similar to the long-term average monthly precipitation in August (228 mm). In addition to 1994, summer drought stress occurred in 2000 and 2002 (Fig. 2a). The annual mean precipitation during the monitoring period (1989–2003) was 2,495 mm (Fig. 3a). Annual precipitation decreased from 3,156 mm in 1989 to 1,604 mm in 1994 and increased thereafter (Fig. 3a). The mean annual discharge was 1,653 mm, and annual discharge followed the pattern of annual precipitation (Fig. 3a).

Fig. 2 a Drought index from June to August, calculated as the sum of precipitation minus the sum of potential evapotranspiration during the 3-month summer period. b Mean basal area increment (BAI) of Japanese cedar (n = 50) and Japanese cypress (n = 50). Average ages in 2004 were 41 years for Japanese cedar and 42 years for Japanese cypress. The arrow indicates the year of the most severe summer drought (1994)

Fig. 3 a Annual precipitation and annual stream discharge in the study area, b volume-weighted annual mean concentrations of H+, SO42−, and dissolved inorganic nitrogen (DIN) in wet precipitation, and c annual wet depositions of hydrogen, sulfur, and nitrogen

Forest growth

The BAI of Japanese cedar and Japanese cypress increased during the 1960s and 1970s, reached a maximum around the early 1980s, and decreased from the mid-1980s onward (Fig. 2b). The peak in BAI corresponded to stand ages of 18 years for Japanese cedar and 21 years for Japanese cypress. After the peak, BAI decreased in a stair-step pattern, with BAI showing two significant periods of decrease: the first decrease began in 1982 for Japanese cedar (p < 0.001) and in 1984 for Japanese cypress (p < 0.01) and ceased around 1988, and the second decrease began in 1994 for Japanese cedar (p < 0.01) and in 1993 for Japanese cypress (p < 0.01) and ceased around 1998. The study site has suffered chronic defoliation and dieback of Japanese red pine since 1977, and the heaviest damage was observed in 1992 (Yamagata City Office of Gifu Prefecture, personal communication). However, no quantitative information on defoliation and dieback (e.g., change in basal area due to dieback and area of damage) was available.

Atmospheric deposition

The volume-weighted annual mean concentrations of dissolved inorganic nitrogen (DIN, NO3 plus NH4 +) and of SO4 2− in wet deposition ranged from 33 to 48 μmolc l−1 and 25 to 44 μmolc l−1, respectively, and did not show clear trends (Fig. 3b). Annual wet depositions of nitrogen and sulfur (Fig. 3c) appear to have been determined mainly by annual precipitation (Fig. 3a), and generally decreased until 1994 and increased thereafter (Fig. 3c). The volume-weighted annual mean concentrations of H+ in wet deposition generally increased throughout the study period, with the exception of a decrease between 1994 and 1996 (Fig. 3b, p < 0.05). Annual wet deposition of H+ remained constant until 1993, decreased from 1993 to 1994, and increased thereafter (Fig. 3c). Because wet deposition was collected until 1998 using a bulk sampler at the outlet of the catchment (Fig. 1) and since 1999 using a wet-only sampler on the middle slope of the catchment (Fig. 1), caution is necessary when discussing trends in atmospheric deposition from the mid-1990s to the early 2000s. However, the trend of increasing atmospheric deposition from 1994 to the early 2000s was observed throughout Japan (Committee for Acid Deposition Measures 2004), suggesting that the overall trend in the present study is accurate. Mean annual wet depositions were 0.68 kmol ha−1 year−1 for H+, 0.42 kmol ha−1 year−1 for SO4 2−, and 1.00 kmol ha−1 year−1 for DIN. Dry depositions measured in 2003/2004 totaled 0.15 kmol ha−1 year−1 for H+, 0.15 kmol ha−1 year−1 for SO4 2−, and 0.37 kmol ha−1 year−1 for DIN. Thus, total depositions (wet + dry) were 0.83, 0.57, and 1.37 kmol ha−1 year−1, respectively.
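
The volume-weighted means and annual wet depositions reported above can be derived from the per-sample data as sketched below; the unit conventions (μmolc l−1 concentrations, mm precipitation depths, and a deposition result in charge equivalents, kmolc ha−1, rather than the molar units used in the text) and the function names are illustrative.

```python
# Sketch of volume-weighted annual mean concentration and annual wet deposition
# from per-period concentrations (umolc L-1) and precipitation depths (mm).

def volume_weighted_mean(conc_umolc_l, precip_mm):
    """Volume-weighted annual mean concentration (umolc L-1)."""
    return (sum(c * p for c, p in zip(conc_umolc_l, precip_mm))
            / sum(precip_mm))

def annual_wet_deposition(conc_umolc_l, precip_mm):
    """Annual wet deposition (kmolc ha-1): 1 mm over 1 ha = 10,000 L and 1e9 umolc = 1 kmolc."""
    return sum(c * p for c, p in zip(conc_umolc_l, precip_mm)) * 1.0e-5
```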

Soil chemistry

The average pH of the surface mineral soils decreased from 4.5 in 1990 to 3.9 in 2003 (Fig. 4a). The soil pH in 2003 was significantly lower than that in 1990 in all stands (p < 0.01 in four stands, p < 0.05 in one stand). The average exchangeable Al increased slightly, from 10.4 cmolc kg−1 in 1990 to 12.0 cmolc kg−1 in 2004 (Fig. 4b). On average, exchangeable Al accounted for 88% of the effective CEC (the sum of exchangeable basic cations and Al).

Fig. 4 a Soil pH, b exchangeable Al, and c exchangeable Ca2+ of the surface mineral soils (A horizon). Data represent mean values of five samples from each plot. Error bars represent SEs. The legend shows dominant species in each stand in decreasing order

There appeared to be perturbations in soil chemistry around 1996/1997. Generally, soil pH decreased, exchangeable Al increased, and exchangeable Ca2+ remained low until 1994/1995 (Fig. 4). The pH in 1995 was significantly lower than that in 1990 in three of five stands (p < 0.01 in two stands, p < 0.05 in one stand). In two stands that contain pine, the decrease in soil pH was less obvious (p = 0.18 and 0.17). The average rate of decrease of soil pH during this period was 0.07 units per year. Exchangeable Ca2+ increased from 1994 onward and showed a large peak in 1996/1997 (Fig. 4c). Exchangeable Ca2+ in 1996/1997 was about 10 times the value in 1992 in four of five stands. The peak of exchangeable Ca2+ in 1996/1997 corresponded to an increase in soil pH of 0.2–0.5 units in three of five stands in 1996 (Fig. 4a). In the three stands, the soil pH in 1996 was significantly higher than that in 1995 (p < 0.01 in two stands, p < 0.05 in one stand), and the soil pH in 1997 was significantly lower than that in 1996 (p < 0.01 in one stand, p < 0.05 in two stands). After these peaks in 1996/1997, exchangeable Al generally did not increase (Fig. 4b), and soil pH remained relatively constant in the two of five stands that contained pine (Fig. 4a).

Stream water pH and alkalinity

Although there was considerable variability from one sampling period to the next, stream water pH showed an overall increase until 1996/1997 (p < 0.01; Fig. 5a), followed by an overall decrease thereafter. Changes in stream water alkalinity (Fig. 5b) differed from the changes in pH. Stream water alkalinity showed no clear trend before 1995 (p = 0.50), increased from 1995 to 1996/1997 to concentrations higher than those before 1995 (p < 0.001), and decreased from 1996/1997 onward (p < 0.05). Average alkalinity was 134 μmolc l−1.

Fig. 5 Stream water pH (a) and alkalinity (b)

Stream water NO3 and SO4 2−

The stream water NO3 concentration increased throughout the monitoring period (p < 0.001; Fig. 6a). The average NO3 concentration roughly doubled from the late 1980s and early 1990s (22 μmolc l−1) to the early 2000s (42 μmolc l−1). A peak NO3 concentration of 80 μmolc l−1 was observed in August 1994, when the water level was very low due to extremely low precipitation in July 1994 (Figs. 3a, 8a).

Fig. 6 Concentrations of a NO3 and b SO4 2− in stream water. Closed circles represent data observed in summer (July, August, and September) and open circles represent data observed in the other seasons

Prior to 1996/1997, the stream water NO3 concentration decreased significantly with increasing stream water temperature (r 2 = 0.24, p < 0.01; Fig. 8a). After 1996/1997, however, the stream water NO3 concentration showed no correlation with stream water temperature (r 2 = 0.02, p = 0.46; Fig. 8a). The stream water NO3 concentration showed no correlation with stream discharge until 1997 (r 2 = 0.00, p = 0.96), but increased significantly with increasing stream discharge beginning in 1998 (r 2 = 0.49, p < 0.01; Fig. 7a).

Fig. 7 Relationships between a stream discharge and the stream water NO3 concentration, and b the log-transformed stream discharge and stream water SO4 2− concentration. The NO3 concentrations for August 1994 were excluded from the linear regression analysis

The stream water SO4 2− concentration showed a decreasing tendency until 1993 (p = 0.10; Fig. 6b). The stream water SO4 2− concentration increased greatly, to a maximum of 295 μmolc l−1 in August 1994, remained relatively high until around 1996, and then decreased gradually. The stream water SO4 2− concentration decreased with increasing discharge (r 2 = 0.40, p < 0.0001; Fig. 7b) and was higher from 1994 to 1996 than during other periods (p < 0.05; Fig. 7b).

The ratio of total basic cations to total strong-acid anions, (K+ + Ca2+ + Mg2+)/(NO3  + SO4 2−), increased until 1996, except for a period of decrease from 1993 to 1995. The ratio decreased from 1996 onward, except for a peak around 2001/2002 (Fig. 9). The increase in this ratio until 1996 mainly reflected the decrease in the stream water SO4 2− concentration until 1994 (Fig. 6b), whereas the decrease since 1996/1997 mainly reflected the increase in the stream water NO3 concentration since 1996/1997 (Fig. 6a).

Discussion

Atmospheric deposition

It seems difficult to attribute all the changes in stream water chemistry at the study site to the changes in atmospheric deposition (Fig. 3c); however, deposition appeared to be partly responsible for some of these changes. For example, the increase in H+ deposition since 1994 (Fig. 3c), the increase in nitrogen deposition since 1994 (Fig. 3c), and the decrease in sulfur deposition until 1994 may have been partly responsible for the trends in stream water pH (Fig. 5), NO3 (Fig. 6a), and SO4 2− (Fig. 6b), respectively, during the corresponding periods. However, many changes in stream water did not appear to follow the changes in atmospheric deposition. For example, although nitrogen deposition decreased until 1994 (Fig. 3c), the stream water NO3 concentration did not decrease during this period (Fig. 6a). Nor do changes in nitrogen deposition (Fig. 3c) explain the positive correlation between NO3 concentration and stream discharge since 1998 (Fig. 7a) or the loss of seasonality of NO3 concentrations since 1996/1997 (Figs. 6a, 8). Although H+ deposition remained fairly constant until 1993 (Fig. 3c), stream water pH increased during this period (Fig. 5a). The large peak in the stream water SO4 2− concentration starting in 1994 (Fig. 6b) cannot be explained by the change in wet sulfur deposition (Fig. 3c). These disagreements between changes in atmospheric deposition and stream water chemistry suggest that changes in nitrogen and sulfur cycles, and the accompanying changes in biogeochemical reactions that produce and consume H+ at the study site, had stronger impacts on stream water chemistry than the changes in atmospheric deposition.

Fig. 8 Relationship between stream water temperatures and concentrations of stream water NO3 a until 1996/1997 and b thereafter. Data for August 1994 were excluded from the linear regression

Soil acidification

The average soil pH of 3.9 measured in 2003 at the study site was among the lowest observed in Japanese forests (Takahashi et al. 2001). Surface mineral soils became more strongly acidified until the mid-1990s. Generally, soil pH decreased, exchangeable Al increased, and exchangeable Ca2+ remained low until 1994/1995 (Fig. 4). Previous studies of long-term changes in soil pH showed that the rate of decrease of soil pH in surface mineral soils ranged from 0.01 to 0.03 units per year in natural forests (Drohan and Sharpe 1997; Jonsson et al. 2002; Bailey et al. 2005) and from 0.03 to 0.05 units per year in plantation forests (Binkley et al. 1989; Knoepp and Swank 1994; Richter et al. 1994; Ritter et al. 2003). Thus, the rate of decrease of 0.07 pH units per year in this study was noticeably higher than the previously reported values. Although the primary H+ source for forest soil acidification is net excretion of H+ from plant roots (Markewitz et al. 1998; Fujii et al. 2008), the contribution of H+ supplied by atmospheric deposition can be considerable in temperate forests (Markewitz et al. 1998). At the study site, however, the contribution from this source is unknown because no measurements of the H+ budget were obtained.

There appeared to be perturbations in soil chemistry in the mid-1990s. Exchangeable Ca2+ began to increase in 1994 and showed a large peak in 1996/1997 (Fig. 4c). The size of the exchangeable Ca2+ pool in soils is determined by the balance among the Ca2+ supply rate (weathering of Ca2+-bearing primary minerals and Ca2+ release through litter decomposition), the Ca2+ consumption rate (Ca2+ uptake by vegetation), and the Ca2+ leaching rate. Because the weathering rate is primarily controlled by temperature and precipitation, it is unlikely that the weathering rate at the study site suddenly increased from 1994 to 1996/1997. It is also unlikely that a decrease in Ca2+ leaching resulted in the increased pool of exchangeable Ca2+. Because annual precipitation increased from 1994 to 1996/1997 (Fig. 3a), Ca2+ leaching from surface soils should have increased during this period. Thus, possible causes for the increased pool of soil exchangeable Ca2+ during this period (Fig. 4c) were an increase in litterfall, a decrease in Ca2+ uptake by vegetation, or a combination of both. The 1994 summer drought might have increased needle litterfall in Japanese cedar, cypress, and red pine stands in 1994 and 1995 (Kouki and Hokkanen 1992), and the enhanced Ca2+ leaching from O horizons to mineral soil horizons might have resulted in the peak of exchangeable Ca2+ in 1996/1997 (Baba et al. 2004). In addition, the 1994 summer drought might have decreased Ca2+ uptake by vegetation in 1994 and the following several years (Spiecker 1995; Ciais et al. 2005; Dobbertin 2005).

Soil acidification (Fig. 4a) appeared to be alleviated after the peak of exchangeable Ca2+ in 1996/1997 (Fig. 4c). This peak resulted in a small peak in soil pH in 1996 (Fig. 4a). After this peak, soil pH remained stable in two of the five stands (Fig. 4a), and exchangeable Al generally stopped increasing (Fig. 4b). There are three possible explanations for this alleviation of soil acidification: (1) the H+ load might have been consumed by exchange reactions with exchangeable Ca2+, (2) a decline in the uptake of basic cations by vegetation due to the 1994 summer drought might have delayed soil acidification, or (3) a combination of both processes might have occurred.

Stream water NO3 and nitrogen saturation

The nitrogen deposition of 1.37 kmol ha−1 year−1 at the study site exceeded the range of thresholds (0.5–0.7 kmol ha−1 year−1) that are reported to enhance NO3 discharge into surface waters (Dise et al. 1998; MacDonald et al. 2002; Aber et al. 2003), and the stream water NO3 concentration increased throughout the monitoring period (Fig. 6a). Because the DIN concentration in precipitation did not change (Fig. 3b) and forest management that enhances NO3 leaching (Gundersen et al. 2006) was not conducted during the monitoring period at the study site (Kakumu et al. 2005), the cause of the increase in the NO3 concentration of stream water must be attributed to factors other than changes in nitrogen deposition and forest management.

The factors responsible for the control of stream water NO3 concentration appeared to differ before and after 1996/1997. Prior to 1996/1997, the stream water NO3 concentration decreased with increasing stream water temperature (Fig. 8a), probably due to enhanced nitrogen uptake by vegetation during the growing season. In contrast, a high stream water NO3 concentration was often observed during the summer from 1996/1997 onward (Figs. 6a, 8b). In addition, stream water NO3 concentrations began to increase with increasing stream discharge in 1998 (Fig. 7a). This positive correlation since 1998 (Fig. 7a) suggests that direct discharge of NO3 from surface soil horizons became a major source for NO3 discharge during this period and that the size of the NO3 pool in surface soils increased considerably from 1998 onward. These changes suggest that the study site progressed toward nitrogen saturation around 1996/1997 (Aber et al. 1989, 1998; Stoddard 1994). According to the classification proposed by Stoddard (1994), the stage of nitrogen saturation at the study site shifted from stage 1 before 1996/1997 to stage 2 thereafter.

Increases in the stream water NO3 concentration during the growing season can occur through enhanced nitrogen mineralization and nitrification in soils, reduction in nitrogen uptake by vegetation, or a combination of both. There are four possible reasons for the loss of seasonality in stream water NO3 concentration since 1996/1997: a change in soil chemistry in 1996/1997, maturation of plantation stands of Japanese cypress and cedar, a climate anomaly (the cold summer in 1993 and summer drought in 1994), and defoliation and dieback of Japanese red pine (which peaked in 1992). In the following section, we discuss these possible factors.

Causes for nitrogen saturation

Changes in soil chemistry. The coincidence in 1996/1997 of the large peak in exchangeable Ca2+ in surface mineral soils (Fig. 4c) with the start of the loss of seasonality in the stream water NO3 concentration (Figs. 6a, 8) suggests that the enhanced pool of exchangeable Ca2+ was partially responsible for the enhanced NO3 discharge during the growing season. The increase in exchangeable Ca2+ in 1996/1997 was accompanied by a pH increase of 0.2–0.5 units in surface mineral soils (Fig. 4a). The simultaneous increases in pH and exchangeable Ca2+ observed in the present study (Fig. 4a, c) are similar to the soil chemical changes induced by lime application. The application of lime to forest soils stimulates net nitrification and NO3 leaching from soils (Robertson 1982; Marschner et al. 1992; De Boer et al. 1993; Kreutzer 1995; Geary and Driscoll 1996; Simmons et al. 1996; De Boer and Kowalchuk 2001; Nilsson et al. 2001; Nohrstedt 2002). Likewise, the alkalinization in 1996/1997 at the study site (Fig. 4a, c) should have accelerated net nitrification and NO3 leaching, especially during the summer, when high temperatures would enhance microbial activity. After the peaks in 1996/1997, the soil pH and exchangeable Ca2+ decreased sharply within a year (Fig. 4a, c), probably due to enhanced H+-generating nitrification in surface soils and a subsequent increase in Ca2+ leaching from surface soils.

Maturation of plantation stands. In even-aged plantation forests, primary production reaches its maximum at a relatively early stand age and decreases after the peak (Ryan et al. 1997). For Japanese cedar and Japanese cypress plantations, the decline in growth rate begins at stand ages of 20–40 years (Sakaguchi 1983). At the study site, the stand ages at the maximum growth rate, expressed as BAI, were 18 years for Japanese cedar and 21 years for Japanese cypress (Fig. 2b). Thus, in plantation stands above these ages, nitrogen uptake rates should have been decreasing.

However, it seems unlikely that the maturation of plantation stands was a major factor in the increase in NO3 discharge since 1996/1997. The Japanese cypress stands were young: in 1996/1997, about 50% of the cypress stands were younger than 20 years and 90% of cypress stands were younger than 30 years. In addition, Japanese cedar stands occupied only 8% of the catchment. Because cedar stands were mainly 30–50 years old in 1996/1997, the nutrient uptake rate should have been decreasing. However, it seems difficult to attribute a doubling of the concentration of stream water NO3 (Fig. 6a) to maturing of cedar stands that occupied less than 10% of the catchment area.

A climate anomaly in 1993/1994. The anomalously cold summer in 1993 and the summer drought in 1994 might have decreased nitrogen retention since the mid-1990s. In addition to forest maturation, climatic conditions appeared to have affected the forest growth rate at the study site. The decrease in BAI during the 1980s and 1990s showed a stair-step pattern rather than a monotonic pattern (Fig. 2b). This pattern suggests that factors other than forest age had a strong effect on forest growth. The fact that the second decrease in BAI, in 1993/1994 (Fig. 2b), coincided with the climate anomaly suggests that the anomaly might have triggered this second decrease.

Extreme climatic conditions during the growing season have considerable effects on forest growth. Summer drought has negative effects on forest growth in the current year and in the following several years (Spiecker 1995; Ciais et al. 2005; Dobbertin 2005). A cold summer often decreases radial growth in the current year (Pfeifer et al. 2005; Chhin et al. 2008). In central Japan, several studies noted declines in forest growth in 1994 and the following several years. Kakumu et al. (2005) reported a decrease in BAI that began in 1993/1994. Saigusa et al. (2005) noted a decline in gross primary production from 1994 to 1995 and a subsequent gradual increase in its value until the early 2000s. Finally, Kodani et al. (1997) reported a reduced growth rate and increased mortality in 1994 and 1995. These observations strongly suggest that the 1994 summer drought had a considerable negative impact on forest growth in 1994 and the following several years in central Japan.

The summer drought in 1994 might have triggered perturbations in biogeochemical processes at the study site. Several changes in soil and stream water chemistry began in 1994 and 1995. Exchangeable Ca2+ in the surface mineral horizons began to increase in 1994 (Fig. 4c). The increase in exchangeable Al generally stopped around 1994/1995 (Fig. 4b). Very high NO3 and SO4 2− concentrations were observed in the summer of 1994 (Fig. 6). There is some evidence of an increase in stream water alkalinity from 1995 to 1996 (Fig. 5b).

Defoliation and dieback of Japanese red pine. Defoliation and dieback of Japanese red pine due to pine wilt disease, caused by the nematode Bursaphelenchus xylophilus, should have decreased nitrogen retention. Defoliation stimulates nitrogen mineralization and nitrification in surface soils (Lewis and Likens 2007) and increases the stream water NO3 concentration for several years, after a lag of several months to years following defoliation (Webb et al. 1995; Eshleman et al. 1998; Tokuchi et al. 2004; Lewis and Likens 2007). At the study site, however, it seems unlikely that the defoliation was a major factor in the increase in NO3 discharge since 1996/1997 because red pine stands occupied only 15% of the catchment area.

Flushing of NO3 in August 1994

The remarkably high NO3 concentration of 80 μmolc l−1 in August 1994 (Fig. 6a) was observed during a time of very low discharge (Fig. 7a), suggesting that the control of NO3 concentration in August 1994 differed from that of other periods. A sharp and high peak of NO3 concentrations in soil solutions and stream waters after drought requires reduced nutrient uptake (Lamersdorf et al. 1997; Xu et al. 1998) as well as NO3 flushing by the rewetting of soils (Reynolds et al. 1992; Cui and Caldwell 1997). Therefore, rewetting of severely dried soils after summer drought in July 1994 and reduced nutrient uptake by vegetation due to the drought should have been the primary causes for the high NO3 concentration.

Stream water SO4 2−

The stream water SO4 2− concentrations from 1994 to 1996 were higher than those of other periods (Fig. 7b). The 1994 summer drought might have been responsible for the enhanced SO4 2− concentration. Large peaks in stream water SO4 2− concentration are often observed after summer drought (Eimers and Dillon 2002; Lewis and Likens 2007) because sulfur mineralization in O horizons is stimulated when warm and dry periods are broken by rainstorms (Eimers et al. 2004). Similarly, rewetting of the O horizon after the 1994 summer drought might have resulted in enhanced mineralization of sulfur in the O horizon and an increase in the stream water SO4 2− concentration at the study site.

Stream water acidification

The difference between trends in stream water pH (Fig. 5a) and soil pH (Fig. 4a) suggests that their major controls differ. The alkalinity and pH of surface waters are controlled by the balance between the supply of basic cations (K+, Ca2+, Mg2+) and the supply of strong-acid anions (NO3 and SO4 2−) from soils to surface waters (Cosby et al. 1985; Stoddard et al. 1999). At the study site, the ratio of total basic cations to total strong-acid anions (Fig. 9) showed a similar trend to that of stream water pH (Fig. 5a): the ratio generally increased until 1996 and decreased thereafter. This similarity in trends was in good agreement with the general theory of surface water acidification (Cosby et al. 1985). At the study site, five major events may have considerably affected the stream water pH and alkalinity by changing the balance of cation and anion supplies to streams: the change in nitrogen and sulfur depositions, which decreased until 1994 and increased thereafter (Fig. 3c); the decrease in the stream water SO4 2− concentration until 1994 (Fig. 6b); the loss of seasonality of stream water NO3 concentration after 1996/1997 (Figs. 6a, 8); the increase in stream water SO4 2− concentrations from 1994 to 1996 (Fig. 6b); and the increase in exchangeable Ca2+ in surface mineral soils, which peaked in 1996/1997 (Fig. 4c).

Fig. 9 Ratio of total basic cations to total strong-acid anions, (K+ + Ca2+ + Mg2+)/(NO3 + SO4 2−), in stream water

Annual precipitation (Fig. 3a), annual discharge (Fig. 3a), and atmospheric deposition (Fig. 3c) generally decreased until 1994 and increased thereafter. These changes might have been primary causes for the change in stream water pH, which increased until 1996/1997 and decreased thereafter (Fig. 5a), with a time lag of several years in the response. However, it is doubtful that the changes in precipitation, discharge, and atmospheric depositions were the main causes for the changes in stream water pH. Matsubara et al. (2009) investigated the changes in stream water pH during the same period as our study in 15 forested catchments within 50 km of our study site and found three sites with increasing trends, eight with decreasing trends, and four with no clear trend. They did not observe the pattern of change in stream water pH (an increase until the mid-1990s and a decrease thereafter) found in our study (Fig. 5a). If the changes in atmospheric depositions, precipitation, and discharge (Fig. 3a, c) had been major controls on stream water pH at our study site (Fig. 5a), similar changes in stream water pH should have been observed in neighboring areas that experienced similar changes in annual precipitation (Climate Statistics Database 2005). It is therefore difficult to conclude that the changes in atmospheric depositions, precipitation, and discharge (Fig. 3a, c) were major causes of the changes in stream water pH, which suggests instead that changes in biogeochemical processes at the study site were the main causes (Fig. 5a).

The loss of seasonality of stream water NO3 concentration since 1996/1997 (Figs. 6a, 8) appeared to be the main cause for the decline in stream water pH (Fig. 5a) and alkalinity (Fig. 5b) since 1996/1997. The start of stream water acidification (Fig. 5) and the loss of seasonality of stream water NO3 (Fig. 8) coincided in 1996/1997. The decrease in the ratio (K+ + Ca2+ + Mg2+)/(NO3  + SO4 2−) in stream water since 1996 (Fig. 9) mainly reflected the increase in the stream water NO3 concentration since 1996/1997 (Fig. 6a). The decrease in the stream water SO4 2− concentration until 1994 (Fig. 6b) appeared to be the main cause for the increase in stream water pH (Fig. 5a) until 1996/1997 because the increase in the ratio (K+ + Ca2+ + Mg2+)/(NO3  + SO4 2−) in stream water until 1996 (Fig. 9) mainly reflected the decrease in the stream water SO4 2− concentration until 1994 (Fig. 6b). There are two possible causes for the decreasing stream water SO4 2− concentration prior to 1994 (Fig. 6b). First, the decreasing sulfur deposition until 1994 (Fig. 3c) may have resulted in the decreasing stream water SO4 2− concentration (Fig. 6b). However, this mechanism seems unlikely because the increasing sulfur deposition after 1994 (Fig. 3c) was not accompanied by an increasing stream water SO4 2− concentration (Fig. 6b). Second, increasing retention of SO4 2− in surface mineral soils may have resulted in the decreasing stream water SO4 2− concentration. Until the mid-1990s, soil pH had decreased at a rate of 0.07 units per year (Fig. 4a). This rate of decrease was clearly higher than previously reported rates of 0.01–0.03 units per year in natural forests (Drohan and Sharpe 1997; Jonsson et al. 2002; Bailey et al. 2005) and 0.03–0.05 units per year in plantation forests (Binkley et al. 1989; Knoepp and Swank 1994; Richter et al. 1994; Ritter et al. 2003). The continuous and rapid decrease in soil pH during this period (Fig. 4a) may have increased SO4 2− adsorption on soil minerals and decreased SO4 2− discharge into streams (Nodvin et al. 1988).

Although stream water pH and alkalinity showed decreasing trends after 1996/1997 (Fig. 5), the change in stream water alkalinity (Fig. 5b) followed neither the trend in stream water pH (Fig. 5a) nor the trend in the ratio (K+ + Ca2+ + Mg2+)/(NO3 + SO4 2−) prior to 1996/1997 (Fig. 9). This disagreement between stream water alkalinity (Fig. 5b) and pH (Fig. 5a) suggests that the controls on these parameters are slightly different. Stream water alkalinity (Fig. 5b) apparently followed the change in H+ deposition (Fig. 3c), with a time lag of several years in the response. H+ deposition remained fairly constant until 1993 (Fig. 3c), and stream water alkalinity remained fairly constant until 1995 (Fig. 5b). H+ deposition suddenly decreased from 1993 to 1994 (Fig. 3c), and stream water alkalinity increased from 1995 to 1996/1997 (Fig. 5b). H+ deposition increased after 1994 (Fig. 3c), and stream water alkalinity decreased after 1996/1997 (Fig. 5b). These correspondences might suggest that H+ deposition had stronger effects on stream water alkalinity than on stream water pH. Another possible cause for the increase in stream water alkalinity from 1995 to 1996/1997 (Fig. 5b) is an enhanced supply of basic cations from soils into streams. The coincident peaks of stream water alkalinity (Fig. 5b) and exchangeable Ca2+ (Fig. 4c) in 1996/1997 suggest that an enhanced pool of exchangeable Ca2+ increased the Ca2+ supply to streams, resulting in enhanced stream water alkalinity in 1996/1997 (Houle et al. 2006).

Conclusions

Stream water pH and alkalinity began to decrease in 1996/1997 in an acid-sensitive forested catchment in central Japan. The enhanced NO3 discharge during the growing season that has occurred since 1996/1997 appeared to be the primary cause of the stream water acidification. The enhanced atmospheric depositions of H+, SO4 2−, and DIN since 1994 may have had additional effects on surface water acidification. At the study site, the balance between anion discharge and cation discharge appeared to be a major control on stream water pH, whereas H+ deposition and the pools of exchangeable cations in soils appeared to be major controls on stream water alkalinity.

The catchment appeared to become nitrogen saturated (changing from stage 1 to stage 2) in 1996/1997. Nitrogen deposition of 1.37 kmol ha−1 year−1 in the catchment exceeded the widely observed range of thresholds of 0.5–0.7 kmol ha−1 year−1 that are reported to enhance NO3 discharge into surface waters in Europe and North America. Prior to 1996/1997, however, deposited nitrogen had been retained fairly well. Forest growth was high enough to retain nitrogen and kept NO3 leaching from soils to surface waters at low levels. As a result, the stream water NO3 concentration remained relatively low and decreased during the growing season due to nitrogen uptake by vegetation. There seem to be two direct causes for the enhanced NO3 discharge that started in 1996/1997. The first was a decline in nitrogen retention by vegetation since the mid-1990s. A severe summer drought in 1994 decreased forest growth rates in 1994 and the following several years in central and western Japan. The decline in forest growth should have decreased nitrogen retention in the catchment. Defoliation and dieback due to pine wilt disease, which peaked in 1992, should also have decreased nitrogen retention in Japanese red pine stands. Nitrogen retention in Japanese cedar stands should have been decreasing as a result of maturation of these stands throughout the monitoring period. The second cause was a stimulation of net nitrification in 1996/1997. The 1994 summer drought resulted in an increase in the pool of exchangeable Ca2+ in surface mineral soils from 1994 to 1996/1997, presumably through enhanced needle litterfall in coniferous stands and reduced Ca2+ uptake by vegetation. The peak in the pool of exchangeable Ca2+ resulted in an increase of soil pH by 0.2–0.5 units. The sudden increase in soil pH may have accelerated H+-generating net nitrification and resulted in the accumulation of NO3 in surface soils. After 1996/1997, the stream water NO3 concentration increased with increasing stream discharge, and the seasonality of stream water NO3 concentration was lost.