Spent nuclear fuel from the Finnish nuclear power plants is disposed of in bedrock, encapsulated in copper canisters. The nominal canister wall thickness is 50 mm, and the design lifetime is 100 000 years. The allowed corrosion rates are very low, and the most corrosive, oxic period is assumed to last only a few years at the beginning of disposal. In this paper we have studied the effect of an initial oxide film on the corrosion rate, as oxide films can catalyze the cathodic reduction of oxygen and thus increase the corrosion rate of copper. The research was done by monitoring corrosion rates with linear polarization resistance (LPR) measurements and comparing the corrosion rates of oxidized and non-oxidized specimens. The test environments were synthetic ground water and bentonite clay pore water under air purging and nitrogen purging at temperatures of 20–80 °C. The measured corrosion rates varied from less than one to almost 40 μm a⁻¹. Comparison of the environmental variables indicated that the corrosion rates in pH = 10 pore water were 60–70% lower than in pH = 8 ground water, whereas the effect of dissolved oxygen was small. Activation energy calculations showed that corrosion was controlled by charge transfer, and in some cases it was under mixed control. The comparisons to determine the effect of the oxide films were made using power-law modelling, box-and-whisker plots, and statistical t-tests. The analyses showed that in most cases the oxide film has no effect on corrosion; only in air-purged ground water at elevated temperatures did the oxide film increase corrosion.
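The two analysis steps summarized above can be sketched in code. The following is a minimal illustration, not the authors' actual procedure: it converts a polarization resistance from an LPR measurement into a corrosion rate via the Stern–Geary relation and Faraday's law, and compares two groups of rates with a Welch t-test. The Stern–Geary constant B = 26 mV, the equivalent weight of copper (31.77 g/eq, assuming a two-electron dissolution), and the sample data are all illustrative assumptions.

```python
import math
from statistics import mean, stdev

def corrosion_rate_um_per_year(rp_ohm_cm2, b_v=0.026,
                               ew_g=31.77, density_g_cm3=8.96):
    """Polarization resistance (ohm*cm^2) -> corrosion rate (um/a).

    b_v: assumed Stern-Geary constant B in volts.
    ew_g: assumed equivalent weight of Cu (g/eq, two-electron reaction).
    """
    i_corr = b_v / rp_ohm_cm2                      # A/cm^2 (Stern-Geary)
    rate_cm_s = i_corr * ew_g / (96485.0 * density_g_cm3)  # Faraday's law
    return rate_cm_s * 1e4 * 3600 * 24 * 365.25    # cm/s -> um/year

def welch_t(a, b):
    """Welch's unequal-variance t statistic and degrees of freedom."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical rates (um/a) for oxidized vs. non-oxidized specimens:
oxidized = [5.1, 4.8, 5.4, 5.2]
bare = [4.9, 5.0, 5.3, 4.7]
print(corrosion_rate_um_per_year(1e5))   # Rp = 100 kohm*cm^2
print(welch_t(oxidized, bare))
```

With these assumed constants, an Rp of 100 kΩ·cm² maps to a rate of about 3 μm a⁻¹, i.e. within the span of rates reported above; a small |t| relative to the critical value at the computed degrees of freedom would indicate no significant oxide-film effect.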