Evaluating & Improving a Steel for Impact Wear Resistance


Earth-moving equipment and associated tools are subject to a variety of wear mechanisms, which makes it challenging to qualify a single best material for general application.

Preference for quenched and tempered steels for ground engaging tools is often driven by other performance criteria, e.g. fatigue and notch or fracture toughness. Even in a loose-soil work site, occasional buried and immovable obstacles produce impact loading, and quenched and tempered steels provide better fracture resistance.

In a study, a Mn-Si-Mo-V steel was formulated to minimize solidification shrinkage porosity with the goal of obtaining a high-strength, impact-resistant steel. Three chemistries were produced: a baseline, the baseline plus 2 wt.% nickel (Ni), and the baseline with 1.5 wt.% Ni but no vanadium (V), all keeping a manganese-to-carbon ratio of 5:1 (Table 1). Steels were heat treated to produce an average prior austenite grain size of 50-70 µm, a lath martensitic microstructure, and a Stage I tempered hardness of 525-560 HBW (53-55 HRC). Yield strength and ultimate tensile strength averaged 1482 MPa (215 ksi) and 1930 MPa (280 ksi), respectively.

Nickel additions proved effective in enhancing both tensile ductility and impact toughness. The alloy without vanadium showed the least dependence of tensile ductility on distance from the chill, with elongation varying between 11% and 7%. Impact wear was simulated using a gouging abrasion test, and the alloys demonstrated a 27% to 46% reduction in wear relative to steels currently employed.

All three alloys produced the expected Stage I tempered martensitic microstructure, but the measured hardness was lower than expected for these steels. Most notable in Table 2 is the greater difference between calculated and measured hardness for the Ni-containing alloys. The addition of Ni lowers both the martensite start temperature and the rate of martensite transformation.

Plate martensite is more common when formed below 392F (200C), and as much as 20% plate martensite may be expected for the baseline +Ni steel. Metallographic results appear to corroborate the calculated kinetics. The measured prior austenite grain size was in reasonable agreement with the size calculated for the time and temperature at which the steel was held prior to quenching (Table 2). The smaller measured grain size observed in the baseline heat is likely due to grain pinning by AlN or VN.

In general, the tensile properties decreased with distance from the chill. In a study of cast Cr-Ni-Mo alloys using the same casting configuration, the tensile ductility was shown to be related to the maximum pore length measured on the fracture surface. Measurements of pore lengths on tensile fracture surfaces for the alloys reported here are being conducted and will be reported later. A significant decrease in ultimate tensile strength was also observed when the elongation to failure was less than 4%.

At present there is no method to relate quantitative measurements of porosity collected on random metallographic cross sections to the maximum shrinkage pore size that might exist in the gauge section of a tensile bar. However, it is reasonable to assume that material with a higher porosity coverage would exhibit lower tensile ductility, and a comparison of the porosity coverage between the baseline and the baseline +Ni supports this claim. To obtain a tensile elongation greater than 9% in the baseline +Ni steel would require an area coverage of porosity less than 0.024% with a secondary dendrite arm spacing of less than 100 µm. A more stringent requirement of lower area coverage would be necessary for the same steel with a greater secondary dendrite arm spacing, since the pores would be larger. This is observed for the baseline +Ni alloy, where the same area coverage of 0.024% produced 6.5% elongation to failure with a secondary dendrite arm spacing of 375 µm.

The variation in tensile properties with distance from the chill is more significant for the baseline alloy, and factors other than greater porosity might be contributing to the lower properties. First, Type II (eutectic) MnS inclusions lower tensile ductility and notch toughness. Second, the tensile fracture transitioned from predominately ductile microvoid coalescence to brittle cleavage fracture. Brittle fracture was also observed in the baseline +Ni heat, but only in association with porosity, and the pore size was generally smaller in comparison to the baseline chemistry.

In recent work, the cleavage fracture observed close to shrinkage pores in cast HY80 and HY100 steel was associated with M7C3 carbide precipitates with plate-like morphologies. The higher alloy content in these last regions to solidify could encourage coarse plate-like carbides. However, that fracture feature was observed for both Charpy V-notch and tensile specimens, whereas in the study presented here the cleavage-like fracture was observed only during tensile testing. Furthermore, the carbide stability study conducted based upon the last 15% liquid does not predict M7C3 formation.

A “fish-eye” fracture feature was noted, and this would be indicative of hydrogen damage if the fracture was intergranular. It has been reported that the occurrence of hydrogen-induced intergranular or transgranular quasi-cleavage fracture largely depends on tramp element (P, Sb, Sn, As) segregation to grain boundaries and the subsequent loss of grain boundary cohesive strength. Hydrogen embrittlement is typically studied in Ni-Cr-Mo type steels used in energy production. These steels are usually tempered above 572F (300C), which lowers the soluble carbon in ferrite to that in equilibrium with cementite (Stage III tempering) or with the alloy carbide formed during secondary hardening. These elevated tempering temperatures also promote tramp element segregation to grain boundaries. Intergranular fracture is also a common fracture mode for both tempered martensite embrittlement in plain carbon steels and temper embrittlement in alloy steels in the presence of phosphorus.

Hydrogen-induced quasi-cleavage fracture has been reported for the HY-class of steels when the effects of tramp element segregation are mitigated. Quasi-cleavage fracture has also been reported for a Stage I tempered SAE 10B22 steel tempered at 302F (150C). Both boron and carbon increase the cohesive strength of grain boundaries; this lesson was learned during the development of interstitial-free steels, where intergranular fracture was characteristic of cold work embrittlement. The benefit of Stage I tempering can now be recognized as contributing to grain boundary cohesive strength as a result of the higher carbon content in solution.

The role of shrinkage porosity is similar to blister formation, where hydrogen gas accumulates and builds up pressure within a cavity. The pore can also act as an irreversible trap, i.e. one with an activation energy for hydrogen release greater than 50 kJ/mol. If the hydrogen is not fully removed by baking, then upon austenitization the thermal energy will be sufficient to release the hydrogen back into solution near the pore, since austenite has a higher solubility for hydrogen. The pores may also act as stress risers and enhance the sensitivity of the steel to hydrogen, as has been shown for martensitic advanced high strength steel.
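The contrast between baking and austenitization can be illustrated with a simple Arrhenius estimate. This is only a sketch: the 60 kJ/mol trap energy below is an assumed value chosen just above the 50 kJ/mol irreversible-trap threshold, and the two temperatures are typical process values, not ones taken from the study.

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def arrhenius_factor(Ea_J_per_mol, T_kelvin):
    """Relative rate exp(-Ea/RT) for hydrogen escaping a trap of energy Ea at temperature T."""
    return math.exp(-Ea_J_per_mol / (R * T_kelvin))

Ea = 60e3  # J/mol, assumed trap energy just above the 50 kJ/mol irreversible threshold
bake = arrhenius_factor(Ea, 200 + 273.15)   # typical baking temperature (assumed)
aust = arrhenius_factor(Ea, 900 + 273.15)   # typical austenitization temperature (assumed)

# Release from the trap is several orders of magnitude faster at austenitization,
# consistent with hydrogen re-entering solution near the pore if baking was incomplete.
print(f"relative release rate, austenitization vs. baking: ~{aust / bake:.0f}x")
```

The exact numbers are illustrative; the point is only that a trap stable at baking temperatures is readily emptied at austenitization temperatures.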

Hydrogen damage in steels shows a strain rate dependence, with more damage occurring at slower strain rates. Charpy impact testing is not immune to hydrogen damage, however, as has been shown for line pipe steels: the damage is manifested as a shift in the ductile-to-brittle transition temperature to higher temperatures, and the magnitude of the shift increases with hydrogen content, so hydrogen-induced cracking may be observed even at impact strain rates. It is thus reasonable to conclude that air-melted steels with ultimate tensile strengths in excess of 1,860 MPa (270 ksi) may be subject to hydrogen embrittlement, and that shrinkage porosity may require longer baking times to remove the hydrogen. The cleavage fracture associated with porosity in these ultra-high strength steels may therefore be an indication of hydrogen damage.

The main difference between the baseline and the Ni-containing alloys can be described by the mode of solidification. Two phase diagrams were calculated: one using the baseline chemistry with manganese as the independent variable, and one fixing manganese at 1.77 wt.% and using the baseline chemistry with Ni as the independent variable. Only the baseline composition passes through the peritectic reaction.

Notch toughness, as measured by absorbed energy during Charpy impact testing, also shows a dependence upon distance from the chill. Impact energies greater than 20 J (15 ft-lb) were obtained only from steel closest to the chill. Notch toughness increased with Ni content, and overall the baseline +Ni alloy performed best. Prior austenite grain size did not appear to be a factor, since the measured values for the three alloys were within the error of the measurement (±10%). The role of eutectic solidification of MnS has not been accounted for, but the sulfur content of 80 ppm is relatively low, and the baseline tensile specimens closest to the chill had elongations to failure of 8%, which is reasonable for a cast steel with an ultimate tensile strength of 1986 MPa (288 ksi).

The gouging wear data are plotted with respect to hardness in Figure 4. The role of microstructure is depicted in Figure 4a, and the best performers are the baseline +Ni and the austenitic steel. The same data are plotted in Figure 4b with the results sorted by YS/UTS ratio. The tensile properties from the second row (away from the chill) were used to represent the steel produced in this study, since the wear test coupons were taken furthest from the chill. The most gouging-wear-resistant alloys tend to be those with low YS/UTS ratios. Thus, the major conclusion drawn from this study is that high hardness, for wear resistance in loose soil or sand, can be combined with a low YS/UTS ratio to produce good gouging wear resistance. The mechanism of this improvement is the rapid rate of work hardening, which is expected to decrease the depth of cutting in the wear scar and thus produce a smaller volume of material removed during each abrasive cut.
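As a quick worked check of this criterion, the average tensile values reported earlier (1482 MPa yield, 1930 MPa ultimate) give a YS/UTS ratio comfortably below the <0.8 threshold associated with good gouging wear resistance. The snippet below is only an arithmetic sketch, not part of the study's analysis.

```python
def ys_uts_ratio(yield_MPa, uts_MPa):
    """Yield-to-ultimate ratio; lower values imply a greater work-hardening reserve."""
    return yield_MPa / uts_MPa

# Average tensile properties reported in this study
ratio = ys_uts_ratio(1482.0, 1930.0)
print(f"YS/UTS = {ratio:.2f}")  # prints "YS/UTS = 0.77", below the ~0.8 criterion
```

A ratio of roughly 0.77 leaves a substantial gap between yielding and necking, which is the work-hardening capacity credited here with reducing cut depth during gouging.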

The role of Ni in the improved wear properties appears to be in lowering the yield strength to produce a lower YS/UTS ratio, i.e. greater work hardening as Ni is increased. The hardness of the counterbody was less than that of all three alloys tested in this study. As previously noted, a softer counterbody may produce more aggressive wear as a result of the rock sticking and rolling across the counterbody, exposing fresh rock face to the steel being tested. The contribution of the Mn-C solid-solution defect pair to the work hardening of these Stage I tempered martensitic steels is still speculative, but the results obtained are better than those of quenched and tempered steels with similar Ni contents. The higher wear rates for the baseline steel may be related to porosity and the presence of Type II sulfides.

Gouging wear resistance of quenched and tempered steel is improved by increasing the rate of work hardening, as characterized by a low ratio (<0.8) of yield strength to ultimate tensile strength. The alloys formulated in this study would be an excellent choice for a general-application ground engaging tool, providing both high hardness for resistance to two-body wear as encountered in loose or sandy soils and resistance to the gouging impact wear associated with rock breaking and crushing. The alloys studied have not been optimized, but they provide a good starting point for formulating new steels with improved performance.