August 2020 Volume 2
FORGING RESEARCH
Using the Ring Test to Measure Forging Friction Factor as a Function of Temperature and Lubrication Conditions
By Kester D. Clarke, Trevor Kehe, Spencer Randell, Andy Korenyi-Both and Stephen Midson
Colorado School of Mines, Metallurgical and Materials Engineering Department, Golden, CO, USA
Aluminum forgings are typically produced using hardened H13 steel dies, and friction is controlled by the application of lubricant onto the die faces between each deformation step [1]. The lubricant serves to reduce the friction between the workpiece and the die material, allowing better flow within the die cavity, less workpiece material buildup on the dies, and reduced stress on the workpiece and die surfaces as they slide relative to each other. Reducing, or even eliminating, the use of conventional organic or graphitic lubricants has been noted as a goal of the forging industry [2,3], but estimating the friction factor as a function of forging conditions can be challenging, and quantitatively evaluating the relative performance of various lubricants or PVD coatings applied to the die faces to reduce friction is necessary so that systematic studies can be performed. One of the goals of this project was to identify a testing methodology that could quantitatively distinguish between the various coatings and test conditions. The ring compression test [4-7] was chosen because it is relatively easy to perform, simulates the metal deformation conditions present in commercial forging applications better than other test procedures (such as pin-on-disk, for example), and requires only relatively simple test equipment. The ring test involves the compression of thin metallic rings with controlled dimensions (in this case OD:ID:thickness in the ratio of 6:3:2), and, as shown schematically in Figure 1a, the friction factor can easily be estimated based on measurements of the ring dimensions after forging (change in height and change in ID). The friction factor is defined as [6]:

m = τ/k = (friction shear stress at the die/workpiece interface) / (shear yield strength of the workpiece material)
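In practice, estimating m from a measured ring amounts to interpolating the post-forging height and ID changes against digitized calibration curves such as those in Figure 1b. The Python sketch below illustrates that arithmetic; the curve values in CURVES are illustrative placeholders only (they are not taken from the analytically determined curves of [4]) and the function names are our own.

```python
# Sketch: estimate friction factor m from ring-test measurements by
# interpolating against digitized calibration curves (% decrease in ID
# vs. % reduction in height, one curve per m value).
# NOTE: CURVES holds illustrative placeholder numbers only -- real values
# must be digitized from Figure 1b [4] or computed analytically.

def pct_height_reduction(h0, hf):
    """Percent reduction in ring height during compression."""
    return 100.0 * (h0 - hf) / h0

def pct_id_decrease(id0, idf):
    """Percent decrease in inner diameter (negative if the ID grew)."""
    return 100.0 * (id0 - idf) / id0

def _interp(x, pts):
    """Piecewise-linear interpolation through sorted (x, y) points."""
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("height reduction outside calibration range")

# {m: [(% height reduction, % ID decrease), ...]} -- placeholder values.
CURVES = {
    0.1: [(10, -5.0), (30, -12.0), (50, -20.0)],  # low friction: ID grows
    0.3: [(10, -1.0), (30, 2.0), (50, 5.0)],
    0.6: [(10, 2.0), (30, 10.0), (50, 22.0)],
    1.0: [(10, 5.0), (30, 18.0), (50, 38.0)],     # near-sticking friction
}

def estimate_m(height_red, id_dec, curves=CURVES):
    """Interpolate m between the two calibration curves bracketing the
    measured (% height reduction, % ID decrease) point."""
    # ID decrease predicted by each curve at this height reduction
    preds = sorted((_interp(height_red, pts), m) for m, pts in curves.items())
    for (d0, m0), (d1, m1) in zip(preds, preds[1:]):
        if d0 <= id_dec <= d1:
            return m0 + (m1 - m0) * (id_dec - d0) / (d1 - d0)
    raise ValueError("measured ID change outside calibration range")
```

For example, a ring compressed to 30% height reduction whose ID shrank by 6% would fall between the placeholder m = 0.3 and m = 0.6 curves, and the linear interpolation above would return a value between those two bounds.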
The friction factor, m, can then be plotted on analytically determined calibration curves, an example of which is shown in Figure 1b [4]. One of the challenges in performing ring tests to simulate hot forging of aluminum is that the die temperature must be representative of temperatures experienced in real production settings. Thus, we
designed tooling for a hydraulic forging press that allows the use of steel inserts that can be easily preheated before forging or, for future evaluations, given controlled surface modifications such as PVD coatings or surface texturing. Objectives of the modified test equipment included the ability to quickly switch die faces, preheat the die inserts, and measure both load and displacement during testing. The test die design is shown in Figure 2a, noting design features that allow die inserts to be removed and inserted easily. The H13 steel inserts are roughly the size of a hockey puck (3 inches in diameter and 1.5 inches tall) and can be readily preheated in air furnaces prior to testing. In the future, many of these inserts could be made economically using different steel alloys or heat treatments, and they could also be easily coated. The larger holders were fabricated from 4140 steel, and five pairs (upper and lower) of inserts were fabricated from H13 steel hardened to about 42 HRC. Prior to testing, the faces of the steel inserts were polished to a consistent finish: grinding to a 1200 grit finish, followed by polishing with 6 μm, 3 μm and 1 μm diamond paste. For testing at room temperature, aluminum rings with an OD of 1.0 inch and an ID of 0.5 inch were simply placed onto inserts pre-loaded into the holders on the hydraulic press and compressed. For lubricated tests, two types of lubricants were used ‒ an aerosol graphite spray (CNC brand) and a molybdenum disulfide (MoS2)-based grease called Molykote. The lubricants were applied directly to the aluminum rings rather than to the steel die inserts.
For elevated temperature testing, the steel inserts were loaded into an electric resistance box furnace pre-heated to a temperature approximately 50°C/90°F hotter than the test temperature (a furnace temperature of about 250°C/482°F for a test temperature of 200°C/392°F, and a furnace temperature of about 150°C/302°F for a test temperature of 100°C/212°F). The inserts were heated in the furnace for about two hours prior to testing. Once the inserts were preheated to the required temperature, they were removed from the furnace, inserted into the holders, and an aluminum test ring was placed onto the lower insert (previous testing has shown that the aluminum rings would heat within seconds to the
FIA MAGAZINE | AUGUST 2020