Die-Casting Alloy Performance Testing

Performance testing of die-cast alloys is a critical step in ensuring die-casting quality. Scientific testing methods and rigorous data analysis allow a comprehensive evaluation of an alloy's physical and chemical properties, processing performance, and service performance, providing a basis for material selection, process optimization, and quality control. Testing must follow standardized procedures to ensure accuracy and repeatability, covering performance verification across the full chain from raw material to finished part.

Chemical composition testing is fundamental to alloy performance: it ensures that the alloy composition meets standard requirements and avoids performance fluctuations caused by compositional deviations. Common methods include spectral analysis and chemical analysis. Spectral analysis (e.g., with a direct-reading spectrometer) can rapidly determine the content of major elements (such as aluminum, silicon, copper, and zinc) and trace elements (such as iron, manganese, and magnesium), with a detection time of ≤5 minutes and an accuracy of 0.001%, making it suitable for batch testing. For high-precision requirements (such as aerospace alloys), chemical analysis is required; this method uses dissolution and titration to determine the composition precisely. Iron content can be measured to 0.0001%, allowing effective control of impurity elements (for example, the iron content in aluminum alloys must be ≤0.8%, otherwise brittle phases form). Sampling must be representative: take samples from multiple points (at least three) in the melt and mix them, or sample from different parts of the die-cast part so that the overall composition is reflected. For products with environmental requirements (such as the RoHS directive), harmful elements such as lead, cadmium, and mercury must be tested; lead content must be ≤0.1% and cadmium content ≤0.01%. ICP-MS (inductively coupled plasma mass spectrometry) can be used, with a detection limit of 0.001 ppm.
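As a minimal sketch of how such composition limits can be screened in practice, the Python snippet below compares measured element contents (in wt%) against the limits quoted above (Fe ≤0.8% for aluminum alloys; Pb ≤0.1% and Cd ≤0.01% under RoHS). The function and dictionary names, and the sample values, are illustrative assumptions, not part of any standard.

```python
# Illustrative sketch: checking measured composition (wt%) against the
# limits quoted in the text. Names and sample values are assumptions
# for demonstration, not taken from GB/T or RoHS documents.

ALU_IMPURITY_LIMITS = {"Fe": 0.8}         # wt%, iron limit for aluminum alloys
ROHS_LIMITS = {"Pb": 0.1, "Cd": 0.01}     # wt%, lead and cadmium limits

def check_composition(measured: dict[str, float],
                      limits: dict[str, float]) -> list[str]:
    """Return a list of human-readable violations; an empty list means pass."""
    violations = []
    for element, limit in limits.items():
        value = measured.get(element)
        if value is None:
            violations.append(f"{element}: not measured")
        elif value > limit:
            violations.append(f"{element}: {value:.4f} wt% exceeds limit {limit} wt%")
    return violations

# Example: a melt sample with slightly high iron (values are illustrative only)
sample = {"Si": 10.5, "Cu": 2.1, "Fe": 0.85, "Pb": 0.02, "Cd": 0.003}
print(check_composition(sample, ALU_IMPURITY_LIMITS | ROHS_LIMITS))
```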

Mechanical property testing is central to assessing an alloy's load-bearing capacity, covering indicators such as strength, hardness, ductility, and toughness. Tensile strength and elongation are determined by tensile testing: standard specimens (5 mm diameter, 25 mm gauge length) are stretched on a universal testing machine (accuracy ±1%) in accordance with GB/T 228.1, and the yield strength (Rp0.2), tensile strength (Rm), and elongation after fracture (A) are recorded. For aluminum alloys, Rm and A must be ≥200 MPa and ≥3%, respectively; for magnesium alloys, ≥220 MPa and ≥5%. Hardness testing is commonly performed with a Brinell hardness tester (HBW) or a Vickers hardness tester (HV). Brinell hardness suits overall assessment (e.g., testing the casting body with a 250 kgf load and a 5 mm ball), while Vickers hardness suits small areas (e.g., coatings and welds, with a load of 1–10 kgf). Aluminum alloys have a hardness of 50–100 HBW as die-cast, reaching 100–150 HBW after heat treatment. Impact toughness is determined by the Charpy impact test: U-notch specimens are broken on an impact testing machine and the absorbed energy is measured. The impact energy of magnesium alloy AM60B must be ≥8 J, ensuring impact resistance in low-temperature environments.
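To make the tensile acceptance criteria concrete, the sketch below computes elongation after fracture from the gauge lengths, A = (Lu − L0) / L0 × 100%, and checks Rm and A against the limits given above. The threshold values come from this section; the function names and example numbers are assumptions for illustration.

```python
# Illustrative sketch: evaluating tensile-test results against the limits in
# the text. Thresholds come from the paragraph above; names are assumptions.

LIMITS = {
    "aluminum":  {"Rm_MPa": 200.0, "A_pct": 3.0},   # die-cast Al: Rm >= 200 MPa, A >= 3%
    "magnesium": {"Rm_MPa": 220.0, "A_pct": 5.0},   # die-cast Mg: Rm >= 220 MPa, A >= 5%
}

def elongation_after_fracture(l0_mm: float, lu_mm: float) -> float:
    """A (%) = (Lu - L0) / L0 * 100, the usual tensile-test definition."""
    return (lu_mm - l0_mm) / l0_mm * 100.0

def tensile_pass(family: str, rm_mpa: float, l0_mm: float, lu_mm: float) -> bool:
    """True if both tensile strength and elongation meet the family limits."""
    a_pct = elongation_after_fracture(l0_mm, lu_mm)
    limit = LIMITS[family]
    return rm_mpa >= limit["Rm_MPa"] and a_pct >= limit["A_pct"]

# Example: 25 mm gauge length stretched to 26.0 mm at fracture, Rm = 230 MPa
print(tensile_pass("aluminum", rm_mpa=230.0, l0_mm=25.0, lu_mm=26.0))  # True (A = 4.0%)
```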

Process performance testing focuses on the alloy's suitability for die casting, predicting filling, solidification, and demolding behavior during molding. Flowability testing uses a spiral die to measure the flow length of the molten metal under standard process parameters (e.g., an aluminum alloy pouring temperature of 680°C and a mold temperature of 200°C). For ADC12 aluminum alloy, the flow length must be ≥650 mm; if it is less than 550 mm, composition or process adjustments are required. Shrinkage testing involves preparing a standard specimen (100 mm × 100 mm × 20 mm) and measuring the dimensional difference between the die cavity and the casting after die-casting, from which the linear shrinkage is calculated. For aluminum alloys, shrinkage must be between 0.8% and 1.2%; deviations exceeding ±0.2% require optimization of the holding-pressure parameters. Demolding performance is evaluated through simulated die-casting tests that measure the demolding force: for zinc alloys it should be ≤500 N, and for aluminum alloys ≤800 N. Excessive demolding force requires adjusting the mold surface roughness or adding a release agent. The alloy's melting characteristics also need to be tested, such as the hydrogen content of aluminum alloy (≤0.2 mL/100 g), measured by the vacuum decompression method; excessive hydrogen causes porosity in die castings.
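The linear shrinkage referred to above is simply the relative difference between the die-cavity dimension and the casting dimension. The sketch below computes it and flags values outside the 0.8–1.2% window quoted for aluminum alloys; the formula is the standard definition, while the function names and example dimensions are assumptions.

```python
# Illustrative sketch: linear shrinkage from die-cavity and casting dimensions.
# The 0.8-1.2% acceptance window for aluminum alloys is taken from the text.

AL_SHRINKAGE_RANGE = (0.8, 1.2)   # percent

def linear_shrinkage(die_dim_mm: float, casting_dim_mm: float) -> float:
    """Shrinkage (%) = (die dimension - casting dimension) / die dimension * 100."""
    return (die_dim_mm - casting_dim_mm) / die_dim_mm * 100.0

def shrinkage_ok(die_dim_mm: float, casting_dim_mm: float) -> bool:
    """True if the computed shrinkage falls inside the aluminum-alloy window."""
    low, high = AL_SHRINKAGE_RANGE
    return low <= linear_shrinkage(die_dim_mm, casting_dim_mm) <= high

# Example: a 100 mm cavity dimension producing a 99.0 mm casting -> 1.0% shrinkage
print(linear_shrinkage(100.0, 99.0), shrinkage_ok(100.0, 99.0))  # 1.0 True
```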

Service performance testing simulates the alloy's behavior in its operating environment, assessing weather resistance and reliability. Corrosion resistance is typically evaluated with the salt spray test (GB/T 10125): the specimen is exposed to a 5% NaCl solution at 35°C in a salt spray chamber, and the time to the onset of rust is recorded. 5-series aluminum alloys must remain free of red rust for ≥1000 hours, and passivated zinc alloys for ≥500 hours. Heat resistance is assessed through a high-temperature endurance test in which strength retention is measured after 1000 hours at 200–300°C; for 2014 aluminum alloy, retention must be ≥80% at 250°C. Wear resistance is assessed with a pin-on-disc wear test, measuring the friction coefficient and wear volume; for A390 aluminum alloy, the wear volume should be ≤5 mg/h (10 N load, 100 rpm). Parts with airtightness requirements (such as hydraulic valves) must undergo a water pressure test, holding 10 MPa for 5 minutes with no leakage allowed; a helium mass spectrometer leak detector can detect leakage rates down to ≤1×10⁻⁹ Pa·m³/s.
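The heat-resistance criterion above is a strength-retention ratio. The short sketch below computes that ratio and checks the ≥80% requirement quoted for 2014 aluminum alloy at 250°C; only the 80% threshold comes from the text, and the function names and example strengths are illustrative assumptions.

```python
# Illustrative sketch: strength retention after high-temperature exposure.
# The >= 80% retention requirement (2014 Al at 250 degC) is quoted from the text.

MIN_RETENTION_PCT = 80.0

def strength_retention(rm_after_mpa: float, rm_initial_mpa: float) -> float:
    """Retention (%) = strength after exposure / initial strength * 100."""
    return rm_after_mpa / rm_initial_mpa * 100.0

def heat_resistance_ok(rm_after_mpa: float, rm_initial_mpa: float) -> bool:
    """True if the retained strength meets the minimum retention requirement."""
    return strength_retention(rm_after_mpa, rm_initial_mpa) >= MIN_RETENTION_PCT

# Example: 380 MPa at room temperature dropping to 320 MPa after 1000 h at 250 degC
print(strength_retention(320.0, 380.0))   # ~84.2% retained
print(heat_resistance_ok(320.0, 380.0))   # True
```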

Microstructural testing analyzes the alloy's internal structure in depth, revealing the correlation between performance and microstructure and providing a microscopic basis for process optimization. Metallographic microscopy is used to observe grain size, phase distribution, and defects (such as pores and inclusions): aluminum alloys must have a grain size of ≤50 μm, a uniform distribution of silicon phases, and no inclusions larger than 0.1 mm. Scanning electron microscopy (SEM) reveals micromorphology, such as the morphology of the η and ζ phases in zinc alloys, to assess the effectiveness of aging treatments. Transmission electron microscopy (TEM) is used to analyze precipitated phases (such as GP zones in aluminum alloys) to guide the optimization of heat treatment parameters. For welded parts, the weld microstructure must be inspected to confirm the absence of defects such as cracks and lack of fusion, and the grain growth near the fusion line must be ≤20%. Microstructural testing can also trace the root cause of performance anomalies: for example, a large number of acicular iron phases indicates that the iron content of the aluminum alloy is too high, requiring adjustments to the smelting process.
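As a small illustration of how the microstructural acceptance limits above might be screened from measured values, the sketch below checks grain size, maximum inclusion size, and fusion-line grain growth against the ≤50 μm, ≤0.1 mm, and ≤20% limits quoted in the text. The function names and example measurements are assumptions.

```python
# Illustrative sketch: checking metallographic measurements against the limits
# quoted in the text (grain size <= 50 um, inclusions <= 0.1 mm,
# fusion-line grain growth <= 20%). Names and inputs are assumptions.

def grain_growth_pct(d_fusion_um: float, d_base_um: float) -> float:
    """Relative grain growth (%) near the fusion line vs. the base metal."""
    return (d_fusion_um - d_base_um) / d_base_um * 100.0

def microstructure_ok(grain_um: float, max_inclusion_mm: float,
                      d_fusion_um: float, d_base_um: float) -> bool:
    """True if all three microstructural acceptance limits are met."""
    return (grain_um <= 50.0
            and max_inclusion_mm <= 0.1
            and grain_growth_pct(d_fusion_um, d_base_um) <= 20.0)

# Example: 40 um grains, 0.05 mm largest inclusion, 45 um vs. 40 um at the weld
print(microstructure_ok(40.0, 0.05, 45.0, 40.0))  # True (12.5% growth)
```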