Guarded Hot Plate Thermal Conductivity Tester: User Manual and Specifications

Published: 3/20/2015

Overview

Thermal conductivity is a key parameter for characterizing the physical properties of materials. Industries such as aerospace, nuclear energy, construction, and non-metallic materials require prediction or direct measurement of thermal conductivity. Test methods are typically divided into steady-state and transient methods. This instrument is based on the guarded hot plate (GHP) method, aligned with the relevant Chinese national standards, and includes targeted improvements. Tests can be performed automatically via computer, with digital readouts at each state point, or completed manually. It meets the high-precision needs of laboratories and material research institutes for measuring thermal conductivity.

Reference standards: GB/T 10294-1998 and GB/T 3392-82 (plastics thermal conductivity test method, guarded hot plate).

Key Technical Performance

- Application scope: determination of the thermal conductivity of homogeneous, plate-shaped materials in the dry state or at various moisture contents.
- Thermal conductivity range: 0.015–2 W/(m·K) or 0.035–5 W/(m·K).
- Supports guarded hot plate steady-state testing with controlled experimental temperatures.
- Digital instrumentation with temperature measurement accuracy better than class 0.2; automatic electronic compensation for ambient temperature, with optional external ice-point compensation.
- Power: 220 V / 50 Hz, supplied through a high-precision voltage stabilizer.
- Measurement performance: accuracy ±3% (±5%); precision ±2% (±3%); repeatability ±2%.
- Metered heating power: 35 W ±1%.
- PC connectivity for automated testing; under steady-state conditions, a data set can be acquired in as little as 6 seconds.
- Displays experimental parameters and curves; supports data printout.
- Operating conditions: ambient temperature 10–35 ℃; relative humidity ≤80% RH; daily average room temperature fluctuation ≤±1.5 ℃.
- Specimen size and effective area: maximum 200 × 200 × 50 mm; minimum 100 × 100 × 5 mm. The faces in contact with the plates must be flat.

Instrument Composition

The system integrates a high-precision stabilized power supply, temperature measurement instruments, an upper-plate heater, upper and lower plates, thermocouples, upper and lower constant-temperature water baths, and a computer-based testing system.

- Specimen section: includes a constant-temperature device, maximum 80 ℃.
- Heating system: metered-power heater.
- Guarded hot plate layer with settable temperature.
- Temperature measurement: high-accuracy digital instruments with internal zero compensation (modular designs may also be used).
- Metered power: constant heating with digital display, accuracy better than 1%, adjustable.
- Computer test components: PC, communication interfaces, and software.

Operating Procedure

A. Specimen requirements

1) Sample from a homogeneous area of the material.
2) The final specimen size must not exceed 200 × 200 × 50 mm.
3) Prepare flat specimen surfaces and test 3–5 pieces per batch; a batch must share the same formulation, with bulk density variation below 5%.
4) The two faces must be parallel and of uniform thickness. The faces in contact with the plates must be flat and tightly coupled; apply a thin layer of powder of the same material or a high-temperature thermal grease. Keep the surfaces free of impurities and dust.
5) For powders, use a confining frame and prepare as above.
6) Measure the thickness with a vernier caliper to 0.01 mm, or use another suitable method; measure four points and use the average (a short averaging sketch follows this list). Record the thickness in meters.
7) Condition specimens according to the relevant national standard for the material.
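For item 6 above, the following is a minimal Python sketch of the four-point thickness average and the conversion from millimeters to the meters required in the result. The function name and readings are illustrative assumptions, not part of the instrument software.

```python
def specimen_thickness_m(readings_mm):
    """Average four caliper readings (mm, read to 0.01 mm) and convert to meters."""
    if len(readings_mm) != 4:
        raise ValueError("measure the thickness at four points")
    return sum(readings_mm) / len(readings_mm) / 1000.0

# Illustrative readings taken near the four corners of a nominal 25 mm board
d = specimen_thickness_m([25.02, 24.98, 25.01, 24.99])
print(f"d = {d:.5f} m")  # d = 0.02500 m
```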
B. Operation

1) Temperature controller verification: set the cold and hot plate bath temperatures equal to each other and 5 ℃ above ambient. Close the instrument cover (ensure the central heating plate is OFF). When the central, guard, and cold plate readings stabilize, adjust the controller offset so that all three read the same as the bath temperature.
2) Thickness gauge zeroing: close the cover and pre-compress the four dial indicators by 25 mm.
3) Test steps: place the specimen between the hot and cold plates and close the cover. Set the cold plate bath to ambient +5 ℃ and the hot plate bath 10 ℃ above the cold plate. When the main and guard plate temperatures are stable, energize the main hot plate and adjust its power until the main and guard plate temperatures are equal. The main hot plate temperature drift must not exceed 0.5 ℃ per hour (see the illustrative stability check at the end of this document).
4) Record the core inputs: specimen thickness and heater power.
5) After mounting the specimen, power on the instrument. After 15 minutes, set the test temperatures according to the instrument manual (e.g., 100 ℃). Begin the test once the setpoints are reached.
6) When all temperature displays have stabilized within ±0.5 ℃ of their setpoints, start the PC and open the test software.
7) Proceed with automated or manual acquisition, as supported by the software.
8) Enter the test temperature. If it is valid, continue by entering the specimen thickness (mm) and effective area (mm²).
9) The system enters a simulated heat-flow model. When the model criteria are satisfied, testing begins. The software computes thermal conductivity at set intervals and, after completion, outputs the results, including the average thermal conductivity. If a printer is connected, a test report can be printed.
10) Return to the initial state and repeat 3–5 times using equivalent specimens. The final result for the specimen is based on these repeated tests.

Notes and Formula Inputs

- The instrument is based on the guarded hot plate steady-state method, with modeling enhancements to support computer-based simulation and evaluation.
- Using the five-channel thermocouple inputs, users can implement alternative test models if required.
- Principle: with stable temperatures maintained on both faces of the specimen, the heat transferred through the effective area is measured to determine the thermal conductivity.
- Units and symbols: I (heater current, A); U (heater voltage, V); d (specimen thickness, m); S (effective area, m²); ΔZ (time interval, hours); ΔT (average temperature difference between the plates, K). Report results to three decimal places (a worked calculation sketch appears at the end of this document).
- Proper specimen preparation is essential for valid results. Follow the supplied sample size and thickness guidelines where applicable.
- The supplied computer is a desktop; keep the interface cables to the instrument as short as practical.
- Power requirements: 220 V through the instrument's high-precision stabilizer. If using an external supply, ensure it is reliable and properly grounded.

Software Installation

1) Confirm the system environment: Windows XP/SP2 and Office 2000.
2) Insert the provided CD, locate the appropriate folder (e.g., DRX-1), run setup.exe, and follow the prompts until installation completes successfully.
3) After verifying the instrument connections, launch the application from the Start menu or a desktop shortcut.

Included Components

- Test main unit: 1 set.
- High-precision constant-temperature bath controllers: 2 units.
- Computer with printer: 1 set (per contract configuration).
- Test software: 1 set.
- Reference standard sample: 1 piece.
- User manual: 1 copy.
- Certificate of conformity: 1 copy.
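Illustrative Sketches

The operation steps require the main hot plate drift to stay within 0.5 ℃ per hour (step 3) and all displays to hold within ±0.5 ℃ of their setpoints (step 6) before acquisition. The sketch below is one way to express that check in Python; the sampling interval, function name, and example readings are assumptions for illustration and are not taken from the instrument's own software.

```python
def is_steady(temps_c, setpoint_c, band_c=0.5, max_drift_c_per_h=0.5, interval_s=60.0):
    """True when every reading sits within +/-band_c of the setpoint and the
    overall drift rate stays below max_drift_c_per_h."""
    if any(abs(t - setpoint_c) > band_c for t in temps_c):
        return False
    elapsed_h = (len(temps_c) - 1) * interval_s / 3600.0
    return elapsed_h <= 0 or abs(temps_c[-1] - temps_c[0]) / elapsed_h <= max_drift_c_per_h

# One hour of readings taken every minute around a 100 C setpoint
readings = [100.0 + 0.002 * i for i in range(61)]
print(is_steady(readings, setpoint_c=100.0))  # True: within band, drift 0.12 C/h
```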
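The Notes and Formula Inputs section lists the quantities that enter the result (U, I, d, S, ΔT). For a single-specimen guarded hot plate arrangement, the standard steady-state relation is λ = U·I·d / (S·ΔT). The sketch below applies that relation and rounds to the three decimal places noted above; the numerical values and the single-specimen assumption are illustrative examples, not calibration data for this instrument.

```python
def thermal_conductivity(u_v, i_a, d_m, s_m2, delta_t_k):
    """Guarded hot plate, single-specimen form: lambda = U*I*d / (S*dT), in W/(m*K)."""
    power_w = u_v * i_a                      # metered heating power Q = U * I
    lam = power_w * d_m / (s_m2 * delta_t_k)
    return round(lam, 3)                     # results are reported to three decimals

# Illustrative inputs: 10 V x 0.2 A heater, 20 mm thick specimen,
# 0.04 m^2 effective area, 10 K average temperature difference
print(thermal_conductivity(10.0, 0.2, 0.02, 0.04, 10.0))  # 0.1 W/(m*K)
```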

