Some background on myself: I am an engineering technician working with some EE folks. I'm not trained in electrical theory and am learning as I go. Anyway, on to the questions.
I am taking some IR (insulation resistance) readings on some metalized Mylar capacitors. The caps are rated at 1x10^5 MOhm minimum. I took 10 out of a batch of parts we produced, and the readings were between 1.6x10^5 and 2.0x10^5 MOhm on our calibrated IR testing machine. Now, another fellow we are working with has been sent some of these caps for internal testing on his end. He does not test for IR but rather for leakage current, using a dedicated calibrated leakage tester. His readings are in the 2000 nA range. When I convert our IR readings to leakage current using I = V/R, I get around 5 nA.
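For anyone checking the arithmetic, here is a minimal sketch of that IR-to-leakage conversion. The test voltage below is an assumption for illustration (the post does not state what voltage the IR tester applies), so the printed currents will scale with whatever voltage your machine actually uses:

```python
# Sketch: convert insulation-resistance (IR) readings to an equivalent
# DC leakage current via I = V / R.
# TEST_VOLTAGE_V is an ASSUMED value -- substitute the voltage your
# calibrated IR tester actually applies.

TEST_VOLTAGE_V = 100.0  # assumed test voltage in volts (not given in the post)

ir_readings_megaohm = [1.6e5, 2.0e5]  # measured IR range from the post, in MOhm

for r_mohm in ir_readings_megaohm:
    r_ohm = r_mohm * 1e6               # megaohms -> ohms
    i_na = TEST_VOLTAGE_V / r_ohm * 1e9  # amps -> nanoamps
    print(f"{r_mohm:.1e} MOhm -> {i_na:.3f} nA at {TEST_VOLTAGE_V:.0f} V")
```

Note that even at a few hundred volts this stays in the single-digit-nA range for these resistances, which is why a 2000 nA reading on the other tester is hard to reconcile with the IR numbers.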
Is there anyone experienced with IR and leakage testing who has insight into why we are getting such vastly different results? Why would a cap excel in an IR test but fail a leakage test? Do these test methods stress the part in different ways?
One last thing: we did set up a mock leakage testing rig. It applies an AC voltage through a resistor, and the drop across the resistor goes to a voltage-reading device. I'm not 100% sure of the technicalities of the setup, but using it we were able to reproduce the leakage readings on our end.
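If I understand the rig correctly, the leakage current flows through a series sense resistor and the meter reads the voltage drop, so the current falls out of Ohm's law. A minimal sketch of that arithmetic, where both the sense-resistor value and the meter reading are assumptions I picked purely for illustration (they happen to land near the 2000 nA figure mentioned above):

```python
# Sketch of the shunt/sense-resistor leakage measurement:
# I_leak = V_measured / R_sense.
# Both values below are ASSUMPTIONS for illustration, not the actual
# components or readings from the mock rig.

R_SENSE_OHM = 1.0e6        # assumed 1 MOhm sense resistor
v_measured_volts = 2.0     # assumed 2 V reading across the sense resistor

i_leak_na = v_measured_volts / R_SENSE_OHM * 1e9  # amps -> nanoamps
print(f"Leakage = {i_leak_na:.0f} nA")
```

One caveat worth checking: if the rig really uses AC, the meter will also see the capacitor's normal reactive charging current (I = V * 2*pi*f*C), which can be orders of magnitude larger than the true DC leakage and could explain a reading in the thousands of nA.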
Appreciate the help.
Last edited by seth_sikora; 10-19-2022 at 02:22 PM.