How to test an amp for ohm rating?
Pick three frequencies you care about; I'd pick 50 Hz for the bass, 1 kHz for the midrange, and 15 kHz for the highs. Hook the generator to the input of the amp, hook the 'scope to the output, and put a 16 ohm resistor across the output. Turn the generator up in level until the sine wave flattens just slightly at the top or the bottom. Measure the peak-to-peak voltage and divide by 2.8 to get the RMS voltage. Square that number and divide it by the resistance to get the power. Do that at each of the three frequencies. Then lower the resistance of the load until the clipping voltage begins to sag. Measure the power of the amp at that resistance. If that's enough power for you, don't go below that resistance. If it's not, you can lower the resistance slightly below that and observe the waveform at the output for any anomalies. If it looks good and the amp stays cool, you're probably OK.
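The arithmetic above can be sketched in a few lines of Python (the 2.8 divisor is really 2√2, since peak-to-peak over 2√2 gives RMS for a sine; the 40 V example figure is hypothetical):

```python
import math

def rms_from_pp(v_pp):
    """Convert peak-to-peak sine voltage to RMS: divide by 2*sqrt(2) (~2.83)."""
    return v_pp / (2 * math.sqrt(2))

def power_into_load(v_pp, r_load):
    """Power delivered to a resistive load, from peak-to-peak voltage at clipping."""
    v_rms = rms_from_pp(v_pp)
    return v_rms ** 2 / r_load

# Hypothetical example: 40 V peak-to-peak just below clipping into 16 ohms.
print(round(power_into_load(40.0, 16.0), 1))  # 12.5 (watts)
```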
During all this testing, continually monitor the amp for overheating. If it overheats, you need to minimize the amount of time you operate the amp at full output.
Yeah, it's crude. If you want anything more accurate, you'll need a spectrum analyser or a THD (total harmonic distortion) meter.
Jim



I think you're safe as long as the amp's output is voltage limited, a limitation that usually comes from the capacity of the power supply. When the current the load is drawing is sufficient to make a significant change in the voltage the amp is capable of delivering, it's time to stop drawing more current, i.e. to stop making the impedance of the load lower.
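If you want to put a number on that sag, you can model the amp as an ideal voltage source behind an effective source resistance and solve for it from two (voltage, load) readings. This is a rough sketch with made-up measurement values, not a characterization of any particular amp:

```python
def source_resistance(v1, r1, v2, r2):
    """Solve the voltage-divider model V = V0 * R / (R + Rs) for Rs,
    given output voltage v1 into load r1 and v2 into load r2."""
    return r1 * r2 * (v2 - v1) / (v1 * r2 - v2 * r1)

# Hypothetical readings: 14.0 V RMS into 16 ohms, sagging to 13.0 V into 8 ohms.
rs = source_resistance(14.0, 16.0, 13.0, 8.0)
print(round(rs, 2))  # 1.33 (ohms)
```

A small Rs relative to the load means the output is still essentially voltage limited; when Rs becomes a sizable fraction of the load, the supply is running out of current and it's time to stop lowering the impedance.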
You might be able to get away with a lower impedance value than you'd get with this method I came up with, but you'd have to know more about the internal design of the amp to tell.
Jim
Plan B: Contact the manufacturer of the amp, give them its part number, and ask them what the impedance should be.