Can a 10W 5ohm resistor take 100V for 3 seconds?

rg12

Hey Guys,

I want to use this resistor for measuring IR (internal resistance).

Will it hold up for 3-second tests?
 
It can take as many volts as you want as long as you don't exceed 10W through it.
 
wesnewell said:
It can take as many volts as you want as long as you don't exceed 10W through it.

I'm not an expert, but at 5 ohms and 100 V that means a ton more than 10 W. But I'm only talking about 3 seconds, just to measure the drop.
 
Voltage has nothing to do with the load wattage. As long as the load is no more than 10 W, it should last for years. If you don't know what the load is, no one can answer your question. It's good for a 0.1 A load: V × A = W, and 100 V × 0.1 A = 10 W.
 
I'm just going to give my answer without looking at the others, so my perspective is fresh.
P = V²/R
= 100²/5
= 10000/5
= 2000 W

I think that your resistor will burn out before three seconds at that power level since that is 200 times the power rating.
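Solcar's arithmetic can be sketched as a quick check (a minimal illustration, not from the thread):

```python
# Power dissipated by a resistor placed directly across a voltage source: P = V^2 / R
V = 100.0  # volts across the resistor
R = 5.0    # resistor value, ohms

P = V ** 2 / R
print(P)       # 2000.0 W
print(P / 10)  # 200.0 -- how many times over the 10 W rating
```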
 
Solcar said:
I'm just going to give my answer without looking at the others, so my perspective is fresh.
P = V²/R
= 100²/5
= 10000/5
= 2000 W

I think that your resistor will burn out before three seconds at that power level since that is 200 times the power rating.

Thank you :)
 
ROFLMAO. The R in his equation is the load, meaning the load is a 5 ohm resistance circuit. That equation determines the power it dissipates and has nothing to do with your question, which can't be answered unless you provide what the load is: in watts, amps, or the resistance of whatever you are connecting the resistor to.
 
wesnewell said:
ROFLMAO. The R in his equation is the load, meaning the load is a 5 ohm resistance circuit. That equation determines the power it dissipates and has nothing to do with your question, which can't be answered unless you provide what the load is: in watts, amps, or the resistance of whatever you are connecting the resistor to.

100v will push 20 amps through it. That's 2KW, the power of an electric fire.
 
I have a guy who can build an IR meter for me. I asked him whether I have to use a 5 ohm resistor in the tester, since it will draw lots of amps, and he said that you need to draw maximum amps in order to determine the IR.
That sounds a little odd, as I thought about just throwing in an 800 ohm resistor instead, and then we wouldn't have that huge fire hazard.
 
Why not use an incandescent light bulb as an infrared source?
 
rg12 said:
wesnewell said:
It can take as many volts as you want as long as you don't exceed 10W through it.

I'm not an expert, but at 5 ohms and 100 V that means a ton more than 10 W. But I'm only talking about 3 seconds, just to measure the drop.


Solcar said:
I'm just going to give my answer without looking at the others, so my perspective is fresh.
P = V²/R
= 100²/5
= 10000/5
= 2000 W

I think that your resistor will burn out before three seconds at that power level since that is 200 times the power rating.

:idea: I would have a completely different approach to solve this problem :idea:

Whether it's 100 V or 10 V... What matters is not the supply voltage, but the voltage across the resistor. In other words, it's the voltage drop at a given current that matters, rather than the voltage: dV, not V... Please try to remember this nuance. Google "voltage across resistor" whenever you can't remember it.

Also, the heat lost is dP rather than P (P is the total power used in the circuit, for example including the power drawn by the controller... dP is the part of the power that's lost in the circuit as heat dissipated through the resistor).

So you defined dP = 10 W and R = 5 Ohms.... The circuit has an EMF (electromotive force) of 100V.

Knowing that dP = R x I^2 and that dV = R x I and also that dP = dV x I,
You can see that 100V does not matter.

Say you want to run your 5 Ohms resistor at its maximum 10 Watts rating (dP=10W).... What will be the voltage drop (the voltage across the resistor) ???

Well dP = R x I^2....
So at maximum rating of resistor (Max 10W).... The max current (I) rating can be calculated :
I^2 = dP/R = 10W / 5 Ohms
I^2 = 2
I = Square root of 2
So (doing square root), maximum current I = 1.414 Amps.... That's the figure you need to worry about. Don't exceed 10W also means don't exceed that 1.414 A rating (assuming the resistor is exactly 5 ohms value).

Now since dP = dV x I
then dV = dP / I
So voltage drop (aka voltage across the resistor) is dV = 10W / 1.414 A
dV = 7.07 Volts of drop..

What this means is with all other parameters being constant (dP = 10W ; R = 5 Ohms, and so I = 1.414A)
If you run your circuit with a 10 V battery, the resistor will take 7.07 V off... You will have 2.93 V left for your motor/controller.
If you run your circuit with a 100 V battery, the resistor will still take 7.07 V off... You will have 92.93 V left for your motor/controller.
If you run your circuit with a 1 000 V battery, the resistor will still take 7.07 V off... You will have 992.93 V left for your motor/controller.
If you run your circuit with a 10 000 V battery, the resistor will still take 7.07 V off... You will have 9 992.93 V left for your motor/controller.
If you run your circuit with a 100 000 V battery, the resistor will still take 7.07 V off... You will have 99 992.93 V left for your motor/controller.

Now... Do you see how the resistor really doesn't care at all about the voltage of the battery? The resistor only cares about taking 7.07 V from whatever voltage is going through, period. You can run 3 V or 100 000 V; it just doesn't care.

But run 2 amps through that "5 Ohm, 10 W" rated resistor (once again, whatever voltage you choose doesn't matter), and you will burn it very quickly (2 A through 5 Ohms means 20 W, which is well above the 10 W rating... Once again, I used the dP = R x I^2 formula).

In conclusion, whatever the voltage of your battery, if you want to do testing with a 5 Ohm resistor, you need to know what maximum load you want to be testing with (10 A? 50 A? 1.414 A?). That is, you need to know the maximum current you will draw from your battery during those tests.

If you want a 5 Ohm resistor to handle 10 Amps, well dP = R x I^2
so you need a 500 Watt rating at least on that 5 ohm resistor.

If you want to do your testing at a maximum of 50 A, once again dP = R x I^2
You need a 12500 Watt rating at least on that 5 ohm resistor.

Matador.
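Matador's derivation boils down to two formulas, dP = R·I² and dV = R·I. A small sketch of both (the numbers mirror the post, nothing new is assumed):

```python
import math

R = 5.0          # resistor value, ohms
P_RATING = 10.0  # resistor power rating, watts

# Maximum continuous current: from dP = R * I^2  ->  I = sqrt(dP / R)
i_max = math.sqrt(P_RATING / R)   # ~1.414 A
# Voltage drop at that current; independent of the supply voltage
v_drop = R * i_max                # ~7.07 V

# Power rating needed to carry a given current through 5 ohms: P = R * I^2
for amps in (10.0, 50.0):
    print(amps, "A needs", R * amps ** 2, "W")  # 500.0 W, then 12500.0 W

print(round(i_max, 3), round(v_drop, 2))        # 1.414 7.07
```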
 
Rg12, If you are talking about connecting a 10W 5 ohm resistor directly across a 100V source, then it will depend on the amperage of the 100V source. If it's not limited to 0.1A, it will blow the resistor. I did not think that was what you were asking. Maybe this will help.
https://www.google.com/search?q=how+to+build+a+battery+ir+tester&oq=how+to+build+a+battery+ir+tester&aqs=chrome..69i57.10999j0j2&sourceid=chrome&ie=UTF-8
 
Isn't the load determined by the resistor's resistance, like discussed above?

You guys didn't mention the fact that a resistor can take a lot more than it's rated for if it's only for a few seconds... I'm not talking about running it that many watts above its rating continuously.

What about making the resistor 800 ohms? How come that doesn't help?
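For what it's worth, the "only 3 seconds" idea can be sanity-checked with a rough adiabatic estimate. The mass and specific heat below are illustrative guesses, not data for any particular part:

```python
# Back-of-envelope check: assume ALL the pulse energy heats the resistor body,
# with no time to shed heat (pessimistic, but reasonable for a 3 s pulse).
V, R, t = 100.0, 5.0, 3.0

energy = (V ** 2 / R) * t   # 6000 J delivered during the pulse

# Illustrative assumptions only:
mass = 0.005                # kg, guessed body mass of a small 10 W resistor
spec_heat = 800.0           # J/(kg*K), roughly ceramic

delta_t = energy / (mass * spec_heat)  # ~1500 K temperature rise
print(energy, delta_t)
```

A temperature rise on the order of 1500 K says the part fails long before the 3 seconds are up, which matches Solcar's conclusion.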
 
Matador2 said:
In conclusion, whatever the voltage of your battery, if you want to do testing with a 5 Ohm resistor, you need to know what maximum load you want to be testing with (10 A? 50 A? 1.414 A?). That is, you need to know the maximum current you will draw from your battery during those tests.

If you want a 5 Ohm resistor to handle 10 Amps, well dP = R x I^2
so you need a 500 Watt rating at least on that 5 ohm resistor.

If you want to do your testing at a maximum of 50 A, once again dP = R x I^2
You need a 12500 Watt rating at least on that 5 ohm resistor.

Matador.
He wants to measure IR, which I take as internal resistance. If he's going to use a separate load, what does he need the 10 W resistor for? He could measure the drop across the load. I read the OP as wanting to connect his resistor directly across a 100 V battery and use the voltage drop to calculate the battery's internal resistance.
 
rg12 said:
Isn't the load determined by the resistor's resistance, like discussed above?

You guys didn't mention the fact that a resistor can take a lot more than it's rated for if it's only for a few seconds... I'm not talking about running it that many watts above its rating continuously.

What about making the resistor 800 ohms? How come that doesn't help?

An 800 ohm resistor would dissipate about 12.5 W (100²/800), so it won't blow, but it won't give you an accurate measurement: 800 ohms is huge compared with the battery's internal resistance, so the voltage drop you're trying to measure is tiny. Also, its resistance will change as it heats up. It could be used to compare two different 100 V batteries, as long as you use the actual battery voltage and not the nominal.
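d8veh's point can be checked with quick numbers. The 50 mΩ internal resistance below is just an assumed example value, not a figure from the thread:

```python
V = 100.0       # battery open-circuit voltage
R_LOAD = 800.0  # load resistor, ohms
R_INT = 0.05    # ASSUMED battery internal resistance, ohms (example only)

i = V / (R_LOAD + R_INT)   # ~0.125 A drawn by the light load
p_load = i ** 2 * R_LOAD   # ~12.5 W dissipated in the resistor
drop = i * R_INT           # ~6 mV sag caused by the internal resistance
print(round(p_load, 1), round(drop * 1000, 1))  # 12.5 6.2
```

Trying to resolve a ~6 mV sag on a 100 V reading is why the light load gives a poor IR measurement.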
 
So what do you guys suggest? Is there a way to do that without going with huge resistors?
Remember that it's only for about 3 seconds...
 
Try a 120-240V light bulb.
 
d8veh said:
Matador2 said:
In conclusion, whatever the voltage of your battery, if you want to do testing with a 5 Ohm resistor, you need to know what maximum load you want to be testing with (10 A? 50 A? 1.414 A?). That is, you need to know the maximum current you will draw from your battery during those tests.

If you want a 5 Ohm resistor to handle 10 Amps, well dP = R x I^2
so you need a 500 Watt rating at least on that 5 ohm resistor.

If you want to do your testing at a maximum of 50 A, once again dP = R x I^2
You need a 12500 Watt rating at least on that 5 ohm resistor.

Matador.
He wants to measure IR, which I take as internal resistance. If he's going to use a separate load, what does he need the 10 W resistor for? He could measure the drop across the load. I read the OP as wanting to connect his resistor directly across a 100 V battery and use the voltage drop to calculate the battery's internal resistance.


Well then, if he is going to put the resistor directly across the battery terminals, as long as the resistor has a high enough resistance and can handle the heat, it won't just dead-short the battery...
A light bulb might be a good alternative.
 
rg12 said:
So what do you guys suggest? is there a way to do that without going with huge resistors?
remember that it's only for about 3 seconds...

What's your intention here, rg12?

If it's to measure the battery's internal resistance in DC, I would use this approach: https://endless-sphere.com/forums/viewtopic.php?f=14&t=87173&hilit=LTO&start=25#p1276187
(Also see the second graph here: https://endless-sphere.com/forums/viewtopic.php?f=31&t=88051&start=50#p1289912)

It works well for single-cell testing.
It should also work with very big batteries, but you would need a really big resistor.

Matador
 
Matador said:
Well then, if he is going to put the resistor directly across the battery terminals, as long as the resistor has a high enough resistance and can handle the heat, it won't just dead-short the battery...
A light bulb might be a good alternative.

but how's he going to use that to measure the battery's IR?
 
d8veh said:
Matador said:
Well then, if he is going to put the resistor directly across the battery terminals, as long as the resistor has a high enough resistance and can handle the heat, it won't just dead-short the battery...
A light bulb might be a good alternative.

but how's he going to use that to measure the battery's IR?

Plot V = -r·I + E,
like I did here: https://endless-sphere.com/forums/viewtopic.php?f=14&t=87173&hilit=LTO&start=25#p1276187
The slope gives you -r, r being the internal resistance of the battery.

All I did was use different loads (with different resistors) to see different voltage drops.
Plot a graph,
get the slope, and voila.

Matador
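The slope method Matador describes is a straight-line fit of terminal voltage against current. A sketch with synthetic data, generated from an assumed E = 100 V and r = 0.05 Ω:

```python
# Fit V = E - r*I from (current, voltage) pairs measured with different loads.
# The data below is SYNTHETIC, built from assumed E = 100 V, r = 0.05 ohm.
currents = [1.0, 5.0, 10.0, 20.0]
voltages = [100.0 - 0.05 * i for i in currents]

n = len(currents)
mean_i = sum(currents) / n
mean_v = sum(voltages) / n

# Ordinary least squares slope: cov(I, V) / var(I)
slope = sum((i - mean_i) * (v - mean_v) for i, v in zip(currents, voltages)) \
        / sum((i - mean_i) ** 2 for i in currents)

r_internal = -slope            # recovers 0.05 ohm
emf = mean_v - slope * mean_i  # recovers 100.0 V
print(r_internal, emf)
```

With real measurements the points will scatter, but the least-squares slope still gives -r, which is exactly the graphical trick in the linked posts.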
 