Hello nerds.
I have a question.
Given an extension cord being used in an AC circuit (i.e., plugged into the wall), rated for 15A and insulated for 300V:
Assuming you always pass 15A through the cable, will the heat produced in the extension cord be the same at 120V 15A as it is at 240V 15A?
I know the power transferred by the cable increases from 1800W to 3600W, but does the heat produced change?
In my mind, it doesn't, because ohmic/Joule heating is Heat = I^2 * R, where R, the resistance of the wire, is a constant.
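As a quick sanity check, here's a sketch in Python assuming a made-up round-trip cord resistance of 0.2 ohms (purely illustrative, not a measured value):

```python
# Joule heating in the cord depends only on the current and the cord's
# own resistance, not on the supply voltage.
# R_CORD is an assumed illustrative value, not a real measurement.

I = 15.0        # current through the cord, amps
R_CORD = 0.2    # assumed round-trip resistance of the cord, ohms

for v_supply in (120.0, 240.0):
    p_delivered = v_supply * I    # power carried to the load
    p_heat = I**2 * R_CORD        # power dissipated as heat in the cord
    print(f"{v_supply:.0f}V: delivers {p_delivered:.0f}W, "
          f"cord heating = {p_heat:.1f}W")
```

With those numbers the delivered power doubles (1800W vs 3600W) but the cord heating comes out the same in both cases.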
Does this make sense?