Ohm's law

Ohm's law states that, in an electrical circuit, the current passing through a conductor between two points is proportional to the potential difference (i.e. voltage drop or voltage) across the two points, and inversely proportional to the resistance between them. In mathematical terms, this is written as:
    I = V / R

where I is the current in amperes, V is the potential difference in volts, and R is a constant, measured in ohms, called the resistance. The potential difference is also known as the voltage drop, and is sometimes denoted by E or U instead of V.
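As a quick illustration of the relation above, the following sketch computes the current through a resistor from a voltage and a resistance (the function name and values are hypothetical, chosen only for the example):

```python
def current(voltage, resistance):
    """Ohm's law: I = V / R, with V in volts, R in ohms, I in amperes."""
    if resistance <= 0:
        raise ValueError("resistance must be positive")
    return voltage / resistance

# Example values: a 9 V potential difference across a 3-ohm resistor
# drives a current of 3 A.
print(current(9.0, 3.0))  # 3.0
```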

The law was named after the physicist Georg Ohm, who, in a treatise published in 1827, described measurements of the applied voltage and the current through simple electrical circuits containing various lengths of wire, and presented a slightly more complex equation than the one above to explain his experimental results. The equation above could not be stated in its modern form until the ohm, the unit of resistance, was defined (1861, 1864).

The resistance of most resistive devices (resistors) is constant over a large range of values of current and voltage. When a resistor is used under these conditions, it is referred to as an ohmic device, because a single value of resistance suffices to describe its resistive behavior over the whole range. When sufficiently high voltages are applied to a resistor, forcing a high current to flow through it, the device is no longer ohmic: its measured resistance differs from (and is typically greater than) the value measured under standard conditions (see temperature effects, below).
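The distinction above can be sketched numerically: a device is treated as ohmic over a set of measurements if the ratio V/I is constant within some tolerance. This is an illustrative helper, not a standard test; the function name, tolerance, and measurement values are assumptions for the example.

```python
def is_ohmic(measurements, tolerance=0.05):
    """Return True if V/I is constant (within a relative tolerance)
    across all measured (voltage, current) points, i.e. the device
    behaves ohmically over that range."""
    ratios = [v / i for v, i in measurements]  # apparent resistance at each point
    r_mean = sum(ratios) / len(ratios)
    return all(abs(r - r_mean) / r_mean <= tolerance for r in ratios)

# A 100-ohm resistor measured at three operating points: ohmic.
print(is_ohmic([(1.0, 0.010), (5.0, 0.050), (10.0, 0.100)]))  # True

# A device whose apparent resistance climbs at high voltage: not ohmic.
print(is_ohmic([(1.0, 0.010), (5.0, 0.048), (10.0, 0.080)]))  # False
```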