
I know that watts = volts × amps, yeah? But I don't get how simply delivering twice the voltage will also result in twice the amperage at a given maximum wattage.
Forget watts; they're not really relevant here unless we're talking about electrical burns.
What you want is Ohm's Law, which says:
V = I * R
Where V is voltage, I is current, and R is resistance. For the purposes of this discussion, we rearrange it thus:
I = V / R
that is, current flow (in amps) is equal to the voltage divided by the resistance.
Generalising furiously, that is the situation in practically all circumstances: the current flow is effectively determined by the potential difference (voltage) and the resistance through which it is passing. Up the voltage and the current will increase; up the resistance and the current will decrease.
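To put some (purely illustrative) numbers on it: put 12 V across a 6 Ω resistor and you get 12 / 6 = 2 A; double the voltage to 24 V across the same resistor and you get 24 / 6 = 4 A. Double the voltage into a fixed resistance and the current doubles with it.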
When we talk about circuits having a certain current rating, we're talking about the MAXIMUM current that can/should flow.
Where confusion can arise is that power supplies (or, if you want to be picky, power supply systems, including distribution networks) have a maximum current they can supply. For domestic mains electricity that maximum is effectively infinite (unless you live at the far end of a long low-voltage power line). But it is a characteristic of all power supplies that, as the load resistance across them decreases (thereby demanding more current through them), the voltage at the terminals also drops, effectively limiting the current the supply can deliver.
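If it helps to see that voltage sag in numbers, here's a minimal sketch that models the supply as an ideal source in series with an internal resistance (the classic Thévenin picture); the source voltage and resistance values are made up for illustration:

```python
# Minimal sketch: model a supply as an ideal source in series with an
# internal resistance (the Thevenin picture). All values are illustrative.
V_source = 12.0   # open-circuit voltage of the supply, in volts
R_internal = 0.5  # internal resistance of the supply, in ohms

for R_load in (100.0, 10.0, 1.0, 0.1):
    current = V_source / (R_internal + R_load)  # Ohm's law around the whole loop
    V_terminals = current * R_load              # what's left across the load
    print(f"load {R_load:6.1f} ohm -> {current:5.2f} A, {V_terminals:5.2f} V at the terminals")
```

As R_load falls, the loop current rises, more voltage is dropped across R_internal, and less is left at the terminals - exactly the droop described above.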
I hope that makes it a bit clearer...
