For example, a 250W lamp has a lower rated arc current than a "320W or 360W" rated lamp.
I think I see the root of the misunderstanding:
You are used to regular mains circuits, where the voltage is given by the energy source (the mains, a battery, ...) and the current by the load (e.g. an incandescent lamp). So if the energy source is given (e.g. a 120V socket), the voltage is there pretty much independent of what the load does. If you connect a 120V/60W lamp there, it draws 0.5A and the transferred power is 60W. If you put in a 240V/60W lamp instead, the current would be something above 0.125A and the real power something above 15W (even though the lamp is still rated 60W), because the lamp is not matched to the energy source.
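A minimal sketch of that mains-side behaviour (assuming the filament keeps its hot resistance; a real underheated filament would draw somewhat more, which is why the text says "something above" 0.125A and 15W):

```python
# Mains side: the source fixes the voltage, the load fixes the current.
MAINS_V = 120.0

def hot_resistance(rated_v, rated_w):
    # Filament resistance at rated operating temperature: R = V^2 / P
    return rated_v ** 2 / rated_w

for rated_v, rated_w in [(120, 60), (240, 60)]:
    r = hot_resistance(rated_v, rated_w)
    i = MAINS_V / r   # a cooler, underheated filament would draw a bit more
    p = MAINS_V * i
    print(f"{rated_v}V/{rated_w}W lamp on {MAINS_V:.0f}V mains: {i:.3f} A, {p:.1f} W")
```

The matched 120V/60W lamp gives 0.5A and 60W; the mismatched 240V/60W lamp gives the 0.125A / 15W lower bound from the text.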
Well, with the discharge lamp vs. ballast interface it is basically upside down: the lamp, so the load, is what dictates the voltage, not the ballast. The ballast is then what dictates the current.
So you can not say "the 250W lamp has 2.15A arc current and is rated for 125V arc voltage"; you have to say "the 250W MV is rated for 2.15A and has 125V arc voltage".
The current is whatever the ballast feeds it. So if you connect the 250W lamp to a 400W ballast feeding 3.5A, the arc current will be 3.5A (dictated by the 400W ballast) and the lamp will still have its 125V arc voltage (MV's keep their arc voltage over a wide range of currents), so it will yield about the same real power (~400W) as the 400W lamp would (all the MV's and probe-start MH's happen to have very similar arc voltages: the 100W is designed to have about 110V, the 1kW about 130V across the arc). The power delivered by a given ballast is always the same for lamps having the same arc voltage, regardless of their rating. So if the 3.5A is above the lamp's rated current, the lamp is overdriven (e.g. the 250W 2.25A lamp on a 3.5A "400W" ballast); if it is lower, the lamp is underdriven (a 1kW lamp rated for ~8A run on the 3.5A "400W" ballast).
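A quick numeric sketch of this, using the figures from the text (the 0.9 arc power factor and the exact arc voltages are assumptions for illustration):

```python
# Discharge side: the ballast fixes the current, the lamp fixes the voltage,
# so the power follows the arc voltage, not the lamp's wattage rating.
BALLAST_A = 3.5   # the "400W" MV ballast from the text
PF = 0.9          # assumed arc power factor

lamps = [("250W MV", 125.0), ("400W MV", 125.0), ("1kW MV", 130.0)]
for name, arc_v in lamps:
    p = arc_v * BALLAST_A * PF
    print(f"{name} on the 3.5A ballast: {arc_v:.0f}V x {BALLAST_A}A x {PF} = {p:.0f} W")
```

Both 125V lamps come out at the same ~394W on this ballast, whether they are "rated" 250W or 400W; only the 130V 1kW lamp differs slightly.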
In other words, the 3.5A "400W" ballast feeds 400W not only into the 400W lamp, but into whatever lamp has about the same 125V arc voltage.
When you use a lamp which has a lower arc voltage (let's say 100V) and run it on the same 3.5A ballast, the current stays the same (as it is dictated by the ballast, not the lamp), so the delivered power would be V*A*PF = 100*3.5*0.9 = 315W. (The lamp power factor is below 1, because the voltage and current have different shapes: the voltage is roughly rectangular, while the current is more like a sinewave, so PF=0.9, or from a CWA a triangle, so PF=0.87, ~0.9 as well.)
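Where those two PF figures come from can be checked numerically: a rectangular voltage against an in-phase sine gives PF = 2√2/π ≈ 0.90, and against an in-phase triangle PF = √3/2 ≈ 0.87. This is just a waveform-shape sketch, not a model of any particular ballast:

```python
import numpy as np

# Arc PF from the waveform shapes alone (in-phase, unit amplitudes)
t = np.linspace(0.0, 2.0 * np.pi, 100_000, endpoint=False)
v = np.sign(np.sin(t))                        # ~rectangular arc voltage
i_sine = np.sin(t)                            # current from a plain reactor
i_tri = (2.0 / np.pi) * np.arcsin(np.sin(t))  # triangular current (CWA-like)

def pf(v, i):
    # real power / (Vrms * Irms)
    return np.mean(v * i) / (np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2)))

print(f"square V, sine current:     PF = {pf(v, i_sine):.3f}")  # ~0.900
print(f"square V, triangle current: PF = {pf(v, i_tri):.3f}")   # ~0.866
```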
To say it in different words: just as any incandescent rated for 120V would work on 120V mains regardless of its rated power, any discharge lamp rated for 3.5A would run on a 3.5A ballast, regardless of its rated power. Don't look so much at the ballast "power", as that figure only means what lamp the ballast was originally designed for; it does not say anything about the real power delivered to the lamp.
With CWA and MV/probe-start MH the thing is simple, as the lamps have constant voltage (based on how they are designed) and the CWA feeds constant current (based on its design).
With HPS the thing is way more complex: the HPS arc voltage depends somewhat on the arc current the lamp gets from the ballast, and in order to make the system thermally stable, the ballasts are designed so that their output current depends somewhat on the voltage the lamp has.
So in order to get the real lamp voltage and ballast current, you have to draw both dependencies into one V-A graph; the resulting lamp voltage and ballast current (and so the power) are given by the point where the two lines cross.
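A toy illustration of finding that crossing point. Every number here is a made-up assumption for the sketch, not data for any real HPS lamp or ballast:

```python
# Toy "load line" intersection for an HPS system.

def lamp_v(i):
    # hypothetical HPS arc: voltage rises with arc current
    return 70.0 + 12.0 * i

def ballast_i(v):
    # hypothetical ballast line: 220 V open-circuit behind 30 ohm
    # (treated as resistive here just to sketch the crossing)
    return (220.0 - v) / 30.0

i = 3.0                  # initial guess
for _ in range(100):     # fixed-point iteration walks to the crossing
    i = ballast_i(lamp_v(i))

v = lamp_v(i)
print(f"operating point: {v:.1f} V, {i:.2f} A -> {v * i:.0f} VA")
```

With these made-up curves the iteration settles at roughly 113V and 3.6A; the same graphical/numerical idea applies to real published lamp and ballast characteristics.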