The power into the transformer will equal the power out, plus any losses. The ratio of the voltage in to the voltage out is set by the transformer's turns ratio (the ratio of the windings). Similarly, the ratio of the current in to the current out follows that same turns ratio, but inverted: step the voltage down by half and the available current doubles, and vice versa.
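For an ideal (lossless) transformer, those relationships can be sketched in a few lines of Python. The turns counts and the 240V/50A figures here are just made-up illustration values:

```python
def ideal_transformer(v_in, i_in, n_primary, n_secondary):
    """Ideal transformer: power in equals power out, voltage scales
    with the turns ratio, current scales with its inverse."""
    ratio = n_secondary / n_primary
    v_out = v_in * ratio   # voltage follows the turns ratio
    i_out = i_in / ratio   # current is inverted by the same ratio
    assert abs(v_in * i_in - v_out * i_out) < 1e-9  # power conserved
    return v_out, i_out

# A 2:1 step-down with 240 V at 50 A on the primary:
v, i = ideal_transformer(240, 50, n_primary=2, n_secondary=1)
print(v, i)  # 120.0 100.0
```

Note the power on both sides is the same 12,000 W; only the voltage/current split changes.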

How much power is drawn will be determined by the loads you connect. If the load requires 100A at 120V, it will try to draw that. Whether the transformer is capable of delivering 100A without burning up is a different story, and depends on the construction of the transformer (e.g. the wire gauge used internally).

So the panel has to be able to deliver the power required (e.g. 50A at 240V), the transformer has to be able to handle that level of power based on its specifications, and the wiring on both sides has to be sized for the amperage it will carry.
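As a rough sanity check of those three constraints (ignoring losses), you can work the numbers both ways. The ratings below are hypothetical illustration values, not advice; check real nameplates and your local electrical code:

```python
# Hypothetical figures for illustration only.
panel_breaker_a = 50            # breaker feeding the primary, amps
v_primary, v_secondary = 240, 120
transformer_rating_va = 12000   # transformer nameplate rating
load_a = 100                    # current the load wants on the secondary

load_va = load_a * v_secondary       # 12,000 VA demanded by the load
primary_a = load_va / v_primary      # 50 A drawn from the panel

print("panel ok:", primary_a <= panel_breaker_a)            # True
print("transformer ok:", load_va <= transformer_rating_va)  # True
# Wiring: the primary side carries 50 A, the secondary side 100 A,
# so each side's conductors must be sized for its own current.
```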

If you had a 208V to 240V transformer connected to a 240V to 120V transformer, and everything was powered by a 208V 50A circuit, that would still not give you 120V at 100A. You'd still only have about 86.7A (10,400W ÷ 120V), minus the losses of both transformers. In fact you'd be worse off than with a single transformer, because you'd pay the losses twice.
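The arithmetic for that two-transformer chain, again assuming ideal (lossless) transformers, looks like this:

```python
p_in = 208 * 50        # 10,400 W available from the 208 V / 50 A circuit

# First transformer: 208 V -> 240 V. Power passes through unchanged,
# so the current at 240 V drops.
i_240 = p_in / 240     # about 43.3 A

# Second transformer: 240 V -> 120 V. Same power again, so the
# current rises -- but only to what 10,400 W supports at 120 V.
i_120 = p_in / 120     # about 86.7 A, still not 100 A

print(round(i_240, 1), round(i_120, 1))  # 43.3 86.7
```

No arrangement of transformers can get around the input power cap; real units only subtract from it.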