It's defined exclusively by the resistivity of copper and the amount of power dissipation you can live with (temp rise, cooling capability, etc.).
A superconductor, by contrast, can be made at any size, always dissipates zero power, and stores as much energy as its operating temperature and critical field allow.
A piece of resistance wire cannot produce much magnetic field without getting excessively hot, no matter the size.
So, conductivity enters in direct proportion. And resistance (the lack of conductivity) makes heat, so the limit is how much heat you can remove for a given safe temp rise.
I'm taking your question to mean, "assuming some given geometry, how should the dimensions be scaled for a given maximum power dissipation and temp rise?" On that reading, a low power magnet can be small, with power density high enough to reach that temp rise, and it will produce whatever field strength corresponds to all that copper and current. Whereas a very large one is limited by its surface area (e.g., electromagnets handling cars in the scrap yard), and may be limited to a much lower H at its most intense point, without overheating.
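To make that concrete, here's a rough back-of-envelope sketch in Python. All the numbers are assumptions for illustration (copper resistivity, a natural-convection coefficient around 10 W/m²·K, a simple air-core solenoid with the long-solenoid approximation), not anything from your question:

```python
import math

# --- Assumed values (illustrative only) ---
RHO_CU = 1.68e-8      # copper resistivity, ohm*m, near 20 C
H_CONV = 10.0         # natural-convection coefficient, W/(m^2*K), rough guess

def solenoid_estimate(r_mean, length, n_turns, wire_diam, current):
    """Crude estimate of dissipation, temp rise, and field for an
    air-core solenoid winding (long-solenoid approximation)."""
    wire_area = math.pi * (wire_diam / 2) ** 2    # conductor cross-section
    wire_len = n_turns * 2 * math.pi * r_mean     # total wire length
    resistance = RHO_CU * wire_len / wire_area    # DC resistance
    power = current ** 2 * resistance             # I^2 R dissipation

    # Cooling surface: outer cylinder only (pessimistic, lumped model)
    surface = 2 * math.pi * r_mean * length
    temp_rise = power / (H_CONV * surface)        # steady-state delta-T

    field_H = n_turns * current / length          # H = N*I/l, A/m, ignores end effects
    return power, temp_rise, field_H

# Example: 500 turns of 1 mm wire, 30 mm mean radius, 100 mm long, 2 A
P, dT, H = solenoid_estimate(r_mean=0.030, length=0.100,
                             n_turns=500, wire_diam=1.0e-3, current=2.0)
print(f"Dissipation: {P:.1f} W, temp rise: ~{dT:.0f} K, H: {H:.0f} A/m")
```

With those numbers you get around 8 W and a ~40 K rise, which shows the tradeoff directly: more field means more current density, and dissipation grows as its square while the cooling surface stays put.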
What I'm not taking your question to mean is whether the magnetic field is the maximum possible strength for a given size, or anything like that. That's a matter of geometry, and those are hard questions answered by extensive computation or simulation. Everything else is just scaling and power tradeoffs.
The peak field intensity should scale down with increasing size, because cooling capacity scales with surface area while dissipation scales with volume: the allowable power goes as a 2/3 power of volume, so allowable power density falls as size grows (unless additional cooling is provided). Though field intensity is dropping, total energy storage (or putting it another way, the total volume of relatively intense field) scales up with size.
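As a sanity check on that scaling, here's a tiny sketch. It assumes geometrically similar magnets, cooling limited by surface area at a fixed allowable heat flux, and everything else held equal:

```python
# Scaling sketch: geometrically similar magnets, cooling limited by
# surface area at a fixed allowable heat flux (assumed constraint).
for s in (1, 2, 4, 8):                 # linear scale factor
    surface = s ** 2                   # ~ cooling capacity, hence max power
    volume = s ** 3                    # ~ copper volume, ~ dissipation at fixed J
    power_density = surface / volume   # allowable W per unit volume ~ 1/s
    # Equivalently: max power ~ volume^(2/3), the "2/3 law"
    print(f"scale x{s}: max power ~ {surface}, volume ~ {volume}, "
          f"power density ~ {power_density:.3f}")
```

Each doubling of size buys 4x the power budget but 8x the copper, so the winding must run at half the power density: exactly the scrap-yard-magnet situation above.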
Tim