Yes. This is an issue for e.g. switched-mode PSUs, where you may have large MOSFETs with significant gate capacitance, and you're switching at hundreds of kHz.
When using a MOSFET as a switch, the power dissipation is low when it's fully off (low current) or fully on (low voltage drop), but high when it's in between (both current and voltage drop are significant). Consequently, when switching large currents, you want to spend as little time in the linear region as possible, which means charging and discharging the gate capacitance as quickly as possible.
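To put a rough number on that linear-region dissipation, here's a back-of-envelope switching-loss estimate using the common triangular V·I overlap approximation. All figures (voltage, current, transition time, frequency) are illustrative assumptions, not from any particular datasheet:

```python
# Rough switching-loss estimate for a MOSFET in a switched-mode PSU.
# All numbers below are illustrative assumptions.
V = 48.0        # drain voltage being switched (V)
I = 10.0        # load current (A)
t_sw = 50e-9    # time spent crossing the linear region per transition (s)
f = 300e3       # switching frequency (Hz)

# Triangular overlap approximation: E = 0.5 * V * I * t_sw per edge,
# with two edges (turn-on and turn-off) per switching cycle.
energy_per_edge = 0.5 * V * I * t_sw
p_switching = 2 * energy_per_edge * f
print(f"Switching loss ~ {p_switching:.1f} W")
```

Note how the loss scales directly with both switching frequency and transition time, which is exactly why fast gate charging matters at hundreds of kHz.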
So while an idealised MOSFET has infinite gate resistance (and a practical MOSFET isn't far from that), the combination of significant gate capacitance and fast switching times means that power-MOSFET drivers are designed to source and sink pulses of several amps to and from the gate.
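You can see where those multi-amp pulses come from with a quick calculation from the total gate charge Qg (a figure given on every power-MOSFET datasheet). The Qg value and target switching time here are assumed, typical-looking numbers for a large device:

```python
# How much current a gate driver must source to switch within t_sw.
# Qg is an assumed total-gate-charge figure for a large power MOSFET.
Qg = 100e-9    # total gate charge (C)
t_sw = 50e-9   # desired switching time (s)

i_gate = Qg / t_sw   # average current needed: I = Q / t
print(f"Required gate drive current ~ {i_gate:.1f} A")
```

So moving 100 nC in 50 ns already demands an average of 2 A from the driver, even though the steady-state gate current is essentially zero.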
However, this isn't something you necessarily need to worry about for driving an LED. Even if you're using PWM to vary the brightness, the switching frequency doesn't need to be more than a few hundred Hz. And a MOSFET designed for less than 1A will have much lower gate capacitance than one rated at 50A or more. Finally, efficiency is much less of a concern at LED power levels; for comparison, a 1kW PSU that is "only" 95% efficient still produces 50W of heat which needs to be removed.
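For comparison with the PSU case above, here's the same kind of estimate for the LED scenario: a small logic-level MOSFET PWM'd at a few hundred Hz. The gate charge and drive voltage are assumed illustrative values for a small device:

```python
# Average gate-drive burden at LED-dimming PWM rates.
# Qg and Vgs are assumed values for a small logic-level MOSFET.
Qg = 5e-9     # total gate charge (C)
Vgs = 5.0     # gate drive voltage (V)
f = 500.0     # PWM frequency (Hz)

avg_current = Qg * f       # average current drawn from the GPIO pin (A)
p_gate = Qg * Vgs * f      # power spent charging/discharging the gate (W)
print(f"Average gate current ~ {avg_current*1e6:.1f} uA, "
      f"gate power ~ {p_gate*1e6:.1f} uW")
```

A couple of microamps on average is trivially within what any microcontroller pin can deliver, which is why no dedicated gate driver is needed here.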
The main consideration for using an FET with a microcontroller is that you want one which will saturate (turn fully on) at a sufficiently low gate voltage. These are normally termed "logic level" FETs, meaning that the output from a (nominally) 5V device will be sufficient to drive the gate directly. The data sheet for a device will normally have a graph of channel resistance (Rds(on)) against gate-source voltage (Vgs).
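Once you've read Rds(on) at your gate voltage off that graph, a one-line check confirms the FET stays cool. The Rds(on) and LED current below are assumed example figures, not from a specific part:

```python
# Sanity check: conduction loss in a fully-on logic-level FET.
# rds_on is an assumed datasheet value at Vgs = 5 V; i_load is an
# assumed LED string current.
rds_on = 0.1    # channel resistance when fully on (ohms)
i_load = 0.35   # LED current (A)

p_conduction = i_load**2 * rds_on   # P = I^2 * R
print(f"Conduction loss ~ {p_conduction*1000:.1f} mW")
```

A few milliwatts of dissipation means no heatsink and no measurable warming, provided the gate voltage is high enough to actually reach that datasheet Rds(on).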