I pick a value equal to, or somewhat lower than, (output delta V) / (rated peak I) of the gate driver. So a "3A" driver at 12V works out to 4 ohms, so it gets 3.3 or 4.7 ohms. Doesn't impact performance much (it still delivers ~3A peak), and helps ensure good behavior.
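That rule of thumb is just arithmetic plus rounding down to a stock resistor value. A minimal sketch (the function name and the choice of the E6 value series are mine, not from any datasheet):

```python
# Rule-of-thumb external gate resistor:
#   R ~= (driver output swing) / (rated peak current),
# then round DOWN to a standard value so the driver still reaches
# roughly its rated peak current.

E6 = [1.0, 1.5, 2.2, 3.3, 4.7, 6.8]  # standard E6 decade values, in ohms

def gate_resistor(v_drive, i_peak):
    """Largest E6 value (1-10 ohm decade) at or below v_drive / i_peak."""
    r_target = v_drive / i_peak          # e.g. 12 V / 3 A = 4.0 ohms
    below = [r for r in E6 if r <= r_target]
    return max(below) if below else E6[0]

print(gate_resistor(12, 3))   # -> 3.3
```

With a 12V, "3A" driver this lands on 3.3 ohms; the post's 4.7-ohm option is the next value up, the "equal to or somewhat lower" judgment call.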
Funny thing is, most FETs have about as much resistance internally (if you're lucky enough to find a datasheet that specifies the equivalent internal Rg -- or a SPICE model that approximates it with a comparable value), so the external resistor should actually be halving the performance, crudely speaking. As time goes on I've been ratcheting down the resistors in my designs, and I haven't had a problem yet.
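The "halving" estimate is just the drive voltage divided by the total loop resistance -- a crude sketch, treating the driver as an ideal voltage source behind the external and internal gate resistances (numbers below are the post's 12V / 4-ohm example, with an assumed comparable internal Rg):

```python
def peak_gate_current(v_drive, r_ext, r_int):
    # Crude peak-current estimate: drive swing over total gate-loop resistance.
    # Ignores inductance and the driver's own output impedance.
    return v_drive / (r_ext + r_int)

# 4 ohms external plus a comparable ~4 ohms inside the FET:
print(peak_gate_current(12.0, 4.0, 4.0))   # -> 1.5 (A), half the "3 A" rating
```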
Well, there was that one breadboard, where I made a 12V 10ns gate driver (not quite as fast as those flat-body, wide-lead RF driver chips IXYS-DEI makes, but I have a different design, better than that), running at 3MHz or so. Since it was breadboarded (i.e., dead-bugged), I ran nice long, wide striplines out to the power section, which made a nice tone-burst generator every time a transition hit the transistor. A teeny ferrite bead cured that with almost no impact on switching speed. The poor little ferrite bead does get awfully hot at 3MHz, though.
Tim
--
Deep Friar: a very philosophical monk.
Website: http://seventransistorlabs.com
"Vladimir Vassilevsky" wrote in message
news:9fadne4oN-F9-lrNnZ2dnUVZ5j2dnZ2d@giganews.com...
> When they draw connection diagrams for power FETs or FET drivers in
> datasheets, they usually put 10 Ohm resistor in series with the
> gate. What is the rationale behind that? How did they come up with almost
> universal value of 10 Ohm?
>
> I could understand slowing down edges for EMC reason, or limiting gate
> current, or dumping stray inductance to avoid oscillation on
> transitions. That depends on particular application. But why always 10
> Ohms ?
>
> Vladimir Vassilevsky
> DSP and Mixed Signal Consultant
> www.abvolt.com