In article , snipped-for-privacy@green.rahul.net (Ken Smith) writes:
|> In article , Nick Maclaren wrote:
|>
|> [... booth's divide ..]
|> >The mind boggles! I have never seen it used, but I can believe that
|> >it is. The most usual software divide is Newton-Raphson.
|>
|> Are you referring to the usual "find 1/X and then multiply by that" method?
No. That is a component of the method, but you need to do a bit more to get it quite right.
|> Booth's divide is more common in microcontroller land. If you have a micro
|> that doesn't natively do 32 bit divides and an integer math library,
|> chances are you are using a Booth's divide.
Boggle. Well, if performance isn't a major issue, I suppose that it doesn't matter what you use.
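What such integer libraries typically implement is a one-bit-per-iteration shift-and-subtract (restoring) divide, often loosely filed under "Booth's divide". A minimal sketch (my naming, unsigned 32-bit only):

```c
#include <stdint.h>

/* Restoring shift-and-subtract divide: the kind of loop an integer
   math library uses on a micro with no divide instruction.  One
   quotient bit per iteration, so 32 iterations for a 32-bit result.
   Caller must ensure d != 0. */
static uint32_t soft_udiv32(uint32_t n, uint32_t d, uint32_t *rem)
{
    uint32_t q = 0, r = 0;
    for (int i = 31; i >= 0; i--) {
        r = (r << 1) | ((n >> i) & 1);  /* bring down next dividend bit */
        if (r >= d) {                   /* does the divisor go in?      */
            r -= d;
            q |= 1u << i;               /* set this quotient bit        */
        }
    }
    if (rem) *rem = r;
    return q;
}
```

The appeal is that it needs no multiplier at all, only shifts and subtracts, at the cost of a fixed 32-iteration loop.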
|> > Whatever
|> >the case, only a few systems lack hardware division for the standard
|> >types, very few applications use it for non-standard types, and the
|> >proportion of time spent in such code is minuscule!
|>
|> Actually this isn't really true. There are a huge number of processors
|> out there that don't do the divides in hardware for at least some of the
|> types that are commonly used on them. These are not desktop PCs. They
|> are systems that contain a processor but are not general purpose
|> computers.
The context was one where branch prediction of the division operation is critical for performance, as the paragraph above makes clear. If performance of the division IS a major issue, can you explain why people use Booth's algorithm IN SOFTWARE? As I said, the mind boggles!
[ All right, I do know of one case. If you have NEITHER hardware division NOR reasonable hardware multiplication, it is a real pain to code Newton-Raphson. But anyone who runs codes where division is a bottleneck on such a system is off his tiny mind. ]
Regards, Nick Maclaren.