CPU and peripheral clocks

Why do peripheral clocks run at the CPU clock divided by 2, 4, or 8?

Why can't they run at CPU core clock speed?

This is just a general question and not restricted to any processor or architecture.

Reply to
vivek

It's as simple as this: PERIPHERALS ARE SLOWER than the processor.

-dK

Reply to
dk

If you run the peripheral at a higher speed than the CPU, you can lose synchronization between the CPU and the peripherals.

Reply to
badal_akr

In theory, they could.

In practice, it usually takes more effort, more silicon area and higher power consumption to run at higher speed. At the highest speeds, it may not even be possible to achieve.

Putting a lot of effort into an optimized CPU core makes sense, since it can be reused across a large number of peripheral configurations.

Also, it is usually not critical for peripherals to run at the highest speed.

Reply to
Arlet

Basically, because the clock is too fast. There are two or three primary effects on clock propagation.

One is simple delay. It takes time for a signal to travel along a line. That line is longer when the peripheral is external to the chip.

Another is capacitance. A proximate conductor can add loading, and disturb the effective period of the clock.

A third is gain and trigger points. An amplifier used to regenerate the clock on the peripheral will have values for these. They will also disturb the timing.

The fundamental purpose of the clock is to synchronize things. Often the idea is that signals start to change when the clock rises, and get used when the clock falls. With the above, and other (such as noise) disturbances we no longer know when those signals are valid. It is already amazing that clocks can be as fast as they are.

--
 [mail]: Chuck F (cbfalconer at maineline dot net) 
Reply to
CBFalconer

Thanks a lot. Very useful for me.

When programming a chip, to set the initial configuration, how do I select whether a peripheral should run at CPU clock/2 or CPU clock/4?

These peripherals are inside the chip (like timers, etc.).

Reply to
vivek

You need to know how fast you want to run the peripheral and set the clock speed at the slowest rate that will support that. The only real reason for using a slower programmable rate is to minimize power consumption. Otherwise feel free to run the peripheral clock as fast as it will go!

Reply to
rickman

Not necessarily; there are several AVRs that implement a PLL allowing a timer to run at 64 MHz while the CPU core runs at 1-8 MHz.

Many AT91 processors implement a dual PLL, allowing the CPU to run at one clock frequency, while other things (like USB) run at a higher clock frequency.

--
Best Regards,
Ulf Samuelsson
Reply to
Ulf Samuelsson

They can. Which chip(s) are you talking about?

In some chips, the peripherals can clock faster: USB and special PWM timers are examples.

It would be nice to clock ANY peripheral FASTER than the core in some cases, but life gets tricky here: those peripherals need to interact with the SFR flags, and it is not trivial to let fast hardware and a (now slower) core share the same SFR flags.

So most uCs that have clock prescalers change the clock to the core and the peripherals at the same time, because that avoids any special flag handling.

-jg

Reply to
Jim Granville

For synchronisation between the CPU and the peripherals.

A peripheral is designed for the maximum speed it actually requires, so that it runs efficiently; extra speed would go unused. Further, higher speed is expensive to design for, and most peripherals simply do not need it.

Karthik Balaguru

Reply to
karthikbalaguru
