Hi all,
I've obtained "The 8051 Microcontroller and Embedded Systems Using Assembly and C" (2nd Ed.), and in this book the DELAY routine is described as below, which really confuses me.
######################################
In the original 8051, one machine cycle lasts 12 oscillator periods. So with a 12 MHz crystal, the machine-cycle rate is 12 MHz / 12 = 1 MHz, and one machine cycle therefore lasts 1 / 1 MHz = 1 us (1 microsecond).
For 12 MHz, the delay for this example is as follows:

DELAY:  MOV  R3, #200    ; 1 machine cycle
REPEAT: DJNZ R3, REPEAT  ; 2 machine cycles
        RET              ; 2 machine cycles
        END
Therefore the above routine gives a delay of 403 us:
"#200" equals a delay of [(200 x 2) + 1 + 2] x 1 us = 403 us (403 microseconds).
######################################
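To make sure I'm reading the book's arithmetic right, here is my own quick check of the 403 us figure (the `delay_us` helper is just my sketch, not from the book), using the cycle counts quoted above:

```python
def delay_us(count, cycle_us=1.0):
    """Total delay of the book's DELAY routine for a given MOV immediate.

    Cycle counts from the quoted text (original 12-clock 8051):
    MOV Rn,#data = 1 cycle, DJNZ = 2 cycles per iteration, RET = 2 cycles.
    """
    return (1 + 2 * count + 2) * cycle_us

print(delay_us(200))  # 403.0 -- matches the book's 403 us
```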
If the above is 100% correct, what would a 1-second delay look like?
I see a few examples showing a 1-second delay as "MOV R1, #0FFH", but I can't seem to "reverse engineer" the "#0FFH" back to 1 second using the above information.
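Here is my own back-of-the-envelope attempt (my assumptions, not from the book): applying the same cycle counts, a single 8-bit DJNZ loop with #0FFH tops out around half a millisecond, and even my guess at a two-level nested loop falls well short of the 1 000 000 us I'd need, which is why I can't get #0FFH to add up to 1 second:

```python
def single_loop_us(count):
    # Same accounting as the book's routine:
    # MOV (1) + count * DJNZ (2) + RET (2), at 1 us per machine cycle.
    return 1 + 2 * count + 2

print(single_loop_us(0xFF))  # 513 -- nowhere near 1 000 000 us

def nested_loop_us(outer, inner):
    # Hypothetical two-level loop (my guess at the structure): each outer
    # iteration reloads the inner counter (MOV, 1 cycle), runs the inner
    # DJNZ loop (2 * inner), then takes the outer DJNZ (2 cycles).
    return 1 + outer * (1 + 2 * inner + 2) + 2

print(nested_loop_us(0xFF, 0xFF))  # 130818 -- only ~0.13 s
```

So it looks to me like a 1-second software delay would need a third loop level (or a hardware timer), but I'd like someone to confirm whether my reasoning is right.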
Can someone please help me understand this? I don't want to just use it without knowing what I'm doing.
Kind regards,
Lodewicus Maas