I am setting up the register definitions and initialization constants for a new processor. It is a peripheral-rich part, with over 400 registers defined. For the most part, the registers initialize to zero. Most bits, whether defined or not, are R/W: they read as zero and must be written as zero. But there are exceptions: undefined bits that read as zero but must be written as ones.
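A minimal sketch of how I am capturing those exceptions, assuming hypothetical register addresses and field names; each table entry carries the desired init value plus a mask of the undefined bits that must be written as ones:

    #include <stdint.h>

    typedef struct {
        volatile uint8_t *addr;  /* register address (hypothetical)      */
        uint8_t init;            /* value the application wants written  */
        uint8_t ones_mask;       /* undefined bits that must be set to 1 */
    } reg_init_t;

    static const reg_init_t reg_table[] = {
        { (volatile uint8_t *)0x40001000u, 0x00u, 0x00u }, /* typical: all zero   */
        { (volatile uint8_t *)0x40001004u, 0x00u, 0x30u }, /* bits 5:4 write as 1 */
    };

    static void write_reg_table(void)
    {
        for (unsigned i = 0; i < sizeof reg_table / sizeof reg_table[0]; i++)
            *reg_table[i].addr = (uint8_t)(reg_table[i].init | reg_table[i].ones_mask);
    }

Keeping the "write as one" bits in a separate mask rather than folding them into the init value makes the exceptions easy to audit against the data sheet.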
There is one register on the part that assigns the DMA control pins. It is an 8-bit register with two bits assigned to each of four DMA controllers, but each two-bit field has only one valid setting; the other three values are prohibited. Setting these bits to zero is defined as prohibited, yet the reset value of the register is 0x00! Hopefully this is just an erratum, but if not, it does not inspire a lot of confidence.
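So the legal value has to be built explicitly and written by the startup code. The 0b01 code per field below is purely an assumption for illustration; the real legal code comes from the data sheet:

    #include <stdint.h>

    #define DMA_PIN_VALID_CODE  0x1u                      /* assumed legal 2-bit code */
    #define DMA_PIN_FIELD(n)    (DMA_PIN_VALID_CODE << (2u * (n)))
    #define DMA_PIN_ASSIGN_INIT ((uint8_t)(DMA_PIN_FIELD(0) | DMA_PIN_FIELD(1) | \
                                           DMA_PIN_FIELD(2) | DMA_PIN_FIELD(3)))
    /* Works out to 0x55, not the 0x00 the register resets to, so it must
       be written before any DMA controller is enabled. */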
It seems like processors now carry such a feature-rich set of peripherals that each pin has 4 to 6 possible functions. Initialization is no longer trivial, and recovery from erratic events becomes harder. We always include a full, program-defined initialization sequence as part of the startup code (which is why I am creating and verifying these definitions).
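The rough shape of that startup sequence, with placeholder names; the point is that nothing relies on the silicon's reset values, and the same routine runs after power-on and after fault recovery:

    void write_reg_table(void);               /* table-driven init, as sketched above */

    static void select_pin_functions(void)   { /* pick 1 of the 4-6 functions per pin */ }
    static void start_used_peripherals(void) { /* enable only the peripherals we use  */ }

    void system_full_init(void)
    {
        write_reg_table();
        select_pin_functions();
        start_used_peripherals();
    }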
I understand that there is a lot of logic in the chip, and I can understand sub-setting, where bits are only defined in higher-performance members of a product family. But surely there is enough space for inverters and read-back logic to keep a consistent set of bit definitions. It seems like more and more products are just being sewn together Frankenstein-style rather than being carefully thought out. Anyone else seeing this? What about ARM-based products?
b. Farmer