An interesting chip design built around an async processor controls a single I/O pin with the conventional high/low and drive/hi-Z bits. A crystal is connected to this pin, and the goal is to make it oscillate.
To start, the crystal is driven with a square wave of approximately the right frequency, generated by a timing loop in the async processor. This drive loop runs for a given number of cycles and then exits. The I/O pin is then monitored for a period of time, watching for a low-to-high transition, which indicates the crystal has absorbed enough energy to ring. If no transition occurs within that window, the timing-loop count is incremented and the process is repeated. In this manner the initialization drive is scanned across the range of software timer values expected given the PVT (process, voltage, and temperature) variations that affect the timing loop in the async processor.
Once a drive frequency makes the crystal ring, oscillation is sustained by driving the output high for a minimum amount of time each time the input transitions from low to high. This way the crystal controls the rate of the circuit, and the drive from the processor simply responds.
Someone has gotten this to work in the lab for a wide range of crystal frequencies up to 16 MHz. I can't see a good way to analyze this circuit the way one would analyze a typical oscillator that uses an analog amplifier and a few passives to drive the crystal. I'd like to figure out a way to analyze this digital circuit to know how close it is to the edge of not working, rather than having to test the crap out of it with a range of parts, etc.
Any thoughts on whether it is important to isolate the crystal from a DC bias with a series capacitor? During the initial drive the crystal will see a DC bias of about half Vcc. The rest of the time the DC component of the drive will be much lower. But the standard crystal model has no parallel resistor or inductor to pass a DC current, so a blocking capacitor might be needed. Would a DC bias on the crystal of no more than a volt impact its life or operation?