Several times. Sometimes it's easier to bit-bang it than it is to figure out how to get a broken I2C controller to work right. [Not to mention any names.]
It can be pretty easy.
If you're doing it on bare metal with predictable timing for delay loops and without interrupts, and you don't try to implement clock-stretching, it's quite easy.
For the data signal you need both an input pin and an open-collector output pin. If you can read back the electrical level on an open-collector output pin (as opposed to the output latch value), then that's all you need for data.
If you're not doing clock-stretching, then all you need for the clock is a generic output pin (it doesn't need to be open-collector, and you don't need to read it). If you do want to implement clock-stretching, then the clock signal has the same requirements as the data signal.
If you want to try to use some sort of interrupt-driven counter/timer output or waveform generator peripheral, then it may take a bit more work. But, if you want predictable timing in an interrupt-driven multi-threaded environment that's probably what you'd need.
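For instance, a minimal polled master might look like the sketch below (not from the posts above, just an illustration). It assumes a calibrated delay_us() busy-wait and hypothetical macros SDA_LOW()/SDA_RELEASE()/SDA_READ() and SCL_LOW()/SCL_RELEASE() that emulate open-drain by switching between driving the line low and releasing it to the external pull-up:

    static void i2c_start(void)
    {
        SDA_RELEASE(); SCL_RELEASE(); delay_us(5);
        SDA_LOW();     delay_us(5);      /* data falls while clock is high */
        SCL_LOW();
    }

    static void i2c_stop(void)
    {
        SDA_LOW();     SCL_RELEASE(); delay_us(5);
        SDA_RELEASE(); delay_us(5);      /* data rises while clock is high */
    }

    /* Write one byte, MSB first; returns 1 if the slave ACKed. */
    static int i2c_write_byte(unsigned char b)
    {
        int i, ack;
        for (i = 0; i < 8; i++) {
            if (b & 0x80) SDA_RELEASE(); else SDA_LOW();
            b <<= 1;
            delay_us(2);
            SCL_RELEASE(); delay_us(5);  /* slave samples while clock is high */
            SCL_LOW();     delay_us(2);
        }
        SDA_RELEASE();                   /* let the slave drive the ACK bit */
        SCL_RELEASE(); delay_us(5);
        ack = (SDA_READ() == 0);         /* low == ACK */
        SCL_LOW();
        return ack;
    }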
--
Grant Edwards (grant.b.edwards)        Yow! I wish I was a sex-starved manicurist
Do you have a free interrupt for the clock pin, and do you have a free timer? I bet you can map the I2C state machine onto those two things rather than polling the interface. That said, I haven't done it. I have done this for a UART receiver: rather than sample at each clock interval, I use an interrupt on input transitions (assuming no noise on the line, obviously) and measure the position with the timer. It's easy to figure out how many bits have gone by since the last transition.
I2C starts with a high-to-low transition on the data line while the clock is high, and the stop condition is a low-to-high transition on the data line while the clock is high. Otherwise, all data transitions happen while the clock is low. So at the end of a message you will need to poll or something to catch the stop condition. Otherwise: interrupt on the falling edge of the data line when you are waiting for the start condition, then on the rising edge of the clock when expecting data, sample the data, and Bob's your uncle.
I guess I'm assuming you are implementing the slave. I think the master is easier and can be driven completely from a timer interrupt unless you want to implement clock stretching.
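A rough sketch of what that slave receive state machine might look like (my guess at it, with hypothetical edge-configurable GPIO interrupt handlers and SDA_READ()/SCL_READ() macros; ACK generation and stop detection are left out):

    static enum { IDLE, RECEIVING } state = IDLE;
    static unsigned char shift;
    static int nbits;

    void sda_falling_isr(void)     /* armed only while IDLE */
    {
        if (SCL_READ()) {          /* SDA fell while SCL high: start */
            state = RECEIVING;
            shift = 0;
            nbits = 0;
            /* re-arm: now interrupt on SCL rising edges instead */
        }
    }

    void scl_rising_isr(void)      /* armed only while RECEIVING */
    {
        if (state != RECEIVING)
            return;
        shift = (unsigned char)((shift << 1) | SDA_READ());
        if (++nbits == 8) {
            byte_received(shift);  /* hypothetical callback */
            nbits = 0;
        }
    }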
Actually, the project where I was a spectator to the process tried to use a Philips I2C interface chip that had a really screwy microprocessor interface -- it was supposed to autodetect an 8080-style interface, but when connected to an 80186 all it managed to do was lock up. So, we bit-banged.
Not I2C, but I've bitbanged One-Wire, including enumeration (though not high-speed), in an MSP430. I assume this relates to your battery gauge question? If so, I'd think One-Wire was preferable - only one pin to diddle.
However, to get bit timings right, the important question is whether you can afford to stop the world while doing a data transfer. Trying to interrupt for each bit transition might be too hard. I guess you wouldn't need to read the gas gauge very often - perhaps you can pick a time when you can lock out all other interrupts?
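As a sketch of the "stop the world" option (timing values are standard-speed ballpark figures from memory; OW_LOW()/OW_RELEASE()/OW_READ(), delay_us(), and the interrupt-mask helpers are all assumed names):

    static int ow_read_bit(void)
    {
        int bit;
        disable_irq();                /* lock out interrupts so the delays hold */
        OW_LOW();     delay_us(3);    /* start the read slot */
        OW_RELEASE(); delay_us(10);
        bit = OW_READ();              /* sample while the slave may hold it low */
        delay_us(50);                 /* let the ~60 us slot finish */
        enable_irq();
        return bit;
    }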
Just make sure the I2C clock/data are connected to MCU pins that can be configured as open-collector (or open-drain), that there is a footprint for an external pull-up resistor on each, and that the software can read the electrical level (0/1) of each of those signals.
That means that if the software outputs '1' on both those pins, and somebody else is pulling them low, the software needs to be able to read those pins and see that they're low -- as opposed to just reading back the '1' value that was written to the output latches for those pins.
On _some_ MCUs that means the clock and data signals each have to be connected to both an open-collector output pin and to an input pin.
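To make the requirement concrete, here's a sketch of waiting out a clock stretch; PIN_REG and SCL_BIT are hypothetical names for the input-level register and the clock pin's bit:

    static void scl_release_and_wait(void)
    {
        SCL_RELEASE();                    /* output '1': open-drain, so the line floats high */
        while ((PIN_REG & SCL_BIT) == 0)  /* a slave is still holding SCL low */
            ;                             /* spin until the line really goes high */
    }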
--
Grant Edwards (grant.b.edwards)        Yow! If I felt any more SOPHISTICATED I would DIE
The physical part is simple, but there is not really a standard, and there is lots of stuff that you almost have to bit-bang because the "naive" 8-bit serial doesn't cut it: e.g. weird numbers of bits, clock phase changing between the read and write parts of the serial stream, delays needed in weird places, data lines doubling as busy signals, etc. See the sketch below for one made-up example.
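For example (a made-up part, just to illustrate): a 12-bit word with a required pause before the last four bits, which no stock 8-bit SPI block will do for you. DOUT()/CLK_HIGH()/CLK_LOW() and delay_us() are assumed helpers:

    static void send12(unsigned w)
    {
        int i;
        for (i = 11; i >= 0; i--) {
            if (i == 3)
                delay_us(20);       /* the datasheet's delay in a weird place */
            DOUT((w >> i) & 1);     /* data changes while the clock is low */
            CLK_HIGH(); delay_us(1);
            CLK_LOW();  delay_us(1);
        }
    }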
I'm confused now. I thought you were building the interface into the analog chip you were designing. Are you saying you can't find a suitable MCU with SPI? Or are you saying you are using an I2C interface slave chip and using the I/Os to control your chip?
I have vague memories that some old Intel uControllers acted like that. I can't think of anything I've used in the past 10-15 years where it was a problem, but it can't hurt to check on it before you lay out a board. :)
Interesting. I originally saw only ones like that.
Basically, if your MCU has pairs of "port data registers" (PDR) and "data direction registers" (DDR), odds are that a PDR is (or used to be, in its original versions) a single register which on read delivers either the port's designated output value or the actual input level, depending on whether the DDR configures the pin as an output or an input.
This gradually improved over the last 10 years or so, as MCU designers made pins cleverer with each generation, at the expense of making them more complex to handle. Nowadays a single port pin may require more than 10 registers to be configured before it can be used: turn off "higher" peripherals using the same pin or configure the crossbar to map the pin to I/O, set port direction, digital vs. analog, input amplifier enable, hi/lo thresholds, filters, output drive strength, pull-up, pull-down, configure clock and power supply to the relevant sub-sections of the MCU, and so on.
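On a classic PDR/DDR part like the one described above, open-drain emulation falls out of exactly that read-back rule. A sketch, with PDR/DDR as hypothetical register names (DDR bit set = output):

    #define SDA_BIT 0x01

    /* Drive low: latch a 0 and make the pin an output. */
    static void sda_low(void)     { PDR &= ~SDA_BIT; DDR |= SDA_BIT; }

    /* Release: make the pin an input and let the external pull-up raise it. */
    static void sda_release(void) { DDR &= ~SDA_BIT; }

    /* Because releasing switched the DDR to input, reading the PDR
       now returns the electrical level on the pin, not the latch. */
    static int sda_read(void)     { return (PDR & SDA_BIT) != 0; }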