Has anyone successfully bit-banged an I2C interface? Any comments on its ease or difficulty?

I've used I2C with processors that have built-in hardware, I've sat next to a guy who's bit-banged it, but I've never bit-banged it myself.

I ask as a circuit designer who's going to design in an I2C chip to be presented to a software engineer with a smile and a "here, this ought to be easy!" I want to know whether I'll be honest or not.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Bit-banging I2C
Started by ●March 16, 2016
Reply by ●March 16, 2016
On 16.3.2016 г. 20:49, Tim Wescott wrote:
> Has anyone successfully bit-banged an I2C interface? Any comments on
> its ease or difficulty?
>
> I've used I2C with processors that have built-in hardware, I've sat next
> to a guy who's bit-banged it, but I've never bit-banged it myself.
>
> I ask as a circuit designer who's going to design in an I2C chip to be
> presented to a software engineer with a smile and a "here, this ought to
> be easy!" I want to know whether I'll be honest or not.

Master-only I2C bitbanging is quite easy; I did it in a couple of hours the first time, some 15 years ago (on an HC11....). It cost me a lot longer in more recent years to grasp the perverse logic behind some built-in I2C controllers (two of them, both wasted me at least a day each IIRC).

Slave mode will probably be somewhat more complex, and you will hit some MCU-speed-imposed restrictions.

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI http://www.tgi-sci.com
------------------------------------------------------
http://www.flickr.com/photos/didi_tgi/
Reply by ●March 16, 2016
On 2016-03-16, Tim Wescott <seemywebsite@myfooter.really> wrote:
> Has anyone successfully bit-banged an I2C interface?

Several times. Sometimes it's easier to bit-bang it than it is to figure out how to get a broken I2C controller to work right. [Not to mention any names <cough-Samsung-cough>]

> Any comments on its ease or difficulty?

It can be pretty easy.

If you're doing it on bare metal with predictable timing for delay loops and without interrupts, and you don't try to implement clock-stretching, it's quite easy.

For the data signal you need both an input pin and an open-collector output pin. If you can read back the electrical level on an open-collector output pin (as opposed to the output latch value), then that's all you need for data.

If you're not doing clock-stretching, then all you need for the clock is a generic output pin (it doesn't need to be open-collector, and you don't need to read it). If you do want to implement clock-stretching, then the clock signal has the same requirements as the data signal.

If you want to try to use some sort of interrupt-driven counter/timer output or waveform generator peripheral, then it may take a bit more work. But if you want predictable timing in an interrupt-driven multi-threaded environment, that's probably what you'd need.

--
Grant Edwards               grant.b.edwards        Yow! I wish I was a
                                  at               sex-starved manicurist
                              gmail.com            found dead in the Bronx!!
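The recipe above can be sketched as a minimal polled master. This is only a sketch under the stated assumptions (bare metal, master only, no clock stretching); `sda_low()`, `sda_release()`, `sda_read()`, `scl_low()`, `scl_release()`, and `half_bit_delay()` are hypothetical hooks for whatever your GPIO layer provides, where "release" means stop pulling the open-collector line low and let the pull-up raise it:

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical GPIO hooks: placeholder names, not a real HAL.  Both lines
 * are open-collector with pull-ups, so "release" lets the line float high. */
void sda_low(void);
void sda_release(void);
bool sda_read(void);        /* electrical level of SDA, not the latch value */
void scl_low(void);
void scl_release(void);
void half_bit_delay(void);  /* ~5 us for a 100 kHz bus */

/* START: SDA falls while SCL is high. */
void i2c_start(void)
{
    sda_release(); scl_release(); half_bit_delay();
    sda_low();  half_bit_delay();
    scl_low();  half_bit_delay();
}

/* STOP: SDA rises while SCL is high. */
void i2c_stop(void)
{
    sda_low();  half_bit_delay();
    scl_release(); half_bit_delay();
    sda_release(); half_bit_delay();
}

/* Clock out one byte MSB first; returns true if the slave pulled SDA low
 * (ACK) on the ninth clock. */
bool i2c_write_byte(uint8_t b)
{
    for (int i = 7; i >= 0; i--) {
        if (b & (1u << i)) sda_release(); else sda_low();
        half_bit_delay();
        scl_release(); half_bit_delay();   /* slave samples SDA here */
        scl_low();
    }
    sda_release();                          /* hand SDA over for the ACK bit */
    half_bit_delay();
    scl_release(); half_bit_delay();
    bool ack = !sda_read();
    scl_low(); half_bit_delay();
    return ack;
}
```

The essential point, per the post above, is that the master never drives either line high: it only pulls low or releases, which is why reading back the electrical level (rather than the output latch) on SDA is the one hard requirement.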
Reply by ●March 16, 2016
On 2016-03-16, Grant Edwards <invalid@invalid.invalid> wrote:
> On 2016-03-16, Tim Wescott <seemywebsite@myfooter.really> wrote:
>
>> Has anyone successfully bit-banged an I2C interface?
>
> Several times. Sometimes it's easier to bit-bang it than it is to
> figure out how to get a broken I2C controller to work right. [Not to
> mention any names <cough-Samsung-cough>]
>
>> Any comments on its ease or difficulty?
>
> It can be pretty easy.

[I was assuming you want the bit-banged end to be the master. I've never bit-banged a slave implementation, and my gut feeling is that it would be a bit more difficult.]

--
Grant Edwards               grant.b.edwards        Yow! Oh, I get it!!
                                  at               "The BEACH goes on", huh,
                              gmail.com            SONNY??
Reply by ●March 16, 2016
On 3/16/2016 2:49 PM, Tim Wescott wrote:
> Has anyone successfully bit-banged an I2C interface? Any comments on
> its ease or difficulty?
>
> I've used I2C with processors that have built-in hardware, I've sat next
> to a guy who's bit-banged it, but I've never bit-banged it myself.
>
> I ask as a circuit designer who's going to design in an I2C chip to be
> presented to a software engineer with a smile and a "here, this ought to
> be easy!" I want to know whether I'll be honest or not.

Do you have a free interrupt for the clock pin, and do you have a free timer? I bet you can map the state machine for I2C to those two things rather than polling the interface. That said, I haven't done it. I have done this for a UART receiver: rather than sample at each clock interval, I use an interrupt on input transitions (no noise on the line, obviously) and measure the position with the timer. It's easy to figure out how many bits have elapsed since the last transition.

I2C starts with a high-to-low transition on the data line while the clock is high, and the end condition is a low-to-high transition on the data line while the clock is high. Otherwise all data transitions happen while the clock is low. So at the end of a message you will need to poll or something to find the end condition. Otherwise, interrupt on the falling edge of the clock when you are waiting for the start condition, on the rising edge of the clock when expecting data, sample the data, and Bob's your uncle.

I guess I'm assuming you are implementing the slave. I think the master is easier and can be driven completely from a timer interrupt, unless you want to implement clock stretching.

--
Rick
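The start/stop rules described above fit in a few lines of code. Here is a sketch (the names are invented for illustration, not from the post) that classifies one pair of consecutive (SCL, SDA) samples into START, STOP, data bit, or nothing:

```c
#include <stdbool.h>

typedef enum { EV_NONE, EV_START, EV_STOP, EV_BIT } i2c_event;

/* Classify one pair of consecutive (SCL, SDA) samples using the rules in
 * the post: SDA falling while SCL stays high is a START, SDA rising while
 * SCL stays high is a STOP, and SDA is sampled as a data bit on SCL's
 * rising edge. */
i2c_event i2c_classify(bool scl_prev, bool sda_prev,
                       bool scl_now,  bool sda_now, bool *bit)
{
    if (scl_prev && scl_now) {            /* SCL held high: watch SDA */
        if (sda_prev && !sda_now) return EV_START;
        if (!sda_prev && sda_now) return EV_STOP;
    }
    if (!scl_prev && scl_now) {           /* SCL rising edge: data is valid */
        *bit = sda_now;
        return EV_BIT;
    }
    return EV_NONE;                       /* SCL low: SDA may change freely */
}
```

In an interrupt-driven slave this would be called from the clock-pin and data-pin edge interrupts; as the post notes, detecting the STOP condition still needs either a data-line interrupt or polling, since no clock edge accompanies it.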
Reply by ●March 16, 2016
On Wed, 16 Mar 2016 21:03:01 +0200, Dimiter_Popoff wrote:
> On 16.3.2016 г. 20:49, Tim Wescott wrote:
>> Has anyone successfully bit-banged an I2C interface? Any comments on
>> its ease or difficulty?
>>
>> I've used I2C with processors that have built-in hardware, I've sat
>> next to a guy who's bit-banged it, but I've never bit-banged it myself.
>>
>> I ask as a circuit designer who's going to design in an I2C chip to be
>> presented to a software engineer with a smile and a "here, this ought
>> to be easy!" I want to know whether I'll be honest or not.
>
> Master-only I2C bitbanging is quite easy; I did it in a couple of hours
> the first time, some 15 years ago (on an HC11....). It cost me a lot
> longer in more recent years to grasp the perverse logic behind some
> built-in I2C controllers (two of them, both wasted me at least a day
> each IIRC).
>
> Slave mode will probably be somewhat more complex, and you will hit some
> MCU-speed-imposed restrictions.

Master is what I'm looking at. Just need an MCU to talk to a slave device.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Reply by ●March 16, 2016
On Wed, 16 Mar 2016 15:51:25 -0400, rickman wrote:
> On 3/16/2016 2:49 PM, Tim Wescott wrote:
>> Has anyone successfully bit-banged an I2C interface? Any comments on
>> its ease or difficulty?
>>
>> I've used I2C with processors that have built-in hardware, I've sat
>> next to a guy who's bit-banged it, but I've never bit-banged it myself.
>>
>> I ask as a circuit designer who's going to design in an I2C chip to be
>> presented to a software engineer with a smile and a "here, this ought
>> to be easy!" I want to know whether I'll be honest or not.
>
> Do you have a free interrupt for the clock pin, and do you have a free
> timer? I bet you can map the state machine for I2C to those two things
> rather than polling the interface. That said, I haven't done it. I
> have done this for a UART receiver: rather than sample at each clock
> interval, I use an interrupt on input transitions (no noise on the line,
> obviously) and measure the position with the timer. It's easy to figure
> out how many bits have elapsed since the last transition.
>
> I2C starts with a high-to-low transition on the data line while the
> clock is high, and the end condition is a low-to-high transition on the
> data line while the clock is high. Otherwise all data transitions
> happen while the clock is low. So at the end of a message you will need
> to poll or something to find the end condition. Otherwise, interrupt on
> the falling edge of the clock when you are waiting for the start
> condition, on the rising edge of the clock when expecting data, sample
> the data, and Bob's your uncle.
>
> I guess I'm assuming you are implementing the slave. I think the master
> is easier and can be driven completely from a timer interrupt, unless
> you want to implement clock stretching.

I'm dropping a slave chip onto the board and telling the software guy "look what I've done for you!". Mostly I want to make sure I'm not painting the guy into a corner.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
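The edge-timestamp UART trick quoted above can be sketched as a batch decoder: given the timer captures of one frame's transitions, each data bit's level is recovered by counting toggles before that bit's center. This assumes 8N1, LSB first, an idle-high line, and that `edges[0]` is the start bit's falling edge; the function name is invented for this sketch:

```c
#include <stdint.h>

/* edges[] holds the timer value at every line transition of one frame,
 * edges[0] being the start bit's falling edge (the line is low right after
 * it and toggles at each later entry).  bit_ticks is timer counts per bit. */
uint8_t uart_decode_frame(const uint32_t *edges, int n_edges,
                          uint32_t bit_ticks)
{
    uint8_t byte = 0;
    for (int bit = 0; bit < 8; bit++) {
        /* center of data bit N is (N + 1.5) bit periods after the start edge */
        uint32_t center = edges[0] + (uint32_t)(2 * bit + 3) * bit_ticks / 2;
        int level = 0, i = 1;             /* low just after the start edge */
        while (i < n_edges && edges[i] <= center) { level = !level; i++; }
        if (level)
            byte |= (uint8_t)(1u << bit); /* LSB first */
    }
    return byte;
}
```

An interrupt-driven version would do the same interval division incrementally on each edge and, as the quoted post notes for the analogous I2C end condition, needs a timeout to flush the final bits once the line returns to idle.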
Reply by ●March 16, 2016
On 3/16/2016 4:02 PM, Tim Wescott wrote:
> On Wed, 16 Mar 2016 21:03:01 +0200, Dimiter_Popoff wrote:
>> On 16.3.2016 г. 20:49, Tim Wescott wrote:
>>> Has anyone successfully bit-banged an I2C interface? Any comments on
>>> its ease or difficulty?
>>>
>>> I've used I2C with processors that have built-in hardware, I've sat
>>> next to a guy who's bit-banged it, but I've never bit-banged it myself.
>>>
>>> I ask as a circuit designer who's going to design in an I2C chip to be
>>> presented to a software engineer with a smile and a "here, this ought
>>> to be easy!" I want to know whether I'll be honest or not.
>>
>> Master-only I2C bitbanging is quite easy; I did it in a couple of hours
>> the first time, some 15 years ago (on an HC11....). It cost me a lot
>> longer in more recent years to grasp the perverse logic behind some
>> built-in I2C controllers (two of them, both wasted me at least a day
>> each IIRC).
>>
>> Slave mode will probably be somewhat more complex, and you will hit
>> some MCU-speed-imposed restrictions.
>
> Master is what I'm looking at. Just need an MCU to talk to a slave
> device.

I have trouble remembering who posted what. Did you say previously that you aren't as good with the digital stuff as you are with the analog? Would you care for any help designing the I2C interface? I'm pretty free at this point.

--
Rick
Reply by ●March 16, 2016
On Wed, 16 Mar 2016 19:06:30 +0000, Grant Edwards wrote:
> On 2016-03-16, Tim Wescott <seemywebsite@myfooter.really> wrote:
>
>> Has anyone successfully bit-banged an I2C interface?
>
> Several times. Sometimes it's easier to bit-bang it than it is to
> figure out how to get a broken I2C controller to work right. [Not to
> mention any names <cough-Samsung-cough>]
>
>> Any comments on its ease or difficulty?
>
> It can be pretty easy.
>
> If you're doing it on bare metal with predictable timing for delay loops
> and without interrupts, and you don't try to implement clock-stretching,
> it's quite easy.
>
> For the data signal you need both an input pin and an open-collector
> output pin. If you can read back the electrical level on an
> open-collector output pin (as opposed to the output latch value), then
> that's all you need for data.
>
> If you're not doing clock-stretching, then all you need for the clock is
> a generic output pin (it doesn't need to be open-collector, and you
> don't need to read it). If you do want to implement clock-stretching,
> then the clock signal has the same requirements as the data signal.
>
> If you want to try to use some sort of interrupt-driven counter/timer
> output or waveform generator peripheral, then it may take a bit more
> work. But if you want predictable timing in an interrupt-driven
> multi-threaded environment, that's probably what you'd need.

Actually, the project where I was a spectator to the process tried to use a Philips I2C interface chip that had a really screwy microprocessor interface -- it was supposed to autodetect an 8080-style interface, but when connected to an 80186 all it managed to do was lock up. So, we bit-banged.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Reply by ●March 16, 2016
On Wed, 16 Mar 2016 16:06:01 -0400, rickman wrote:
> On 3/16/2016 4:02 PM, Tim Wescott wrote:
>> On Wed, 16 Mar 2016 21:03:01 +0200, Dimiter_Popoff wrote:
>>> On 16.3.2016 г. 20:49, Tim Wescott wrote:
>>>> Has anyone successfully bit-banged an I2C interface? Any comments on
>>>> its ease or difficulty?
>>>>
>>>> I've used I2C with processors that have built-in hardware, I've sat
>>>> next to a guy who's bit-banged it, but I've never bit-banged it
>>>> myself.
>>>>
>>>> I ask as a circuit designer who's going to design in an I2C chip to
>>>> be presented to a software engineer with a smile and a "here, this
>>>> ought to be easy!" I want to know whether I'll be honest or not.
>>>
>>> Master-only I2C bitbanging is quite easy; I did it in a couple of
>>> hours the first time, some 15 years ago (on an HC11....). It cost me
>>> a lot longer in more recent years to grasp the perverse logic behind
>>> some built-in I2C controllers (two of them, both wasted me at least a
>>> day each IIRC).
>>>
>>> Slave mode will probably be somewhat more complex, and you will hit
>>> some MCU-speed-imposed restrictions.
>>
>> Master is what I'm looking at. Just need an MCU to talk to a slave
>> device.
>
> I have trouble remembering who posted what. Did you say previously that
> you aren't as good with the digital stuff as you are with the analog?
> Would you care for any help designing the I2C interface? I'm pretty
> free at this point.

I can do stuff at that level equally well in analog and digital. It's the efficient design of fast FPGA stuff that I can't do. Basically I can write Verilog (or VHDL) code that's good enough to prove that something can be done, yet still atrocious enough to provoke someone like you to say "here, Tim, let me do that for you...".

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com