## Tuesday, 14 October 2008

### I2C on an AVR using bit banging

As an exercise I tried to talk to an I2C temperature sensor using bit banging. It was not as easy as I thought, so I decided to post the code in case anyone needs to see a working solution. If you happen to use my code, drop me a line, since that will encourage me to post more code :-)

```
#include <avr/io.h>

// Port for the I2C
#define I2C_DDR DDRD
#define I2C_PIN PIND
#define I2C_PORT PORTD

// Pins to be used in the bit banging
#define I2C_CLK 0
#define I2C_DAT 1

// delay(1) is assumed to be a short busy-wait (for example a wrapper
// around _delay_us() from <util/delay.h>) defined elsewhere.

// The lines are driven open-drain style: "high" releases the pin
// (input with pull-up), "low" actively drives it to ground.
#define I2C_DATA_HI()\
    I2C_DDR &= ~(1 << I2C_DAT);\
    I2C_PORT |= (1 << I2C_DAT);
#define I2C_DATA_LO()\
    I2C_DDR |= (1 << I2C_DAT);\
    I2C_PORT &= ~(1 << I2C_DAT);

#define I2C_CLOCK_HI()\
    I2C_DDR &= ~(1 << I2C_CLK);\
    I2C_PORT |= (1 << I2C_CLK);
#define I2C_CLOCK_LO()\
    I2C_DDR |= (1 << I2C_CLK);\
    I2C_PORT &= ~(1 << I2C_CLK);

void I2C_WriteBit(unsigned char c)
{
    if (c > 0)
    {
        I2C_DATA_HI();
    }
    else
    {
        I2C_DATA_LO();
    }

    I2C_CLOCK_HI();
    delay(1);

    I2C_CLOCK_LO();
    delay(1);

    if (c > 0)
    {
        I2C_DATA_LO();
    }

    delay(1);
}

// read a bit from the I2C slave device
//
unsigned char I2C_ReadBit(void)
{
    I2C_DATA_HI();

    I2C_CLOCK_HI();
    delay(1);

    unsigned char c = I2C_PIN;

    I2C_CLOCK_LO();
    delay(1);

    return (c >> I2C_DAT) & 1;
}

// Inits bitbanging port, must be called before using the functions below
//
void I2C_Init(void)
{
    I2C_PORT &= ~((1 << I2C_DAT) | (1 << I2C_CLK));

    I2C_CLOCK_HI();
    I2C_DATA_HI();

    delay(1);
}

// Send a START Condition
//
void I2C_Start(void)
{
    // set both to high at the same time
    I2C_DDR &= ~((1 << I2C_DAT) | (1 << I2C_CLK));
    delay(1);

    I2C_DATA_LO();
    delay(1);

    I2C_CLOCK_LO();
    delay(1);
}

// Send a STOP Condition
//
void I2C_Stop(void)
{
    I2C_CLOCK_HI();
    delay(1);

    I2C_DATA_HI();
    delay(1);
}

// write a byte to the I2C slave device
//
unsigned char I2C_Write(unsigned char c)
{
    for (char i = 0; i < 8; i++)
    {
        I2C_WriteBit(c & 128);

        c <<= 1;
    }

    //return I2C_ReadBit();
    return 0;
}

// read a byte from the I2C slave device
//
unsigned char I2C_Read(unsigned char ack)
{
    unsigned char res = 0;

    for (char i = 0; i < 8; i++)
    {
        res <<= 1;
        res |= I2C_ReadBit();
    }

    if (ack > 0)
    {
        I2C_WriteBit(0);   // ACK: more bytes to come
    }
    else
    {
        I2C_WriteBit(1);   // NACK: last byte
    }

    delay(1);

    return res;
}
```

1. hi raul! i think i'm gonna give your code a try. by the way, what device are you driving and what is your system clock? (for timing references)
do you have external pullup resistors?

thanks!

2. Damned handy example. Thank you kindly for it.

3. Hey, this is exactly what I need, I'll give it a try, thanks a bunch.

4. Code is working fine, I checked it with an ATmega16 at 12 MHz...

5. Awesome Toshu, thanks for sharing!

The ATmega16 is the slave?

6. Thx a lot. This is exactly what I want.

7. Nice code! Tested using an ATtiny25 & LCD module & DS18B20 temperature sensor & the internal ATtiny sensor!

Thank you a lot!!
Marek (Czech Rep.)

9. Thanks for the code example. Can you tell me why you are using DDRD when you want to set the port high or low? It seems backwards to me, like you would want PORTD, right?

For example, I'd think the only time you need to set the DDR (data direction) would be during init, and from then on it's always the same. Then when you send data you'd use the PORTD register, but instead you use DDRD to send data?

I'm just ramping up on AVRs, so I might be confused. Would appreciate any clarification, thanks!

10. Thanks, dude :-)
I am going to refer to the same.

Is it possible to share the datasheet of your microcontroller?

11. Hi Raul, thank you for sharing. I want to use your code as a base for a I2C lib for the http://www.smarthomatic.org project. What kind of license do you want to apply to your code? GPL V3 would be great.
Thanks,
rr2000

1. Hi rr2000,

I'm happy you find my lib useful. GPL v3 sounds good to me if that helps you; feel free to add a reference to this article and give me credit :)
Good luck and all the best!

This code in I2C_WriteBit() seems to be preparing for a stop bit. You don't need it unless the next action is a stop, so it belongs at the beginning of the I2C_Stop() routine, not in I2C_WriteBit():

    if (c > 0)
    {
        I2C_DATA_LO();
    }
    delay(1);

(This isn't actually wrong, really, but it's an unnecessary step that makes the code a little harder to understand, IMO.)

However, there is something legitimately wrong with this code.
In both I2C_WriteBit() and I2C_ReadBit(), you must wait for clock to be high before moving on. If you read the I2C specification more closely, both the master and the slave(s) can control the clock line. When a slave controls it, it is referred to as "clock stretching" and is used to slow a master that is going too fast to keep up with. If you don't take clock stretching into account, some devices will not function well with this routine.
Therefore, in both functions,

    I2C_CLOCK_HI();
    delay(1);

must change to:

    I2C_CLOCK_HI();
    while ((I2C_PIN & (1 << I2C_CLK)) == 0);
    delay(1);

for reliability's sake.

Also, I don't know why "return I2C_ReadBit();" is commented out in I2C_Write(). That is the correct thing to do when writing a byte. Maybe you were running into problems with the ACK routine because clock stretching wasn't taken into account above?
Anyway, my apologies for what might be perceived as criticisms -- I think this is a very handy routine, and should do the trick in a number of cases. Just thought that with a few small tweaks, it might be even better and much more robust.
Thanks for posting it!