
Problem of using LPC2138 to SPI interface with DAC LTC1451

Started by ntchien2013 June 6, 2013
I am trying to use an LPC2138 to control an LTC1451 (DAC) over SPI. I am using Keil MDK-ARM uVision4.

LTC1451: 12-bit, CPHA=0, CPOL=0. While CS is low, the CLK signal is enabled so data can be clocked in. When CS is pulled high, the data is loaded from the shift register into the DAC register. The MSB is loaded first.

My code is below. However, when I simulate it in Proteus, the output does not match the value I write:
- It seems I am only transferring 8 bits even though I set a 12-bit transfer in S0SPCR. For example, changing the value from 0x3FF to 0x5FF does not change the output; Vout stays at 4.09 V.
- It seems I am sending LSB first even though I configured MSB first in S0SPCR.
- I have to send the data twice to get it to somehow work!
- When value = 0x0F the output is 0.24 V, but when value = 0x10 the output is 2.06 V!

Does anybody know what the problem is? Is it a problem with the code or with the simulation software?
Any suggestion would be very much appreciated. I am stuck!

Code:


> I am trying to use an LPC2138 to control an LTC1451 (DAC) over SPI. I am using Keil MDK-ARM uVision4.

Looking at the timing diagram in the datasheet, I don't see how you are raising the CS' signal while the last clock is still high.

I would want to study the timing a bit more because I don't see how you can use the internal SPI gadget to serve the 1451. It looks to me like you would have to bit-bang the pins. However, I haven't spent nearly enough time looking at the actual timing specs.

Try to write a little code that directly controls the data, clock and cs' pins.

I have never used the 1451 so I have no idea if I am correctly reading the timing diagram.

Richard
Dear Richard,

Thank you for your reply. I think it should be possible to use the SPI peripheral to control the LTC1451. In the code, I first pull the CS pin (P0.10) low, then write to the data register (S0SPDR). When the transfer is done, I raise the CS pin high to latch the data into the DAC register of the 1451.
I have previously bit-banged this with the GPIO pins of an NI card and it worked fine. I thought SPI with the LPC would be easier!
It could be that you are right. However, your code doesn't work so maybe not...

If you actually have to raise the CS' signal during the last clock, as it shows in the diagram, there is no way you can do it with the SPI gadget. The timing diagram is not what I would expect from a true SPI device.

I would put the write operation in a continuous loop and look at it on a scope or logic analyzer.

I would still recommend bit-banging the signals just to see if it works. It would only take a few minutes to write the loop. I'm pretty sure the LPC2138 is slow enough on GPIO that you won't violate the timing constraints even if you don't use any delays. Or, slow down the peripheral clock for testing purposes.

First you have to get it to work. Then you can try to make it pretty.

Richard
You've set up the control register to send LSB first. I've looked at the LTC1451 data sheet and SPI mode 0 should work. The only requirement is that the clock should be low when CS is brought low. There is no requirement to raise the CS when the clock is high.

Jeff
Dear Richard and Jeff,

Thank you for your comments. I will try GPIO first to see if it works (I did this on the real device with a National Instruments card and it worked). Then I will try again with the LPC2138.
@Jeff: I just played around with the LSB-first/MSB-first setting, but the result seems to be the opposite of what I set!
Dear all,

I wrote the bit-bang program below and it works fine: 12 bits, MSB first, and there was no need to raise CS on the last clock as the timing diagram in the LTC1451 datasheet suggests. The question, however, remains: why does it not work with the SPI peripheral of the LPC2138? Should I use the SSP instead?
code:

> I wrote the bit-bang program below and it works fine: 12 bits, MSB first, and there was no need to raise CS on the last clock as the timing diagram in the LTC1451 datasheet suggests. The question, however, remains: why does it not work with the SPI peripheral of the LPC2138? Should I use the SSP instead?

Back to your original code:



I usually configure the SSEL pin as GPIO, set it as an output, and use it for CS', but it's your choice.



Otherwise, your code is similar to mine (for a different project):

As a side issue, I would be very careful about using:

IODIR0 = 0x400;

This will destroy any previous settings. It will probably become important in larger programs.

Use something like:

IODIR0 |= 0x400;

Richard
Dear Richard,

Thank you for your code. I wonder whether "spi.h" matters in the code that you posted.
I tested my code with the SPI Debugger and found that the LPC2138 only transfers 8-bit data even though I set it to transfer 12 bits! If your code already works fine on real hardware, I think this might be a problem with the simulation software. Thank you very much for your help.
