
Why should I (not) use an internal oscillator for 8-bit micros

Started by Schwob August 14, 2004
Neil Bradley wrote:
> >> Well, we're mincing words. I'm not saying that a UART is synchronous.
> >> I'm saying that for the duration of the byte being transmitted, they need
> >> to, for all intents and purposes, be synchronized, or damned close to it.
>
> No, by definition they are not.
This discussion is confusing the general term 'synchronized' with the comms
term 'synchronous'. They are not interchangeable in this context.

Synchronous comms is where the baud rate is clocked using a signal transmitted
in parallel to the data stream; the receiver uses this signal to sample the
incoming bits. Start & Stop bits are not used, and it often uses a packet
format with leading "sync" bytes and a trailing checksum instead of parity
bits. Receiver clock speed is irrelevant, so long as it can sample once per
cycle on the incoming clock. (TX and RX baud rates are synchronized via the
clock in the data stream, so the rate can drift without harm.)

Asynchronous comms uses no inline clock, and depends on the TX and RX to
operate near the same baud rate, i.e., using the same I/O sampling frequency.
Bytes are "clocked" individually, and the RX detects the byte via the Start
bit that's prefixed. (TX and RX baud rates are synchronized independently from
the data stream, and they'd better be close.)

With async comms, a bad reference clock (MCU oscillator) causes sampling to
miss entire pulses (or start counting them twice, depending on who's running
fast). IIRC, TX and RX baud rates have to be within 5-6% of each other or the
last pulse in a byte starts getting sampled wrong (regardless of the baud
rate).
"Jim Granville" <no.spam@designtools.co.nz> wrote in message 
news:6CXTc.3400$zS6.403349@news02.tsnz.net...
>> I never said it was. The whole point which everyone continues to ignore
>> is that if you have baud clocks between two devices, they need to be
>> pretty damned close in terms of tolerance (my experience says 1% across
>> the board if you want to be compatible with everyone), otherwise the
>> synchronous
>
> "pretty damned close" is a rather loose argument for "synchronous".
> If you replace your meaning of "synchronous" with everyone else's meaning
> of "sampling", then the logic follows better.
Well, of course we all have slightly different definitions of what words mean.
One person's "sloppy timing" is another's "tight timing". So, to be a bit more
specific:

2400bps +/- 1% = 2376-2424bps
9600bps +/- 1% = 9504-9696bps
115200bps +/- 1% = 114048-116352bps

So I'd say that 1% is "pretty damned close" in this case. ;-)

It's also fair to say that "we've synchronized our watches". Of course it's
not synchronized down to the millisecond. That's all I was saying. For that
brief moment in time when a byte is transferred, both ends are operating
independently but are synchronized, not locked together.
>> aspect (the data bits between the start and stop bits) of the
>> asynchronous communication will most likely not work correctly, with a
>> decreasing chance they'll work the lower the baud rate goes.
>
> OK, I'll bite: Why/how does the absolute baud rate affect the % of clock
> skew that can be tolerated?
> If the data frame size is constant/unchanged, then the sampling time skew
> that can be tolerated is a fixed percentage of that time?
I'm having a tough time explaining it in a fashion that will be globally
understandable, but here goes. The short answer is that the errors are
cumulative.

Both devices start their respective internal baud clocking at the rising or
falling edge of the start bit. That is a point where the "synchronization"
(hehehehe) occurs. Most UARTs run off some sort of timer/counter/divisor
arrangement, as does the 16550 or even the 8051. The standard clock feeding a
PC's 16550 (or equivalent thereof these days) is a 1.8432 MHz part. There is a
/16 prescaler in the 16550, followed by a programmable baud rate divisor
beyond that. So 1.8432 MHz / 16 = 115200bps internal clock (that will help
minimize jitter). Let's divide it down even further - 2400bps. That'd be a
divisor of 48 (115200/48 = 2400). That assumes a perfectly divisible clock.

For the sake of argument, let's turn that into a 2 MHz crystal (roughly 8%
difference from the 1.8432 MHz clock). Let's say that the sender is using a
2 MHz crystal and the receiver is using a 1.8432 MHz crystal - both with a
divisor of 1, each aiming for 115200bps. The 2 MHz fed 16550 will yield
125000bps and the 1.8432 MHz fed 16550 will yield 115200bps. At that rate,
there are 1.08506944 cycles of the sender for every 1 cycle of the receiver.
Multiply that out over 8 bits and you're at 8.680555552 cycles on the sender
and 8 on the receiver - roughly a 7% error after 8 bits, but still within the
realm of possibility in terms of sampling, since the sender hasn't blasted
past the receiver by more than one full bit.

Okay, now let's take that down to 2400bps. The 2 MHz fed 16550 yields 2604bps.
The 1.8432 MHz receiver yields exactly 2400bps. Remember we have a divisor of
48 to get 2400bps, so we need to multiply everything by 48: 1.08506944 * 48 =
52.083333312 vs 48 - you're off by over 4 cycles, so anything past the first 4
bits of data is guaranteed to be sampled at the wrong time, ending up as
incorrect data. Okay, my head hurts now. ;-)

I wound up running against this with a really poor crystal on a prototype 8051
weather station controller. I had already used the serial port for PC
communications, so I had to do an interrupt driven/bit banged UART using INT0
(if you want the code I'll post it). I had a windspeed/direction PIC I was
interfacing with that had a 20% tolerance on the internal clock. It
communicated @ 1200 baud, so my opportunity for sampling was smaller than with
a reasonable crystal.

-->Neil
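[As a rough check of the divisor arithmetic above, here is a small C sketch.
It only uses the standard 16550 relationship (baud = clock / 16 / divisor);
the helper name and the printout are just for illustration.]

#include <stdio.h>

/* Effective baud rate of a 16550-style UART: input clock / 16 / divisor. */
static double baud_16550(double clock_hz, unsigned divisor)
{
    return clock_hz / 16.0 / (double)divisor;
}

int main(void)
{
    /* Divisor 1: 2.0000 MHz sender vs. 1.8432 MHz receiver. */
    double tx = baud_16550(2000000.0, 1);    /* -> 125000 baud */
    double rx = baud_16550(1843200.0, 1);    /* -> 115200 baud */
    printf("divisor 1:  tx=%.0f rx=%.0f ratio=%.8f\n", tx, rx, tx / rx);

    /* Divisor 48 (2400 baud nominal): the absolute rates scale down,
       but the ratio between the two ends stays the same ~1.085. */
    tx = baud_16550(2000000.0, 48);          /* -> ~2604 baud */
    rx = baud_16550(1843200.0, 48);          /* -> 2400 baud  */
    printf("divisor 48: tx=%.0f rx=%.0f ratio=%.8f\n", tx, rx, tx / rx);

    return 0;
}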
"Neil Bradley" <nb_no_spam@synthcom.com> wrote in message
news:10hvv8scv52i489@corp.supernews.com...
> "Doug Dotson" <dougdotson@NOSPAMcablespeed.NOSPAMcom> wrote in message > news:ismdnTKuDKd_T4LcRVn-hQ@cablespeedmd.com... > >I believe that UART stands for "Universal ASYNCRONOUS Receiver > > Transmitter". You need to go back and study the difference between > > sync and async. > > Nope, I understand the concept perfectly. When using a UART, it's required > that both sides of the serial transmission be synchronized. If you don't > believe me, try using a crystal at a low baud rate with a 20% tolerance. > Devices won't be able to talk to it. When the byte comes is the
asynchronous
> part, and that wasn't even the topic being discussed. >
The phrase "when you are in a hole, stop digging" springs to mind... In the context of serial communications, "synchronous" means there is a clock line (or an "embedded clock" in the data signal) going between the sender and receiver, while "asynchronous" means there is no directly shared clock, so each side uses its own time reference to clock the transfers. A "clock" signal does not have to be regular - for synchronous transfers, it can vary as much as you want. For asynchronous transfers, you normally have a regular clock - while you could theoreticly have a varying one, it would be a challange to implement reliably. But if the clocks on the two sides are not joined directly to synchronize their clocks, they are not synchronous - it doesn't matter if their speeds match at 0.00001% accuracy. For standard uart asynchronous communication, the limit for communication over a good line (noise-free, and nice sharp edges) is a 5% match between sender and receiver, so that by the 10th bit they are no more than 50% of a bit in difference. Normally, that means ensuring that each side is within 2.5% of the nominal baud rate. The absolute baud rate does not matter.
> The question was in reference to the baud rate generating clock, not when
> the data comes in. For the period of the byte transmission, both sides must
> be synchronized. There is no common clock between them. If you have separate
> clock and data lines, the clock can vary wildly with no adverse effect on
> communication. No synchronization between devices needed. Is this a hard
> concept to grasp?
>
> You do know that the words synchronous and asynchronous can mean different
> things depending on the context, right?
>
> -->Neil
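[A minimal numeric check of the budget described above, assuming the usual
10-bit frame with the last sample taken nominally 9.5 bit times after the
start edge. The names are illustrative, not from any real UART library.]

#include <stdio.h>

/* Accumulated sampling error at the stop bit, as a fraction of one bit time,
   for a frame sampled 9.5 bit times after the start edge.  'mismatch' is the
   relative difference between the two baud clocks (0.05 = 5%).  Note that
   the absolute baud rate never appears. */
static double stop_bit_error(double mismatch)
{
    return 9.5 * mismatch;
}

int main(void)
{
    const double mismatch[] = { 0.01, 0.025, 0.05, 0.06 };
    for (int i = 0; i < 4; i++) {
        double err = stop_bit_error(mismatch[i]);
        printf("%4.1f%% clock mismatch -> %4.1f%% of a bit at the stop bit (%s)\n",
               100.0 * mismatch[i], 100.0 * err,
               err < 0.5 ? "within half a bit" : "sampled in the wrong bit");
    }
    return 0;
}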
"Doug Dotson" <dougdotson@NOSPAMcablespeed.NOSPAMcom> wrote in message
news:u9adnR6BVMpmtr3cRVn-hw@cablespeedmd.com...
> Comments below.
> Doug
>
> > "Neil Bradley" <nb_no_spam@synthcom.com> wrote in message
> > news:10i09iglmsfuv62@corp.supernews.com...
> > "Casey" <cclremovethispart@cox.net> wrote in message
> > news:JoUTc.21115$pT5.3217@lakeread05...
> > > Neil Bradley said...
> > >> "Doug Dotson" <dougdotson@NOSPAMcablespeed.NOSPAMcom> wrote in message
> > >> Nope, I understand the concept perfectly. When using a UART, it's
> > >> required that both sides of the serial transmission be synchronized.
> > > Both sides are NOT synchronized - that's the essence of async
> > > communications.
> >
> > They are for the duration of the byte being transmitted. That's what the
> > start bits are for - to synchronize both ends for the duration of the byte!
>
> Correct. But that does not make it synchronous communications. The fact
> that the difference in tx and rx clock is a factor makes it async.
The sender and the receiver in uart communications are *never* synchronized.
We are not talking about the general usage of a common word here (as in
"let's synchronize our watches") - the term "synchronous" has a very specific
meaning in the world of electronics, especially for communications. It means
there is a shared clock. Two devices can't be "synchronized for a while"
(unless you are switching the clock signal on and off, etc.). They can't be
"closely synchronized" - they are either synchronized, or not.

When a uart receiver notices a start bit, it does not become synchronized to
the sender - it synchronizes its state machine (using its single internal
clock) to the start bit as sampled by its own clock. This is going to happen
at roughly the same time as the sender transmits the start bit, but not
exactly - it depends on transmission delays, over-sampling rates at the
receiver, and so on. I.e., the sender and receiver are not synchronized.
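[For what it's worth, here is a sketch of what "synchronizes its state machine
to the start bit as sampled by its own clock" can look like in a 16x
oversampled software receiver. It is a simplified model only - rx_pin() is a
stand-in for the real pin read, and framing-error and mid-start-bit checks
are omitted.]

#include <stdint.h>

/* Stand-in for reading the RX line level (1 = idle/high, 0 = low). */
static int rx_pin(void)
{
    return 1;
}

/* Call once per tick of the receiver's own 16x baud clock.  Returns the
   received byte when a frame completes, or -1 otherwise. */
static int uart_rx_tick(void)
{
    static enum { IDLE, DATA } state = IDLE;
    static uint8_t ticks, bit;
    static uint8_t byte;

    switch (state) {
    case IDLE:
        if (rx_pin() == 0) {   /* falling edge of the start bit: restart */
            ticks = 0;         /* OUR tick counter - this is the only    */
            bit = 0;           /* "synchronization" that ever happens    */
            byte = 0;
            state = DATA;
        }
        break;

    case DATA:
        /* First sample 24 ticks (1.5 bit times) after the edge, then every
           16 ticks - i.e. near the middle of each data bit as measured by
           the RECEIVER's clock, which may be a few percent off the sender's. */
        if (++ticks == 24) {
            ticks = 8;                          /* next sample 16 ticks on */
            byte |= (uint8_t)(rx_pin() << bit);
            if (++bit == 8) {
                state = IDLE;                   /* stop bit not checked    */
                return byte;
            }
        }
        break;
    }
    return -1;
}

int main(void)
{
    return uart_rx_tick() == -1 ? 0 : 1;   /* idle line: nothing received */
}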
> > >> If you don't
> > >> believe me, try using a crystal at a low baud rate with a 20% tolerance.
> > > No, but the two ends can use clocks that are 5% off from each other and
> > > still communicate successfully.
> >
> > It depends upon the baud rate. The lower the baud rate the more susceptible
> > it is to being
>
> Correct. All properties of async coms.
This is a myth - a commonly believed myth, but a myth nonetheless. There are
definitely factors that make serial communication harder at higher speeds,
such as rounded edges, noise, cross-talk, etc., and closer clock matching
might help marginally, but the main effect of the clock speed matching is a
relative effect - a 5% error means a 50% *bit time* error after 10 bits,
regardless of the length of a bit time.

Think about the sort of speeds async communication runs at. Suppose we say 5%
is required for 9600 baud. Does that mean we can run 600 baud at 40% error? Or
do we need 0.005% to run Profibus at 12 MBit? To say nothing of the 0.5 ppm
accuracy crystals needed for gigabit ethernet...

It reminds me somewhat of when my father was once buying beer at an
off-license. They were having a special offer of 5% discount for a crate of 12
bottles. On seeing this, my father said he would get two crates, and the shop
assistant gave him 5% off for the first crate, then another 5% off for the
second crate, giving him a 10% discount altogether. He was tempted to buy
more, but didn't want to push his luck...
> > >> The question was in reference to the baud rate generating clock, not
> > >> when the data comes in. For the period of the byte transmission, both
> > >> sides must be synchronized.
>
> > > Incorrect.
>
> NO, they are not synchronized. That is why the drift of sampling points
> becomes a problem.
>
> > Completely correct. Like I said, FOR THE PERIOD OF THE BYTE transmission
> > they need to be synchronized (or closely enough in time with each other)
> > in order to receive the byte successfully.
>
> No, the drift has to be within acceptable bounds. That is not the same as
> being synchronized.
>
> > >> If you have separate clock and data lines, the clock can vary wildly
> > >> with no adverse effect on communication. No synchronization between
> > >> devices needed. Is this a hard concept to grasp?
>
> The fact that the tx and rx ends are using the same clock is the definition
> of synchronous coms.
>
> > > The clock can't vary - the two ends have to use the same
> > > "synchronized" clock.
>
> > Well, we're mincing words. I'm not saying that a UART is synchronous. I'm
> > saying that for the duration of the byte being transmitted, they need to be,
> > for all intents and purposes, synchronized, or damned close to it.
>
> No, by definition they are not. They just have to have the clock error
> between tx and rx within limits. This is not synchronized. If it were, then
> once the start bit was detected, any difference between tx and rx wouldn't
> matter since they would be "synchronized". This is not the case with async
> comms.
>
> > -->Neil
On Mon, 16 Aug 2004 09:08:13 +0200, "David Brown"
<david@no.westcontrol.spam.com> wrote:

> >"Neil Bradley" <nb_no_spam@synthcom.com> wrote in message >news:10hvv8scv52i489@corp.supernews.com... >> "Doug Dotson" <dougdotson@NOSPAMcablespeed.NOSPAMcom> wrote in message >> news:ismdnTKuDKd_T4LcRVn-hQ@cablespeedmd.com... >> >I believe that UART stands for "Universal ASYNCRONOUS Receiver >> > Transmitter". You need to go back and study the difference between >> > sync and async. >> >> Nope, I understand the concept perfectly. When using a UART, it's required >> that both sides of the serial transmission be synchronized. If you don't >> believe me, try using a crystal at a low baud rate with a 20% tolerance. >> Devices won't be able to talk to it. When the byte comes is the >asynchronous >> part, and that wasn't even the topic being discussed. >> > >The phrase "when you are in a hole, stop digging" springs to mind... > >In the context of serial communications, "synchronous" means there is a >clock line (or an "embedded clock" in the data signal) going between the >sender and receiver, while "asynchronous" means there is no directly shared >clock, so each side uses its own time reference to clock the transfers. A >"clock" signal does not have to be regular - for synchronous transfers, it >can vary as much as you want. For asynchronous transfers, you normally have >a regular clock - while you could theoreticly have a varying one, it would >be a challange to implement reliably. But if the clocks on the two sides >are not joined directly to synchronize their clocks, they are not >synchronous - it doesn't matter if their speeds match at 0.00001% accuracy. > >For standard uart asynchronous communication, the limit for communication >over a good line (noise-free, and nice sharp edges) is a 5% match between >sender and receiver, so that by the 10th bit they are no more than 50% of a >bit in difference. Normally, that means ensuring that each side is within >2.5% of the nominal baud rate. The absolute baud rate does not matter. > > > > > >> The question was in reference to the baud rate generating clock, not when >> the data comes in. For the period of the byte transmission, both sides >must >> be synchronized. There is no common clock between them. If you have >separate >> clock and data lines, the clock can vary wildly with no adverse effect on >> communication. No synchronization between devices needed. Is this a hard >> concept to grasp? >> >> You do know that the words synchronous and asynchronous can mean different >> things depending on the context, right? >> >> -->Neil >> >> >
It's no wonder that there are so many framing errors on Async comms when some
people don't understand the basic concepts of Async and Sync comms.

Sync transmits the clock either directly (on a clock line) or indirectly
(embedded in the data) so that the receiver knows when to look at the data
bits.

Async uses the "start" bit of each byte to tell the receiver to start timing
to look at the bits of this one byte only.

The term Asynchronous here means that the sender can send data at any time
without having to worry about whether or not the receiver is in sync.

How many designers try to send or receive at, say, 5% tolerance instead of
aiming for as close to 0% as possible? Makes for interesting interfacing when
one unit is 5% high and the other 5% low because they've been designed by
"prima donnas" who know better than the rest of the world.

As for crystal versus baud rate tolerances - if the crystal is 1% low then the
baud rate is 1% low. Simple as that - always allowing for the fact that you
are using the "correct" frequency crystal to start with and allowing for any
division errors in the baud rate generator/clock.

Alan

++++++++++++++++++++++++++++++++++++++++++
Jenal Communications
Manufacturers and Suppliers of HF Selcall
P O Box 1108, Morley, WA, 6943
Tel: +61 8 9370 5533  Fax +61 8 9467 6146
Web Site: http://www.jenal.com
e-mail: http://www.jenal.com/?p=1
++++++++++++++++++++++++++++++++++++++++++
>> > It depends upon the baud rate. The lower the baud rate the more
>> > susceptible it is to being
>> Correct. All properties of async coms.
> This is a myth - a commonly believed myth, but a myth nonetheless.
No, you're flatly wrong about that.
> There are definitely factors that make serial communication harder at higher
> speeds, such as rounded edges, noise, cross-talk, etc., and closer clock
> matching might help marginally, but the main effect of the clock speed
> matching is a relative effect - a 5% error means a 50% *bit time* error
Um... well, originally the question was the tolerance of the *CRYSTAL* inside of microprocessors, not the "bit time" error: "Overall the Philips LPC900 oscillators are specified with +/- 2.5% over temp range and voltage range"
> after 10 bits, regardless of the length of a bit time.
But the error is *ADDITIVE* on a per-bit basis until the next start bit,
though. I actually have working experience with this one. We had to swap
crystals because lower baud rates wouldn't work with our embedded system with
standard PCs because the lower divisors were outside tolerable ranges. Yup, it
got worse as the baud rate got lower. See my other response for a working
example.
> Think about the sort of speeds async communication runs at. Suppose we say
> 5% is required for 9600 baud. Does that mean we can run 600 baud at 40%
> error? Or do we need 0.005% to run Profibus at 12 MBit? To say nothing of
> the 0.5 ppm accuracy crystals needed for gigabit ethernet...
You're not taking into account that asynchronous communication *HAS* a byte
synchronization method - a start bit!

-->Neil
"Neil Bradley" <nb_no_spam@synthcom.com> wrote in message
news:10i0qfpscuoanef@corp.supernews.com...
> >> > It depends upon the baud rate. The lower the baud rate the more
> >> > susceptible it is to being
> >> Correct. All properties of async coms.
> > This is a myth - a commonly believed myth, but a myth nonetheless.
>
> No, you're flatly wrong about that.
>
> > There are definitely factors that make serial communication harder at
> > higher speeds, such as rounded edges, noise, cross-talk, etc., and closer
> > clock matching might help marginally, but the main effect of the clock
> > speed matching is a relative effect - a 5% error means a 50% *bit time*
> > error
>
> Um... well, originally the question was the tolerance of the *CRYSTAL*
> inside of microprocessors, not the "bit time" error:
What difference does that make? If an oscillator is 2% out, then the bit time
on a uart based on that oscillator is 2% out. That's how "relative error"
works. Here's an example:

Suppose you have two microcontrollers, each with an oscillator (of any sort)
running at a nominal 9.8304 MHz, and each with a uart running at a nominal
9600 baud (the divisor is therefore 1024, ignoring the typical x16
oversampling for now). Suppose micro1's oscillator is 2.5% too slow, while
micro2's is 2.5% too fast. Then micro1 will run at 9.584640 MHz, and generate
a baud of 9360, while micro2 will run at 10.076160 MHz and a baud of 9840. You
will note that 9360 is 2.5% slower than 9600, and 9840 is 2.5% faster than
9600.

Now, suppose micro1 sends out a standard 10-bit uart frame to micro2. Micro2
starts up its receive state machine on encountering the start bit, and aims to
sample the first bit after what it believes is 1.5 bit times. In reality,
micro2 samples at 0.1524 ms after the start bit, which is sooner than the
middle of the first bit sent by micro1 (at 0.1602 ms). The absolute time
difference here is 0.0078 ms, which is well within a bit time, so the receiver
samples fine. But by the stop bit, sampled at 9.5 bit times after the start
bit's falling edge, things are different. Micro2 samples at 0.9654 ms, which
is quite a bit out from when micro1 sends the middle of the stop bit at
1.0149 ms. In fact, amazingly enough, it is out 47.5% of a bit time (9.5 times
the 5% error - my original "50%" had rounded up the half bit time). Micro1's
final transition from the last data bit to the stop bit occurs at 0.9615 ms -
just before micro2 samples it. Assuming that the line is noiseless and has
sharp transitions with equal delays on rising and falling edges, this is
fine - the communication will work with up to, but not beyond, a 50% bit time
error at the end of the frame.

Now, just for a laugh, go through that example again and replace "MHz" with
"GHz", "baud" with "kbaud", and "ms" with "us". Lo and behold, the same
calculations hold and show that a 5% difference between the sender and
receiver clocks is just on the edge of what can work, regardless of the
absolute speeds. Now do you understand?

Incidentally, there are no microprocessors (that I've ever heard of) with an
internal crystal - internal oscillators come in various types (VCO, R-C, ring
oscillators, etc.), but crystals are always external to the microprocessor.
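[These figures are easy to reproduce. The C sketch below just re-runs the
arithmetic above (nominal 9.8304 MHz, divisor 1024, one end 2.5% slow and the
other 2.5% fast); change the nominal frequency to anything you like and the
final "percent of a bit" column is unchanged, which is the point.]

#include <stdio.h>

int main(void)
{
    const double nominal_hz = 9.8304e6;   /* try 9.8304e9 - same % results */
    const double divisor    = 1024.0;

    double tx_baud = nominal_hz * 0.975 / divisor;   /* sender 2.5% slow:  9360 */
    double rx_baud = nominal_hz * 1.025 / divisor;   /* receiver 2.5% fast: 9840 */
    double tx_bit  = 1.0 / tx_baud;                  /* sender bit time   */
    double rx_bit  = 1.0 / rx_baud;                  /* receiver bit time */
    double nom_bit = divisor / nominal_hz;           /* nominal bit time  */

    printf("tx = %.0f baud, rx = %.0f baud\n", tx_baud, rx_baud);

    /* The receiver samples bit n at (n + 1.5) of ITS bit times after the
       start edge; the sender puts the middle of that bit at (n + 1.5) of
       ITS own bit times.  n = 0..7 are data bits, n = 8 is the stop bit. */
    for (int n = 0; n <= 8; n++) {
        double sample = (n + 1.5) * rx_bit;
        double centre = (n + 1.5) * tx_bit;
        printf("bit %d: sampled %.4f ms, centre %.4f ms, off by %4.1f%% of a bit\n",
               n, sample * 1e3, centre * 1e3,
               100.0 * (centre - sample) / nom_bit);
    }
    return 0;
}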
> "Overall the Philips LPC900 oscillators are specified with +/- 2.5% > over temp range and voltage range" > > > after 10 bits, regardless of the length of a bit time. > > But the error is *ADDITIVE* fron a per bit basis until the next start bit,
Exactly - your 5% error adds up to 50% (or 47.5%, to be exact) error over the
ten bits transmitted. But this is completely independent of the bit time - it
is a relative error.
> though. I actually have working experience with this one. We had to swap
> crystals because lower baud rates wouldn't work with our embedded system
> with standard PCs because the lower divisors were outside tolerable ranges.
> Yup, it got worse as the baud rate got lower. See my other response for a
> working example.
Are you suggesting that your microcontroller's uart can divide its clock by a
small number without problem, but fails to divide accurately by a larger
number? I think you can be pretty confident that there is some other problem,
such as incorrectly setting the divisor bits. I too have had to change
crystals to get low baud rates, but that is merely because the uart in
question (on an avr8515, IIRC) did not have enough bits in the baud rate
divisor to reach down to 300 baud from an 8 MHz crystal.
> > Think about the sort of speeds async communication runs at. Suppose we say
> > 5% is required for 9600 baud. Does that mean we can run 600 baud at 40%
> > error? Or do we need 0.005% to run Profibus at 12 MBit? To say nothing of
> > the 0.5 ppm accuracy crystals needed for gigabit ethernet...
>
> You're not taking into account that asynchronous communication *HAS* a byte
> synchronization method - a start bit!
Does that mean you think gigabit ethernet needs 0.5 ppm crystals and 600 baud
modems can run with +/- 40% tolerance crystals, or does it mean you agree with
me (and the rest of the world - at least, the tiny part that cares :-) ) that
the actual rate is irrelevant when discussing the percentage error?
> -->Neil
"Alan" <me@somewhere.com.au> wrote in message
news:vmo0i0do3b73veve4ojrrrmt7lh8udae6d@4ax.com...
> On Mon, 16 Aug 2004 09:08:13 +0200, "David Brown"
> <david@no.westcontrol.spam.com> wrote:
>
> [snip]
>
> It's no wonder that there are so many framing errors on Async comms
> when some people don't understand the basic concepts of Async and
> Sync comms.
>
> Sync transmits the clock either directly (on a clock line) or
> indirectly (embedded in the data) so that the receiver knows when to
> look at the data bits.
>
> Async uses the "start" bit of each byte to tell the receiver to start
> timing to look at the bits of this one byte only.
>
> The term Asynchronous here means that the sender can send data at
> any time without having to worry about whether or not the receiver is
> in sync.
Not quite - the term "asynchronous" here means "not synchronous" - i.e., the
opposite of your correct definition of "synchronous". The receiver is *never*
in sync with the sender in async communication, since it does not have a
clock signal on which to synchronize.
> How many designers try to send or receive at, say, 5% tolerance instead
> of aiming for as close to 0% as possible? Makes for interesting
> interfacing when one unit is 5% high and the other 5% low because
> they've been designed by "prima donnas" who know better than the rest
> of the world.
5% tolerance is for the *total* error. Normally that means that you need to be
within 2.5% at each end, unless you happen to know for sure that one end will
have a much tighter tolerance. Of course, only a fool would aim for something
that is only just within the theoretical limit of what is possible! A
realistic rule of thumb would be to aim for a 1% match - any crystal or
ceramic oscillator will do, but an internal oscillator or an R-C oscillator
would need trimming.
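[A sketch of the sort of error budget being discussed here: each end
contributes its oscillator tolerance plus whatever rounding its integer baud
divisor introduces, and the two contributions together should stay comfortably
inside the roughly 5% frame budget. The 16550-style divisor formula and the
example clock/tolerance figures are assumptions for illustration.]

#include <math.h>
#include <stdio.h>

/* Worst-case baud error (in percent) for one end: the rounding error from an
   integer 16550-style divisor (clock / 16 / divisor) plus the oscillator's
   own tolerance. */
static double end_error_pct(double clock_hz, double osc_tol_pct, double target_baud)
{
    double divisor = floor(clock_hz / 16.0 / target_baud + 0.5); /* nearest integer */
    double actual  = clock_hz / (16.0 * divisor);
    return fabs(100.0 * (actual - target_baud) / target_baud) + osc_tol_pct;
}

int main(void)
{
    /* Example: an 11.0592 MHz crystal (0.01%) talking to an 8 MHz RC
       oscillator (2.5%), both aiming at 9600 baud. */
    double a = end_error_pct(11059200.0, 0.01, 9600.0);   /* divisor 72, exact  */
    double b = end_error_pct( 8000000.0, 2.50, 9600.0);   /* divisor 52, +0.16% */
    printf("end A: %.2f%%  end B: %.2f%%  total: %.2f%%  (budget ~5%%)\n",
           a, b, a + b);
    return 0;
}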
> As for crystal versus baud rate tolerances - if the crystal is 1% low
> then the baud rate is 1% low. Simple as that - always allowing for the
> fact that you are using the "correct" frequency crystal to start with
> and allowing for any division errors in the baud rate generator/clock.
That is, of course, correct - I'm slightly stunned that there are people
working in this field who apparently fail to grasp that. Hopefully,
"apparently" is the operative word, and it is merely the wording of their
posts that is ambiguous, rather than their understanding.
On Mon, 16 Aug 2004 13:06:48 +0200, "David Brown"
<david@no.westcontrol.spam.com> wrote:
>> Async uses the "start" bit of each byte to tell the receiver to start
>> timing to look at the bits of this one byte only.
>>
>> The term Asynchronous here means that the sender can send data at
>> any time without having to worry about whether or not the receiver is
>> in sync.
>
>Not quite - the term "asynchronous" here means "not synchronous" - i.e., the
>opposite of your correct definition of "synchronous". The receiver is
>*never* in sync with the sender in async communication, since it does not
>have a clock signal on which to synchronize.
Perhaps I could have explained it better. But the point is that the Async
receiver uses the leading edge of the start bit to trigger its own internal
timing mechanism, which should produce sampling at the correct time for the
incoming data. It is not, as you say, in sync with the incoming data, as it
doesn't have a clock to sync to. However, the internal sampling clock needs to
be less than 5% different from the clock that produced the data to reliably
decode the data.

This is always presuming that the receiving UART (or software) has been
designed properly to sample in the middle of the data bit (in the case of a
single-sample-per-bit UART) or at the correct times for a multi-sample-per-bit
UART. In fact, multi-sample-per-bit UARTs "could" make the tolerance situation
worse!

There is also a third type of synchronous data where the clock is not sent,
but the receiver and the transmitter have to have accurate clocks which are
synchronised by a preamble only.
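[For concreteness, "multi-sample per bit" usually means something like the
sketch below: take a few samples around the expected bit centre (commonly
ticks 7, 8 and 9 of a 16x clock, though that detail is an assumption here) and
majority-vote them. The vote buys some noise immunity, while the outer samples
sit closer to the bit edges than a single centre sample would, which may be
the sense in which it "could" make the tolerance situation worse.]

#include <stdio.h>

/* Two-out-of-three majority vote over samples taken near the middle of a
   bit (e.g. at ticks 7, 8 and 9 of a 16x oversampling clock). */
static int majority3(int s1, int s2, int s3)
{
    return (s1 + s2 + s3) >= 2;
}

int main(void)
{
    /* A single noise spike on an otherwise-high bit is outvoted... */
    printf("%d\n", majority3(1, 0, 1));   /* prints 1 */

    /* ...but two bad samples - noise or an outer sample that has drifted
       into the neighbouring bit - flip the result. */
    printf("%d\n", majority3(0, 0, 1));   /* prints 0 */

    return 0;
}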
>That is, of course, correct - I'm slightly stunned that there are people
>working in this field who apparently fail to grasp that. Hopefully,
>"apparently" is the operative word, and it is merely the wording of
>their posts that is ambiguous, rather than their understanding.
It's unfortunate that there appear to be a (large) number of people out there
who don't seem to know the basics of data transmission and end up writing code
that produces wrong baud rates - especially when it comes to bit-banging.

I always try to get as close to 0% tolerance as possible with baud rates to
cater for all the funny ones.

Alan

++++++++++++++++++++++++++++++++++++++++++
Jenal Communications
Manufacturers and Suppliers of HF Selcall
P O Box 1108, Morley, WA, 6943
Tel: +61 8 9370 5533  Fax +61 8 9467 6146
Web Site: http://www.jenal.com
e-mail: http://www.jenal.com/?p=1
++++++++++++++++++++++++++++++++++++++++++
That was my point earlier although perhaps I did not make it well.
Just because there is a point of synchronization at the edge of the
start bit doesn't make it synchronous comms. Way early in this
thread the OP referred to synchronous comms with reference
to the synchronization at the start bit edge. Since then the thread
has gone off into the weeds of error tolerance between the tx and rx clocks.

Doug

"Richard" <rh86@no.spam> wrote in message news:41205086.C4CEC453@no.spam...
> This discussion is confusing the general term 'synchronized' with the
> comms term 'synchronous'. They are not interchangeable in this context.
>
> [snip]