I can't find this in any datasheet, but I was thinking one reason my packets transfer slowly over the serial link (even on a wired network) is that the individual bits are not in sync. I believe it's true that crystals can experience "drift".
Anyway, my server micro is an AT89LP52 with 10 pF capacitors fitted to each leg of a 22.1184 MHz crystal. My client micro is an AT89S52 with 33 pF capacitors fitted to each leg of another 22.1184 MHz crystal.
I read in various sources that if the SMOD bit is set, the micro samples pin P3.0 (RXD) 16x before it decides whether the bit is a 1 or a 0, and if SMOD is clear the pin is sampled 32x.
I also believe that with serial mode 1 (which I'm using for 115200 bps) TH1 and TL1 are initially set to FFh.
Since there's such a thing as "drift", I think there's a chance that one of the two micros has completed the scan of one bit before the other, and thus things could go out of sync?
So what I want to do, in the event of any packet error (including drift), is somehow reset this whole serial mechanism so that port scanning restarts from zero and the pin is sampled 16x (or 32x) from the start to validate the bit correctly.
Is it enough to switch serial modes and reset timer values? Do I need to set/reset flags? Or do I need to do more than that? I can't find such information in the 8051 bible (the 8051 hardware manuals).
As far as I understand you, this sounds totally wrong. If you are using the on-chip UART, it is designed to over-sample the incoming serial data and effectively processes the incoming stream pretty much in real time; there will be only a short delay between the stop bit edge and the micro setting the data-ready flag. You can easily measure this by adding code that sets a pin when the data byte is received and using a scope to compare the timing with the incoming data. If you don't have a scope, I think you need to get one - your task is made so much harder without one as to be close to impossible.
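To make that scope measurement concrete, a minimal sketch of the "set a pin on receive" idea (SDCC syntax, standard 8051 SFR names from `<8051.h>`; the choice of P1.0 as the marker pin is arbitrary, and this has not been tested on your hardware):

```c
#include <8051.h>          /* SDCC's standard 8051 SFR definitions */

#define MARKER P1_0        /* arbitrary spare pin to watch on the scope */

void serial_isr(void) __interrupt(4)
{
    if (RI) {              /* byte received                              */
        RI = 0;
        MARKER = 1;        /* pulse the marker pin; compare this edge    */
        MARKER = 0;        /* against the stop bit on the other channel  */
    }
    if (TI) TI = 0;        /* clear transmit flag if it fired too        */
}
```

Put the RXD line on one scope channel and P1.0 on the other; the gap between the stop bit and the pulse is the UART's real latency.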
UARTs can work perfectly well with quite large differences in clock frequency - with 10-bit frames (1 start, 8 data, 1 stop) I would expect everything to work with a 3% difference, allowing roughly 1.5% error in each clock.
You can reset the UART between received bytes (disable, then re-enable), but the receiver's bit-sampling alignment automatically resets on the falling edge of each start bit anyway.
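If you do want the explicit reset the question asks about, it amounts to no more than clearing the receive-enable bit and setting it again (SDCC syntax, standard SFR names; a sketch under those assumptions, not tested on silicon):

```c
#include <8051.h>

/* Re-arm the receiver after a packet error. The bit-sampling counter
   realigns itself on the next start-bit edge regardless; this just
   discards any partially received byte.                              */
void uart_rx_reset(void)
{
    REN = 0;    /* disable reception (SCON.4)          */
    RI  = 0;    /* drop any pending received byte      */
    REN = 1;    /* re-enable; wait for next start bit  */
}
```

There is no need to switch serial modes or reload the timer; leave Timer 1 free-running so the baud rate is untouched.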
If you are using crystals, I am sure that clock drift is not the problem. Even your different load-capacitor values will not shift the frequency enough to cause errors from clock mismatch.