EmbeddedRelated.com
Forums

Two Ethernet or One?

Started by like2learn September 28, 2010
I am thinking of developing an embedded device, and the device will
transfer data to a server over an Ethernet interface to the Internet. At
the same time I would like to debug the device remotely. I think I
can use either one Ethernet interface or two for the task.
By using one Ethernet interface, I can use one port number to exchange
data between the device and the server, and another port number for
debugging between the developer and the device, so there should be no
conflict.
By using two Ethernet interfaces, I can assign an IP address to each, so
the debugging task and the data transfer task won't interfere with each
other either.
However, I am not sure which is the more common engineering practice
for this kind of application. What are the pros and cons of these two
solutions? Any insight?
BTW, I may use either DHCP or a static IP to allocate the IP address.
Thank you very much!
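The single-interface option above can be sketched in a few lines. This is a minimal illustration, not a prescription: the port numbers 50700 (data) and 50701 (debug) are hypothetical, and a real device would keep connections open rather than closing them after accept. It shows one IP address serving both tasks on separate TCP ports with Python's `selectors` module:

```python
import selectors
import socket

DATA_PORT = 50700   # hypothetical port for device-to-server data
DEBUG_PORT = 50701  # hypothetical port for remote debugging

sel = selectors.DefaultSelector()

def make_listener(port, label):
    # One IP address, two listening sockets -- the tasks don't conflict.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen()
    srv.setblocking(False)
    sel.register(srv, selectors.EVENT_READ, data=label)
    return srv

def serve_once(timeout=1.0):
    # Accept one pending connection (if any) and report which service
    # received it ("data" or "debug"); a real device would keep it open.
    for key, _ in sel.select(timeout):
        conn, addr = key.fileobj.accept()
        conn.close()
        return key.data
    return None

make_listener(DATA_PORT, "data")
make_listener(DEBUG_PORT, "debug")
```

A client connecting to 50700 is handled as data traffic and one connecting to 50701 as debug traffic, all over the same interface and IP address.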
On Sep 28, 10:07 am, like2learn <gpsabo...@yahoo.com> wrote:
> I am thinking of developing an embedded device, and the device will
> transfer the data to the server via ethernet interface to internet.
> [snip]
The ability to control a device while transferring data from another
port is supported in every TCP/IP stack I've seen for quite some time
now. Simultaneous FTP and Telnet, for example; no wheel re-inventing.

I have had problems with 2-port implementations. For example in one
device, I had 100Mb and GigE on separate ports; the 100Mb for control,
and the GigE for data. The device application would often try to send
the data on the 100Mb port (because the GigE port was busy) and
performance would suck.

I would go with the one port solution.

RK.
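One common way to avoid the wrong-interface problem RK describes is to bind the data socket to the local address of the intended interface before connecting, so the stack cannot route it out the other port. A sketch, with made-up addresses (the GigE local IP and server address below are placeholders):

```python
import socket

# Hypothetical addresses: the GigE interface's local IP and the data server.
GIGE_LOCAL_IP = "192.168.2.10"
DATA_SERVER = ("192.168.2.100", 9000)

def connect_via(local_ip, server):
    # Binding to a specific local address pins the outgoing connection
    # to that interface instead of letting the routing table decide.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind((local_ip, 0))  # port 0: let the OS pick an ephemeral port
    s.connect(server)
    return s

# usage (assumes the addresses above actually exist on your device):
#   data_sock = connect_via(GIGE_LOCAL_IP, DATA_SERVER)
```

This only pins the layer-3 source address; whether traffic truly leaves on the intended physical port still depends on the device's routing setup.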
On Sep 28, 11:17 am, d_s_klein <d_s_kl...@yahoo.com> wrote:
> The ability to control a device while transferring data from another
> port is supported in every TCP/IP stack I've seen for quite some time
> now. Simultaneous FTP and Telnet, for example; no wheel re-inventing.
> [snip]
> I would go with the one port solution.
Thank you for sharing your experience!
I would stick to one port until you have proven the stack and hardware
are robust; then you can move over to 2. We usually only put 2 ports
on when we want to separate the control from the outside world.

joolz



-- 
--------------------------------- --- -- -
Posted with NewsLeecher v4.0 Beta 20
Web @ http://www.newsleecher.com/?usenet
------------------- ----- ---- -- -

> I would stick to one port until you have proven the stack and hardware are
> robust then you can move over to 2, we usually only put 2
> ports on when we want to seperate the control from the outside world.
> [snip]
I may be saying something stupid, but how can he control the bandwidth
sharing between both (debug and control/data)? Using one connection
would reduce the data throughput, no? That could lead to non-real-time
behavior.

Just to make sure, the discussion is whether he should use two physical
ports or two TCP ports, right?

Cya

---------------------------------------
Posted through http://www.EmbeddedRelated.com
On 29.09.2010 11:11, Sink0 wrote:

> I may be saying something stupid, but how he can control the band sharing
> among both (debug and control/data)?
That's what you have an operating system and its TCP/IP stack for, among other things. No need to re-invent every single wheel in the universe.
> Using one connection would reduce the
> data trhoughtput no?
It might. But he didn't give any believable indication of being so
severely bandwidth limited that that would actually make a difference.
Frankly, if he had managed to saturate a 100Mb/s or faster Ethernet link
with payload data, he would surely be well beyond asking the kind of
questions he does.
> That could lead to a non real-time behavior.
All debugging usually does that.
> On 29.09.2010 11:11, Sink0 wrote:
>> I may be saying something stupid, but how he can control the band
>> sharing among both (debug and control/data)?
>
> That's what you have an operating system and its TCP/IP stack for, among
> other things. No need to re-invent every single wheel in the universe.
> [snip]
Very good answers, except the first: I think you did not understand my
point, and that is pure ignorance on my part. Is there any way to hard-share
the network bandwidth, something like "this port CAN'T at any moment use
more than a certain amount of the bandwidth"? Probably there is some way
to do that, I just don't know how.

But after all the answers... yes, one Ethernet would be the best option,
I think.
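When the stack offers no QoS controls, Sink0's "hard share" is usually done in the application with a rate limiter such as a token bucket on the debug stream. A sketch (the 10 kB/s cap and `debug_limit` name are arbitrary examples, not anything from the thread):

```python
import time

class TokenBucket:
    """Caps a stream at `rate` bytes/second, with bursts up to `burst` bytes."""
    def __init__(self, rate, burst):
        self.rate = rate
        self.burst = burst
        self.tokens = burst
        self.last = time.monotonic()

    def consume(self, nbytes):
        # Refill tokens in proportion to elapsed time, then spend if possible.
        now = time.monotonic()
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False  # over budget: caller should wait and retry

# e.g. cap the debug channel at 10 kB/s so data traffic keeps the rest
debug_limit = TokenBucket(rate=10_000, burst=2_000)
```

Before each `send()` on the debug socket, the sender would call `debug_limit.consume(len(chunk))` and back off when it returns False; the data channel is left unthrottled.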
like2learn wrote:
> I am thinking of developing an embedded device, and the device will
> transfer the data to the server via ethernet interface to internet.
> [snip]
Unless you truly *need* two i/f's, stick with one.

Some possible reasons for *needing* two include:
- the pipe for your "normal" interface is "almost full"
- you have particular QoS requirements (that your stack doesn't
  correctly implement)
- the traffic on the "debug" i/f is "sensitive" in nature (and you
  don't want to risk a packet sniffer's attention)
- you are using UDP and worry about the traffic on one wire resulting
  in lots of dropped packets, (etc)

You haven't explained what "this kind of applications" is so it's not
possible to tell you what *is* "the more popular engineering practice"
for your domain.

If you go with two i/f's, try to put them on separate subnets so your
stack doesn't get upset trying to route over a particular i/f (else you
end up with *one* i/f doing all the work and/or other unpredictable
behavior).
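D Yuniskis's separate-subnets advice can be sanity-checked in a few lines with Python's `ipaddress` module. The addressing plan below is a made-up example, not from the thread:

```python
import ipaddress

# Hypothetical addressing plan for a two-interface device.
data_if  = ipaddress.ip_interface("192.168.1.10/24")   # data to the server
debug_if = ipaddress.ip_interface("192.168.2.10/24")   # remote debugging

def on_distinct_subnets(a, b):
    # Overlapping subnets invite the stack to route everything out one
    # interface; distinct networks keep each task on its own wire.
    return not a.network.overlaps(b.network)

print(on_distinct_subnets(data_if, debug_if))  # → True
```

If the two interfaces had been given addresses in the same /24, the check would return False, flagging the routing ambiguity the post warns about.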
On Sep 29, 3:11 am, "Sink0" <sink00@n_o_s_p_a_m.gmail.com> wrote:
> I may be saying something stupid, but how he can control the band sharing
> among both (debug and control/data)? Using one connection would reduce the
> data trhoughtput no? That could lead to a non real-time behavior. Just to
> make sure, the discussion is if he should use two physical ports or two TCP
> ports right?
> [snip]
Hi Cya,

Our application is not a real-time application, since we don't have to
send out the data in real time. However, we do hope the device responds
to control commands in real time. So bandwidth is really not a concern.
Thanks!
On Sep 29, 4:03 pm, D Yuniskis <not.going.to...@seen.com> wrote:
> Unless you truly *need* two i/f's, stick with one.
> [snip]
> If you go with two i/f's, try to put them on separate subnets so your
> stack doesn't get upset trying to route over a particular i/f (else
> you end up with *one* i/f doing all the work and/or other
> unpredictable behavior).
Good points. I will try my best to describe the application: it is a
non-life-critical, not-so-real-time, not-so-bandwidth-hungry type of
application.
