
microsoft.public.dotnet.framework.aspnet.webservices

What way to send large data from .NET to Linux-platform

Jonah Olsson

6/10/2004 12:22:00 PM

Dear All,

I'm currently developing a solution where large numbers of personalised
emails are created (and no, this is not spam...) on the ASP.NET platform and
delivered by a Debian Linux server running qmail and MySQL. Currently the
.NET application just connects to the SMTP port on the Linux server and
sends each mail one by one. This creates an awful lot of traffic and isn't
really a good way of handling >100,000 emails/month.
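Much of that per-message overhead comes from opening a fresh connection for every mail. A minimal sketch of reusing one SMTP session for a whole batch (hostnames, addresses, and message content here are hypothetical, not from the actual system):

```python
# Sketch: reuse a single SMTP connection for the whole batch instead of
# reconnecting per message. All names/addresses below are illustrative.
import smtplib
from email.message import EmailMessage

def build_message(sender: str, recipient: str, name: str) -> EmailMessage:
    """Build one personalised message."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = f"Hello {name}"
    msg.set_content(f"Dear {name},\n\nYour personalised content here.")
    return msg

def send_batch(host: str, sender: str, recipients: list) -> int:
    """Send all messages over one SMTP session; returns the count sent."""
    sent = 0
    with smtplib.SMTP(host) as smtp:  # one TCP connection for the batch
        for addr, name in recipients:
            smtp.send_message(build_message(sender, addr, name))
            sent += 1
    return sent
```

The messages still go one by one over the wire (SMTP requires that), but the TCP and SMTP handshake happens once per batch rather than once per mail.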

I would like a solution where all this data is first prepared on the .NET
platform and then transferred to the Linux platform to be handled and sent.
But how do I make this both secure/reliable and efficient?

So basically I have two questions:

Should I prepare a large XML dataset and ship it to the Linux server to be
handled locally (Perl + MySQL + qmail)? This would need some kind of status
check, since if the Linux server went down, some mail might already have
been sent.
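That status check becomes straightforward if every message in the batch carries a unique id, so the Linux side can report back exactly which ids went out. A sketch under assumed element names (this is not an existing schema):

```python
# Sketch: batch the personalised mails into one XML document with a
# unique id per message, so a status report can say which ids were sent.
# The <mailbatch>/<message>/<status>/<sent> element names are assumptions.
import xml.etree.ElementTree as ET

def build_batch(messages: list) -> str:
    """messages: dicts with 'id', 'to', 'subject', 'body' keys."""
    root = ET.Element("mailbatch")
    for m in messages:
        msg = ET.SubElement(root, "message", id=str(m["id"]))
        ET.SubElement(msg, "to").text = m["to"]
        ET.SubElement(msg, "subject").text = m["subject"]
        ET.SubElement(msg, "body").text = m["body"]
    return ET.tostring(root, encoding="unicode")

def sent_ids(status_xml: str) -> set:
    """Parse a status report from the Linux side: which ids were sent."""
    root = ET.fromstring(status_xml)
    return {e.get("id") for e in root.findall("sent")}
```

After a crash, the .NET side would resend only the ids missing from the last status report, which avoids duplicate deliveries.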

Can I use Web Services here? If so, I suppose I should create two Web
Services: one on the Linux platform to receive the dataset with personalised
emails, and one on the .NET platform to receive status and results.

Am I missing something here? qmail is currently the most reliable part
here, I think, since it basically never loses mail even if the network or
server goes down. But the data sent to qmail might be lost due to network
trouble etc. This is an important part of the problem.

Someone with similar experience?
Thanks for any kind of help/hints!

Best regards
Jonah Olsson
Generation Software



4 Answers

Kevin Spencer

6/10/2004 2:46:00 PM


Hi Jonah,

You could certainly use Web Services, but you only need the service on one
end; the client that consumes the service sits at the other end and calls a
WebMethod on the Web Service. If the service is on the Windows machine, the
method can return the data needed by the Unix machine. However, be aware
that by having the machines use a Web Service to transfer all the emails to
the Unix machine, which then mails them all at once, you may actually cause
MORE total processing and memory usage across both computers than with the
simpler method you're already using, since you're adding an extra SOAP layer
to the process. On the other hand, if one of the machines is under heavy
load, you may be able to balance it out somewhat by using more of the other
machine's resources.
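The single-service shape Kevin describes can be sketched from the Unix side: the service on the Windows box returns the batch, and the Unix box is only a client that pulls it, e.g. from cron. The URL and XML layout here are hypothetical:

```python
# Sketch of the one-service architecture: the Unix side is a plain HTTP
# client that pulls the batch the Windows-side service returns.
# The URL and the <mailbatch>/<message> layout are assumptions.
import urllib.request
import xml.etree.ElementTree as ET

def parse_batch(batch_xml: str) -> list:
    """Turn the returned batch XML into dicts the mailer can loop over."""
    root = ET.fromstring(batch_xml)
    return [{"to": m.findtext("to"), "body": m.findtext("body")}
            for m in root.findall("message")]

def fetch_batch(url: str) -> list:
    """Unix-side client: one pull per run (e.g. scheduled via cron)."""
    with urllib.request.urlopen(url) as resp:
        return parse_batch(resp.read().decode("utf-8"))
```

With this shape there is only one endpoint to secure and monitor; status reporting could reuse the same connection direction as a second pull-style method.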

Good question!

--
HTH,
Kevin Spencer
.Net Developer
Microsoft MVP
Big things are made up
of lots of little things.

"Jonah Olsson" <jonah@IHATESPAM.com> wrote in message
news:eCXzPTtTEHA.2324@TK2MSFTNGP10.phx.gbl...
> Dear All,
>
> I'm currently developing a solution where large amounts of personalised
> emails are being created (and no, this is not spam...) on the ASP.NET
> platform and being delivered by a Debian Linux server running Qmail and
> mySQL. Currently the .NET application just connects to the SMTP-port on
the
> Linux server and sends each mail one by one. This creates an awful lot of
> traffic and isn't really a good way of handling >100.000 emails/month.
>
> I would like a solution where all this data first being prepared on the
.NET
> platform, and then transferred to the Linux platform to be handled and
sent.
> But how should I solve this both secure/reliable and efficient?
>
> So basically I have two questions;
>
> Should I prepare a large XML dataset and ship this to the Linux server to
be
> handled locally (Perl + mySQL + Qmail). This would need some kind of
status
> check since if the Linux server would go down, some mail might already
have
> been sent.
>
> Can I use Web Services here? If so, I suppose I should create two Web
> Services. One on the Linux platform to receive the dataset with
personalised
> emails, and one on the .NET platform to receive status and results.
>
> Am I missing something out here? Qmail is currently the most reliable part
> here I think, since it basically never looses mail even if the network or
> server goes down. But the data sent to Qmail might be lost due to network
> trouble etc. This is an important part of the problem.
>
> Someone with similar experience?
> Thanks for any kind of help/hints!
>
> Best regards
> Jonah Olsson
> Generation Software
>
>
>


Bruce Barker

6/10/2004 7:45:00 PM


It's hard to believe you could come up with something better. SMTP mail is
pretty simple: you do a socket connect and send the data. The SMTP daemon
just writes the data to a directory (after validating the headers). Another
daemon scans the directory for new email and sends it on its way. This is
why spamming is so cheap.

-- bruce (sqlwork.com)


"Jonah Olsson" <jonah@IHATESPAM.com> wrote in message
news:eCXzPTtTEHA.2324@TK2MSFTNGP10.phx.gbl...
> Dear All,
>
> I'm currently developing a solution where large amounts of personalised
> emails are being created (and no, this is not spam...) on the ASP.NET
> platform and being delivered by a Debian Linux server running Qmail and
> mySQL. Currently the .NET application just connects to the SMTP-port on
the
> Linux server and sends each mail one by one. This creates an awful lot of
> traffic and isn't really a good way of handling >100.000 emails/month.
>
> I would like a solution where all this data first being prepared on the
.NET
> platform, and then transferred to the Linux platform to be handled and
sent.
> But how should I solve this both secure/reliable and efficient?
>
> So basically I have two questions;
>
> Should I prepare a large XML dataset and ship this to the Linux server to
be
> handled locally (Perl + mySQL + Qmail). This would need some kind of
status
> check since if the Linux server would go down, some mail might already
have
> been sent.
>
> Can I use Web Services here? If so, I suppose I should create two Web
> Services. One on the Linux platform to receive the dataset with
personalised
> emails, and one on the .NET platform to receive status and results.
>
> Am I missing something out here? Qmail is currently the most reliable part
> here I think, since it basically never looses mail even if the network or
> server goes down. But the data sent to Qmail might be lost due to network
> trouble etc. This is an important part of the problem.
>
> Someone with similar experience?
> Thanks for any kind of help/hints!
>
> Best regards
> Jonah Olsson
> Generation Software
>
>
>


Jonah Olsson

6/14/2004 12:32:00 AM


Hi Kevin and thanks for your reply.
I'm sorry I haven't responded earlier, but I'm on a short vacation.

I now realise that the kind of solution we've been discussing will probably
require a lot more system resources (and development resources as well) than
the current version (or a slightly modified one).
Maybe I should stick to an SMTP connection and let qmail do all the
queuing, as Bruce Barker suggested in his reply?

However, a Web Service would probably be well suited on the .NET server to
receive bounce statistics from the Linux mail server!
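That bounce-reporting direction needs very little machinery: the Linux mailer could POST a plain report and the .NET side only has to parse address/reason pairs. A sketch, where the line format ("address<TAB>reason") is purely an assumption, not anything qmail produces:

```python
# Sketch: parse a bounce report sent back from the Linux mail server.
# The "address<TAB>reason" line format is an assumption for illustration.
def parse_bounce_report(body: str) -> list:
    """Each non-empty line: bounced address, a tab, then the reason."""
    pairs = []
    for line in body.splitlines():
        if line.strip():
            addr, _, reason = line.partition("\t")
            pairs.append((addr, reason))
    return pairs
```

The receiving Web Service would then mark those addresses in the subscriber database so they are skipped in the next batch.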

Thanks!
/Jonah

"Kevin Spencer" <kspencer@takempis.com> wrote in message
news:u$5bGnuTEHA.3752@TK2MSFTNGP12.phx.gbl...
> Hi Jonah,
>
> You could certainly use Web Services, but you only need the service on one
> end. The client that consumes the service is at the other end. It calls a
> WebMethod on the Web Service. If the service is on the Windows machine,
the
> Method can return the data needed by the Unix machine. However, you should
> be aware that by having the computers use a Web Service to send all the
> emails to the Unix machine, and having that machine email them all at
once,
> you may actually be causing a total sum of MORE processing and memory
usage
> across both computers than the simpler method you're already using. You're
> adding an extra SOAP layer to the process. On the other hand, if one or
the
> other of the machines is under heavy load, you may be able to balance it
out
> somewhat by using more of the other machine's resources.
>
> Good question!
>
> --
> HTH,
> Kevin Spencer
> .Net Developer
> Microsoft MVP
> Big things are made up
> of lots of little things.



Jonah Olsson

6/14/2004 12:36:00 AM


Hi Bruce and thanks for your reply.

So basically there will be no trouble sending 30,000+ emails in a row (as
they're being created) to the Linux (mail) server?

/Jonah


"bruce barker" <nospam_brubar@safeco.com> skrev i meddelandet
news:OvnhFLxTEHA.1984@TK2MSFTNGP12.phx.gbl...
> its hard to believe you could come up with something better. SMTP mail is
> pretty simple, you do a socket connect and send the data. the SMTP demon
> just write the data to directory (after validating the headers). another
> demon scans the directory for new email and sends it on its way. this is
why
> spamming is so cheap.
>
> -- bruce (sqlwork.com)
>
>
> "Jonah Olsson" <jonah@IHATESPAM.com> wrote in message
> news:eCXzPTtTEHA.2324@TK2MSFTNGP10.phx.gbl...
> > Dear All,
> >
> > I'm currently developing a solution where large amounts of personalised
> > emails are being created (and no, this is not spam...) on the ASP.NET
> > platform and being delivered by a Debian Linux server running Qmail and
> > mySQL. Currently the .NET application just connects to the SMTP-port on
> the
> > Linux server and sends each mail one by one. This creates an awful lot
of
> > traffic and isn't really a good way of handling >100.000 emails/month.
> >
> > I would like a solution where all this data first being prepared on the
> .NET
> > platform, and then transferred to the Linux platform to be handled and
> sent.
> > But how should I solve this both secure/reliable and efficient?
> >
> > So basically I have two questions;
> >
> > Should I prepare a large XML dataset and ship this to the Linux server
to
> be
> > handled locally (Perl + mySQL + Qmail). This would need some kind of
> status
> > check since if the Linux server would go down, some mail might already
> have
> > been sent.
> >
> > Can I use Web Services here? If so, I suppose I should create two Web
> > Services. One on the Linux platform to receive the dataset with
> personalised
> > emails, and one on the .NET platform to receive status and results.
> >
> > Am I missing something out here? Qmail is currently the most reliable
part
> > here I think, since it basically never looses mail even if the network
or
> > server goes down. But the data sent to Qmail might be lost due to
network
> > trouble etc. This is an important part of the problem.
> >
> > Someone with similar experience?
> > Thanks for any kind of help/hints!
> >
> > Best regards
> > Jonah Olsson
> > Generation Software
> >
> >
> >
>
>