KevinB
5/2/2008 3:38:00 PM
To answer my own question: here is my original code, and below it is what I did to fix my problem.
System.Net.WebClient wc = new System.Net.WebClient();
System.Net.WebProxy wp = new System.Net.WebProxy();
wp.UseDefaultCredentials = true;
wc.Headers.Add("User-Agent", "Mozilla/4.0+");
wc.Proxy = wp;
strm = wc.OpenRead(nextSubURL);
System.IO.StreamReader sr = new System.IO.StreamReader(strm);
System.IO.StringReader strReader = new System.IO.StringReader(sr.ReadToEnd());
the fix:
sr.DiscardBufferedData();
and of course you should always:
sr.Close();
The block

strm = wc.OpenRead(nextSubURL);
System.IO.StreamReader sr = new System.IO.StreamReader(strm);
System.IO.StringReader strReader = new System.IO.StringReader(sr.ReadToEnd());

is in a loop grabbing a different nextSubURL from a table in my Db, so I was
putting a different HTML page into the StreamReader every time; clearing it
solved all my timeout problems. Since the sr object doesn't expose a timeout,
adjusting one wasn't an option.
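For anyone hitting the same thing, here is a minimal sketch of how the loop
might look with the fix applied. LoadUrlsFromDb and the parsing step are
placeholders for my Db query and scraping logic; the proxy and User-Agent
setup are the same as in the code above.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

class Scraper
{
    static void Main()
    {
        WebClient wc = new WebClient();
        WebProxy wp = new WebProxy();
        wp.UseDefaultCredentials = true;
        wc.Headers.Add("User-Agent", "Mozilla/4.0+");
        wc.Proxy = wp;

        // Placeholder: stands in for the real query against the Db table
        // that holds the nextSubURL values.
        List<string> nextSubUrls = LoadUrlsFromDb();

        foreach (string nextSubURL in nextSubUrls)
        {
            Stream strm = wc.OpenRead(nextSubURL);
            StreamReader sr = new StreamReader(strm);
            try
            {
                StringReader strReader = new StringReader(sr.ReadToEnd());
                // ... parse strReader and write the results to the SqlDb ...

                // The fix: clear any buffered data before moving on to the
                // next page.
                sr.DiscardBufferedData();
            }
            finally
            {
                // Always close the reader; this also closes the underlying
                // response stream.
                sr.Close();
            }
        }
    }

    static List<string> LoadUrlsFromDb()
    {
        // Placeholder for the real Db query.
        return new List<string> { "http://example.com/page1" };
    }
}
```

Wrapping the reader in a using block would accomplish the same cleanup as the
try/finally and is a bit tidier.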
It's a neat little Windows app; works great.
--
Kevin C. Brown
Developer
"KevinB" wrote:
> Scott, while I certainly appreciate the obvious answer and I'm absolutely a
> fan of sarcasm, you're not helping.
>
> Thanks anyway.
>
> Kevin
> --
> Kevin C. Brown
> Developer
>
>
> "Scott M." wrote:
>
> > I think it is because occasionally what you are reading takes too long to
> > read. The only thing you can do is increase your timeout.
> >
> > -Scott
> >
> >
> > "KevinB" <Kevin.Brown@DarbyDentalSupply.com> wrote in message
> > news:0CE425DD-3858-4DD7-8EEB-C657FDE85081@microsoft.com...
> > > I'm doing screen scraping from a Windows application and dumping certain
> > > data into a SqlDb. It's one of our parent websites, accessed through a
> > > proxy server using the WebClient class. I have used the
> > > WebRequest/Response classes, but the only benefit I got from that was the
> > > ability to control the timeout on the WebResponse. I can control the
> > > timeout on the proxy server here, and that's all I need. For the issue
> > > I'm having, the code below should suffice:
> > >
> > > System.Net.WebClient wc = new System.Net.WebClient();
> > > System.Net.WebProxy wp = new System.Net.WebProxy();
> > > wp.UseDefaultCredentials = true;
> > > wc.Headers.Add("User-Agent", "Mozilla/4.0+");
> > > wc.Proxy = wp;
> > > strm = wc.OpenRead(nextSubURL);
> > > System.IO.StreamReader sr = new System.IO.StreamReader(strm);
> > > System.IO.StringReader strReader = new System.IO.StringReader(sr.ReadToEnd());
> > >
> > > Here's my problem: eventually I'm getting a timeout on sr.ReadToEnd. I'm
> > > not getting it every time; could it be that some pages being dumped into
> > > the StreamReader are just too big? I know that I can't control that, but
> > > is there a way to stop it from timing out?
> > >
> > > Thanks everyone.
> > >
> > > --
> > > Kevin C. Brown
> > > Developer
> >
> >
> >