
BeeJ

7/24/2012 4:35:00 PM

I am reading and writing large files across my local network and from
drive to drive on the same PC. All my code is written to handle files
greater than 2 GB, typically around 7 GB (e.g. recorded TV).

Is there a recommended buffer size to use for my VB app? Should it be
calculated each time from the disk sector size, or based on available
RAM, or something else? Is there a recommended chunk size, say 10 MB
buffers?

I want the most efficient/fastest transfer rates possible. I am very
disappointed with Windows 7 Pro's Explorer, which only achieves around
8 MB/s disk to disk on the same PC. These are fast SATA drives, the
virus scanner is off, and this is a 3.2 GHz quad-core with 8 GB of RAM.
I can get much higher rates with VB code I have written. I tried both a
native VB6 copy and an API copy; both are much faster than Windows
Explorer, but I still want to optimize the copy.
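
For reference, a minimal sketch of the kind of chunked Win32 API copy
loop in question (the 4 MB buffer size, routine name and error handling
are illustrative only; CreateFile/ReadFile/WriteFile handles also avoid
the 2 GB Long limit of VB6's native Seek/Loc file positions):

' Win32 API declarations for a simple chunked file copy (VB6)
Private Declare Function CreateFile Lib "kernel32" Alias "CreateFileA" ( _
    ByVal lpFileName As String, ByVal dwDesiredAccess As Long, _
    ByVal dwShareMode As Long, ByVal lpSecurityAttributes As Long, _
    ByVal dwCreationDisposition As Long, ByVal dwFlagsAndAttributes As Long, _
    ByVal hTemplateFile As Long) As Long
Private Declare Function ReadFile Lib "kernel32" (ByVal hFile As Long, _
    lpBuffer As Any, ByVal nNumberOfBytesToRead As Long, _
    lpNumberOfBytesRead As Long, ByVal lpOverlapped As Long) As Long
Private Declare Function WriteFile Lib "kernel32" (ByVal hFile As Long, _
    lpBuffer As Any, ByVal nNumberOfBytesToWrite As Long, _
    lpNumberOfBytesWritten As Long, ByVal lpOverlapped As Long) As Long
Private Declare Function CloseHandle Lib "kernel32" (ByVal hObject As Long) As Long

Private Const GENERIC_READ As Long = &H80000000
Private Const GENERIC_WRITE As Long = &H40000000
Private Const OPEN_EXISTING As Long = 3
Private Const CREATE_ALWAYS As Long = 2
Private Const FILE_FLAG_SEQUENTIAL_SCAN As Long = &H8000000
Private Const INVALID_HANDLE_VALUE As Long = -1

Public Sub CopyLargeFile(ByVal srcPath As String, ByVal dstPath As String)
    Const BUF_BYTES As Long = 4& * 1024& * 1024&   ' 4 MB chunk - tune as needed
    Dim hSrc As Long, hDst As Long
    Dim buf() As Byte
    Dim bytesRead As Long, bytesWritten As Long

    ReDim buf(0 To BUF_BYTES - 1)

    ' Hint sequential access so the cache manager reads ahead
    hSrc = CreateFile(srcPath, GENERIC_READ, 0&, 0&, OPEN_EXISTING, _
                      FILE_FLAG_SEQUENTIAL_SCAN, 0&)
    hDst = CreateFile(dstPath, GENERIC_WRITE, 0&, 0&, CREATE_ALWAYS, _
                      FILE_FLAG_SEQUENTIAL_SCAN, 0&)
    If hSrc = INVALID_HANDLE_VALUE Or hDst = INVALID_HANDLE_VALUE Then
        Err.Raise vbObjectError + 1, , "CreateFile failed"
    End If

    ' Copy one buffer-sized chunk at a time until end of file
    Do
        If ReadFile(hSrc, buf(0), BUF_BYTES, bytesRead, 0&) = 0 Then Exit Do
        If bytesRead = 0 Then Exit Do              ' end of file
        WriteFile hDst, buf(0), bytesRead, bytesWritten, 0&
    Loop

    CloseHandle hSrc
    CloseHandle hDst
End Sub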


2 Answers

Clive Lumb

7/26/2012 7:25:00 AM

"BeeJ" <nospam@spamnot.com> a écrit dans le message de groupe de discussion
: jumis9$o7g$1@dont-email.me...
> [quoted text snipped]

Windows - in its default configuration - will always fall over when you copy
a file that is larger than 50% of the installed (and addressable) RAM due to
the destination system running out of cache space.

Try this link
http://blogs.technet.com/b/askperf/archive/2007/05/08/slow-large-file-copy-i...
And this explanation
http://blogs.msdn.com/b/ntdebugging/archive/2007/11/27/too-much-...

Hope this helps
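
The articles point at the system file cache filling up; one commonly
cited workaround is to bypass the cache with unbuffered I/O on the copy.
A rough VB6 sketch of the extra pieces, reusing the Declare statements
from the sketch earlier in the thread (the flag values are the standard
Win32 constants; the sector-size query and rounding helper are
illustrative):

' Extra flags for unbuffered, write-through I/O
Private Const FILE_FLAG_NO_BUFFERING As Long = &H20000000
Private Const FILE_FLAG_WRITE_THROUGH As Long = &H80000000

' FILE_FLAG_NO_BUFFERING requires reads/writes whose size and file offset
' are multiples of the volume sector size, so query the sector size and
' round the buffer size up to it (the final partial chunk then needs
' separate handling, e.g. one last buffered write).
Private Declare Function GetDiskFreeSpace Lib "kernel32" Alias "GetDiskFreeSpaceA" ( _
    ByVal lpRootPathName As String, lpSectorsPerCluster As Long, _
    lpBytesPerSector As Long, lpNumberOfFreeClusters As Long, _
    lpTotalNumberOfClusters As Long) As Long

Public Function SectorAlignedSize(ByVal rootPath As String, ByVal wantedBytes As Long) As Long
    Dim spc As Long, bps As Long, freeClus As Long, totClus As Long
    If GetDiskFreeSpace(rootPath, spc, bps, freeClus, totClus) = 0 Then bps = 4096 ' fallback
    SectorAlignedSize = ((wantedBytes + bps - 1) \ bps) * bps
End Function

' Destination opened without the system cache, e.g.:
' hDst = CreateFile(dstPath, GENERIC_WRITE, 0&, 0&, CREATE_ALWAYS, _
'                   FILE_FLAG_SEQUENTIAL_SCAN Or FILE_FLAG_NO_BUFFERING Or _
'                   FILE_FLAG_WRITE_THROUGH, 0&)

For example, SectorAlignedSize("D:\", 10& * 1024& * 1024&) rounds a 10 MB
request up to the next multiple of the volume's sector size.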

BeeJ

7/28/2012 1:33:00 AM

Clive Lumb explained on 7/26/2012 :
> [quoted text snipped]

Interesting stuff, but then why did I buy a quad-core PC running Win7
64-bit with 8 GB of RAM? You would think that MS would know how to write
software to do this simple task.