Losing connection after large file download

Nov 9, 2012 at 8:38 PM


I am using PowerShell to pull files from our corporate FTP site (in Seattle) to an Azure server (east coast?). Smaller files work fine, but pulling a large file (250 MB) produces an error. The error occurs after the file has finished transferring (verified on the FTP site): "Exception calling "GetFile" with "2" argument(s): "Unable to read data from the transport connection: A connection attempt failed because the connected party did not respond after a period of time...".

The transfer takes just over 10 minutes. I noticed that the FTP server reported it had finished sending the file about a minute before the error was thrown on the client.

There are no errors in the firewall log or the event log, and nothing is logged on the FTP server. I am able to pull the large file across using FileZilla.

The code loops through all the files on the specified FTP path. Once the error occurs all subsequent GetFile calls fail with the same error. 

Are there any timeout settings that may be at play here? Any advice would be appreciated.

Thank you


Nov 9, 2012 at 11:24 PM

Additional info:

This is Non SSL, FTP is IIS.

Here is the code minus the passed in params.

		add-type -path "C:\Scripts\FTPS\AlexPilotti.FTPS.Client.dll"
		$SSLMode = [AlexPilotti.FTPS.Client.ESSLSupportMode]::ClearText

		$c = New-Object "AlexPilotti.FTPS.Client.FTPSClient"
		$cred = New-Object System.Net.NetworkCredential($username, $password)

		# Open the connection
		$c.Connect($ftpServer, $cred, $SSLMode)
		# Get a directory list
		$files = $c.GetDirectoryList($remotePickupDir) | Select-Object Name
		# Get the files
		foreach ($File in $files) {
			Write-Host (Get-Date)
			$from = $remotePickupDir + $File.Name
			$To = $localFromFTPPath + $File.Name
			$c.GetFile($from, $To)
			Write-Host (Get-Date)
		}
		$c.Close()

I referenced the .dll in a C# project and looked at the GetFile method overloads, and noted that some have a timeout parameter. Is that for the SSL/cert functionality?

Thank you


Nov 12, 2012 at 12:30 PM

Hi Roy,

after Connect(...) you can call:

		$c.StartKeepAlive()

Let me know if it works for you.



Nov 12, 2012 at 4:07 PM

Hello Alessandro,

After adding that line I get a message indicating the invocation failed because [AlexPilotti.FTPS.Client.FTPSClient] doesn't contain a method named 'StartKeepAlive'.

I downloaded the FTPS zip on Friday from CodePlex and the .dll has a version of 1.1.0.


Thanks again



Nov 12, 2012 at 4:31 PM

Hi Roy,

You have to fetch the latest sources.

Nov 12, 2012 at 6:16 PM


I downloaded the source, compiled it, and pushed it to the Azure server. I ran the PS script and the first file (Data1.zip, 263 MB) downloaded fine; however, on the second file I received the error "Exception calling "GetFile" with "2" argument(s): "Invalid FTP protocol reply: 226 File sent ok." At C:\Scripts\FTPSfer.ps1:32 char 14". When the script attempted to get the third file, it threw "A connection attempt failed because the connected party did not properly respond after a period of time...". Only the Data1.zip file was pulled from the FTP site.

I am currently trying to pull the full build zipped up as one file to see if that works. If it does I will be OK, but it is curious that it fails on the second and third files. I put StartKeepAlive() before the foreach loop and StopKeepAlive() after the foreach loop, before the Close() call.
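For reference, the placement described above would look like this. This is only a sketch against the same FTPSClient object and the method names already used in this thread; it assumes $files, $remotePickupDir, and $localFromFTPPath are set up as in the earlier script, and it is untested since it needs the DLL and a live FTP server:

		$c.Connect($ftpServer, $cred, $SSLMode)
		$files = $c.GetDirectoryList($remotePickupDir) | Select-Object Name
		# Keep the control connection alive during the long data transfers
		$c.StartKeepAlive()
		foreach ($File in $files) {
			$c.GetFile($remotePickupDir + $File.Name, $localFromFTPPath + $File.Name)
		}
		# Stop the keep-alive before closing the connection
		$c.StopKeepAlive()
		$c.Close()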

Thank you


Nov 12, 2012 at 10:01 PM

Hi, a simple workaround to keep you going is to call Close() and Connect() after each file.
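A sketch of that workaround, fetching the file list once and then reconnecting around each transfer (same object, credentials, and variable names as in the script earlier in the thread; untested, as it needs the DLL and a live server):

		# List the files once, then drop the connection
		$c.Connect($ftpServer, $cred, $SSLMode)
		$files = $c.GetDirectoryList($remotePickupDir) | Select-Object Name
		$c.Close()

		foreach ($File in $files) {
			# Fresh connection per file, so a stale connection from the
			# previous transfer cannot break the next one
			$c.Connect($ftpServer, $cred, $SSLMode)
			$c.GetFile($remotePickupDir + $File.Name, $localFromFTPPath + $File.Name)
			$c.Close()
		}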

Nov 12, 2012 at 10:08 PM

Yes, that will work if I need to move multiple zip files. Since I control the upload from the head office to the FTP site, I can ensure one file is uploaded and transferred to the Azure server.

Thank you for this super .dll. It works much better than the PSFTP module I was originally using.