"Cannot issue a new command while waiting for a previous one to complete"

Sep 22, 2011 at 1:53 PM

Hi,

Hope you can help. This is all in PowerShell.

I get a listing of the remote server's download folder, do some checks, and produce a list of the filenames that I actually need to download. All good so far. However, when I try to loop GetFile over that list, it errors.

[some code snippets as it's quite long]

Add-Type -Path C:\PowerShell\AlexFTPS_bin_1.1.0\AlexFTPS-1.1.0\AlexPilotti.FTPS.Client.dll
$SSLMode = [AlexPilotti.FTPS.Client.ESSLSupportMode]::ClearText
$ftp = new-object "AlexPilotti.FTPS.Client.FTPSClient"
$cred = New-Object System.Net.NetworkCredential($username,$password)
$ftp.Connect($ftpServer,$cred,$SSLMode)
$ftp.SetCurrentDirectory($remoteOutgoing)

foreach ($item in $downloadList) {
    $ftp.GetFile($item)
}

Exception calling "GetFile" with "1" argument(s): "Cannot issue a new command while waiting for a previous one to complete"
At C:\PowerShell\FTP\dev-2.ps1:101 char:14
+     $ftp.GetFile <<<< ($file)
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : DotNetMethodException

Am I missing something obvious? Is there a FileTransferCallback overload I should be using?

Any help will be much appreciated.

Regards,

Gary

Jan 4, 2012 at 11:37 PM

I get the same error with PowerShell. Hopefully someone can provide some insight. This is a great tool, but if I can't solve this I'll need to move to something else.

I'm putting files up to an FTP server, and it works fine for some number of files – fewer than 200 or so, most of the time. Sometimes it will do more. I only found it wasn't working when the user had over 1000 files to move and it failed.

At first I was using *.*, so I tried getting each item and sending them one by one with a time delay. Sometimes with a delay of 50 ms it would transfer all 1000 files; other times it fails even with a delay of 1500 ms. The delay gets it farther, but adding 1500 secs to the process makes it a bit long already.

Here's some code

Connection: it works, and since I'm inside sending to a DMZ, I need an active-mode connection as there is no NAT. That means setting up the full SSL-capable Connect call even though I'm not using SSL (maybe add connection mode to the basic Connect?).

$ftp.Connect($ftpServer, $Port, $cred, $SSLMode, $CertVal, $cert, 1024, 1024, 10, 100, $false, $ConnectMode)

then the put

Get-ChildItem $XferDir | %{
    $FileName = $_.name;
    $ftp.PutFiles($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
    Start-Sleep -Milliseconds 3000
}

I'm only a novice with PS, so handling the exception is the next challenge. Any help is greatly appreciated.

Thanks,

Neil

Coordinator
Jan 5, 2012 at 9:56 AM

Neil,

do you get the "Cannot issue a new command..." exception after you received a previous exception from PutFiles?

An easy workaround is to Close(), re-instantiate the client and Connect() again, but I want to find the source of your issue.

Exception handling with the workaround could be something like:

 

try {
  $ftp.PutFiles(...)
}
catch {
  $ftp.Close()
  $ftp = New-Object ...
  $ftp.Connect(...)
}
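
Filled in with the Connect and PutFiles calls from your earlier post, it would look roughly like this (untested sketch; the variable names are the ones your script already defines):

try {
    $ftp.PutFiles($XferDir, $remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null)
}
catch {
    # Tear down the (possibly stuck) client and rebuild it before retrying the same file
    $ftp.Close()
    $ftp = New-Object 'AlexPilotti.FTPS.Client.FTPSClient'
    $ftp.Connect($ftpServer, $Port, $cred, $SSLMode, $CertVal, $cert, 1024, 1024, 10, 100, $false, $ConnectMode)
    $ftp.PutFiles($XferDir, $remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null)
}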

 

 

Jan 5, 2012 at 4:03 PM

Alex –

Thanks for the reply. I got it to run all the way through with a 3 sec delay between files. Whether that's consistent I don't know yet – it takes a while to run.

Yes, I do get the exception repeatedly for each successive file after the first exception occurs. I had a similar thought on a workaround, but I only closed and connected; I did not re-instantiate the client. That gave exceptions from the 1st iteration. I will add the instantiation and try that, as well as add the error handling.

It feels like the script is queuing up the files, and as it's pushing them something stalls, maybe on the FTP server, and it never recovers because the next file keeps pushing.

You’ll probably get my next update by the time you see this one.

Thanks

Neil


Jan 5, 2012 at 5:52 PM

Alex

The workaround is in place and works fine. It handled the 1000+ files w/o a problem, except on one run the whole thing stopped and I got this error:

Exception calling "PutFiles" with "6" argument(s): "Unable to read data from th

e transport connection: A connection attempt failed because the connected party

did not properly respond after a period of time, or established connection fai

led because connected host has failed to respond."

At C:\Shares\Scripts\FTP\uts\UTS_All_dwgs_outgoing_4.ps1:144 char:21

+ $ftp.PutFiles <<<< ($XferDir,$remoteFileDir, $Fil

eName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);

+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException

+ FullyQualifiedErrorId : DotNetMethodException

I’m guessing that’s the timeout value of 100 that I set – it seems like about the time it took to give up. So now it’s on to the error handling for when this occurs. But a big step forward.

The FTP server and the script execution host are both virtualized, so that may be a factor.

Thanks

Neil


Jan 6, 2012 at 12:50 AM

Alex

Last update for the day here (PST)

I had first created the process outlined in my last email: basically a loop with a new object, connect, put, close.

I then created the script as you suggested, with try and catch implemented. Then I went back and added try and catch to the approach above.

Here’s the basic code from each: $XferDir is the directory holding the source files.

Method 1 – open/close for each file

Get-ChildItem $XferDir | %{
    $FileName = $_.name;
    Try {
        $ftp = new-object 'AlexPilotti.FTPS.Client.FTPSClient'
        $ftp.Connect($ftpServer, $Port, $cred, $SSLMode, $CertVal, $cert, 1024, 1024, 10, 100, $false, $ConnectMode)
        $ftp.PutFiles($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
        $ftp.Close()
    }
    Catch {
        Out-File -FilePath $logFilePath -Append -InputObject $Error[0]
        $TimeStamp = Get-Date
        Out-File -FilePath $logFilePath -Append -InputObject ([String]$QtyFTP + " " + [String]$TimeStamp + " - ERROR CHECK FILE:" + $FileName)
        $ftp.Close()
        $ftp = new-object 'AlexPilotti.FTPS.Client.FTPSClient'
        $ftp.Connect($ftpServer, $Port, $cred, $SSLMode, $CertVal, $cert, 1024, 1024, 10, 100, $false, $ConnectMode)
        $ftp.PutFiles($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
        $ftp.Close()
    }
    $QtyFTP = $QtyFTP + 1
    $TimeStamp = Get-Date
    Out-File -FilePath $logFilePath -Append -InputObject ([String]$QtyFTP + " " + [String]$TimeStamp + " - " + $FileName)
}

Method 2 – open once, then reconnect in each catch to recover.

$ftp = new-object 'AlexPilotti.FTPS.Client.FTPSClient'
$ftp.Connect($ftpServer, $Port, $cred, $SSLMode, $CertVal, $cert, 1024, 1024, 10, 100, $false, $ConnectMode)

Get-ChildItem $XferDir | %{
    $FileName = $_.name;
    Try {
        $ftp.PutFiles($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
    }
    Catch {
        Out-File -FilePath $logFilePath -Append -InputObject $Error[0]
        $TimeStamp = Get-Date
        Out-File -FilePath $logFilePath -Append -InputObject ([String]$QtyFTP + " " + [String]$TimeStamp + " - ERROR CHECK FILE:" + $FileName)
        $ftp.Close()
        $ftp = new-object 'AlexPilotti.FTPS.Client.FTPSClient'
        $ftp.Connect($ftpServer, $Port, $cred, $SSLMode, $CertVal, $cert, 1024, 1024, 10, 10, $false, $ConnectMode)
        $ftp.PutFiles($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
        #$QtyFTP = $QtyFTP - 1
    }
    $QtyFTP = $QtyFTP + 1
    $TimeStamp = Get-Date
    Out-File -FilePath $logFilePath -Append -InputObject ([String]$QtyFTP + " " + [String]$TimeStamp + " - " + $FileName)
}

Both methods 1 and 2 will generate 1 or 2 exceptions on a run of 1200 files. The catch should attempt to upload the same file again; however, many times that file ends up 0 bytes. This always happens with the permission-denied exception. Sometimes the 2nd try writes the file correctly, and one time it wrote a partial file.

Exception calling "PutFiles" with "6" argument(s): "Unable to read data from th

e transport connection: A connection attempt failed because the connected party

did not properly respond after a period of time, or established connection fai

led because connected host has failed to respond."

At C:\Shares\Scripts\FTP\uts\UTS_All_dwgs_outgoing_4.ps1:144 char:21

+ $ftp.PutFiles <<<< ($XferDir,$remoteFileDir, $Fil

eName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);

+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException

+ FullyQualifiedErrorId : DotNetMethodException

Exception calling "PutFiles" with "6" argument(s): "Permission denied."

At C:\Shares\Scripts\FTP\uts\UTS_All_dwgs_outgoing_5.ps1:156 char:21

+ $ftp.PutFiles <<<< ($XferDir,$remoteFileDir, $Fil

eName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);

+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException

+ FullyQualifiedErrorId : DotNetMethodException

Not sure why the permission-denied errors occur – it's not a file security issue – same source and destination directories each time. I don't think it's the same files either, as I have gotten a few runs where all the files go through successfully.

I'll probably shift to method 2, as method 1 is not 100% reliable either and it flaps the connection up and down for each file. Method 2 also seems to generate lower average CPU in Task Manager, probably because it's not driving the connection continuously.

Not sure what to do at this point – maybe put a slight wait in the catch block before putting the file again.
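
Something along these lines in the catch block, maybe (untested; the 3-second pause is just a guess at what the server needs to settle):

Catch {
    Out-File -FilePath $logFilePath -Append -InputObject $Error[0]
    # Give the server a moment to recover before reconnecting and retrying the same file
    Start-Sleep -Seconds 3
    $ftp.Close()
    $ftp = new-object 'AlexPilotti.FTPS.Client.FTPSClient'
    $ftp.Connect($ftpServer, $Port, $cred, $SSLMode, $CertVal, $cert, 1024, 1024, 10, 100, $false, $ConnectMode)
    $ftp.PutFiles($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null)
}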

Best,

Neil


Coordinator
Jan 6, 2012 at 4:44 PM

Neil,

it looks like your FTP server does not handle the amount of traffic that you are requesting. Putting some "sleep" between the Put commands might help.

The reliability in this case is limited not by the client but by the server. I'd choose the following approach:

Connect

function transferfiles(localdir) {
    Get local file and directory list
    Create remote dir if necessary
    cd remotedir
    Loop on each file:
        Issue PutFile (not PutFiles) for each file that you have to transfer. In case of error, sleep and repeat until it succeeds or a max number of failed attempts is reached.
    Loop on each directory:
        transferfiles(dir)
}

Close
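
In PowerShell terms a rough, untested sketch of the per-file retry could look like this. Send-WithRetry is just an illustrative name, and it reuses the PutFiles call from your script (with the exact file name as the pattern) since that overload's signature is already shown above, even though a single PutFile per file would be the cleaner option:

function Send-WithRetry($localDir, $remoteDir, $fileName, $maxAttempts) {
    for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
        try {
            # Upload one file; assign to $null so any return value doesn't pollute the function output
            $null = $ftp.PutFiles($localDir, $remoteDir, $fileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null)
            return $true
        }
        catch {
            Start-Sleep -Seconds 3   # let the server recover before the next attempt
        }
    }
    return $false                    # give up after $maxAttempts failures
}

Get-ChildItem $XferDir | %{
    if (-not (Send-WithRetry $XferDir $remoteFileDir $_.Name 5)) {
        Out-File -FilePath $logFilePath -Append -InputObject ("GAVE UP ON: " + $_.Name)
    }
}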

 

I'm thinking about adding a retry-attempts option to the PutFiles and GetFiles methods.

 

Best,

Alessandro

Coordinator
Jan 6, 2012 at 6:43 PM

Neil,

I solved the "Connect" bug and added a new feature called KeepAlive that basically sends NOOP commands to the server at given intervals.

Your issue could be related to a timeout during PutFile commands. Can you try with the KeepAlive feature on?

All you need to do to start it is call the following method after Connect(...):

 

$c.StartKeepAlive()
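
In your script it would sit right after the connection is opened, something like this (untested, with the same arguments you already pass):

$ftp = new-object 'AlexPilotti.FTPS.Client.FTPSClient'
$ftp.Connect($ftpServer, $Port, $cred, $SSLMode, $CertVal, $cert, 1024, 1024, 10, 100, $false, $ConnectMode)
$ftp.StartKeepAlive()   # periodic NOOPs keep the control connection from timing out

# ... PutFiles loop as before ...

$ftp.Close()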

 

Best,

Alessandro

 

Jan 6, 2012 at 10:13 PM
Supposedly it was a very capable FTP product – they claim all these commercial and government installations, and charge an arm and a leg for the advanced features. I've watched the CPU and memory utilization and it seems to be fairly low, but there may be some other contention with the VM storage backend that causes it to lose it.

I think your suggestion is the right track; I had been thinking along similar lines. When I previously put a big sleep between each put it finally ran, but it took forever. Now that I can catch the errors, a sleep on retry might be reasonable.


Jan 6, 2012 at 10:13 PM
Will do – thanks.


Jan 10, 2012 at 9:34 PM

Alex, sorry, but I got a bit distracted. I guess the fixes are in the code on the source code download page. Could you give me a brief explanation of how to build it? I have Visual Studio 2010 and opened the solution, but the cmd app gave a bunch of errors regarding C5, Plossum, and stuff in the options file. Not really much of a developer these days.

Thx

Neil

Coordinator
Jan 11, 2012 at 10:53 PM

Neil,

I just added the libraries to the repository along with the sources; can you please check now whether you can compile?

Alessandro

Jan 11, 2012 at 10:56 PM

OK – thx. I’ve got the current version with a 3 sec sleep on error working reliably so I will work this in along with some other tasks.


Jan 13, 2012 at 11:35 PM

Alessandro,

OK – I managed to compile and execute the new code. I ran the scripts first without keepalive and then with it.

I still get some errors in either case. If I stop the script without closing the FTP connection I can see the NOOPs coming across on the FTP server side. Here are the exceptions I'm getting with keepalive enabled:

The first number is the file # out of about 1200. Each time it recovered successfully after waiting 3 secs and retrying. I think the problem is on the receiving side, so I think the NOOP only helps when the sending side stalls.

614 01/13/2012 16:14:42 - fpe_09.dwg
Exception calling "PutFiles" with "6" argument(s): "Invalid FTP protocol reply: 200 Command OK."
At C:\Shares\Scripts\FTP\uts\Old\UTS_All_dwgs_outgoing_5.ps1:149 char:21
+ $ftp.PutFiles <<<< ($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
ERROR 01/13/2012 16:14:43 - Retrying File:fpe_10.dwg
615 01/13/2012 16:14:47 - fpe_10.dwg

665 01/13/2012 16:15:06 - fp_02.dwg
Exception calling "PutFiles" with "6" argument(s): "Unable to read data from the transport connection: A non-blocking socket operation could not be completed immediately."
At C:\Shares\Scripts\FTP\uts\Old\UTS_All_dwgs_outgoing_5.ps1:149 char:21
+ $ftp.PutFiles <<<< ($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
ERROR 01/13/2012 16:15:08 - Retrying File:fp_03.dwg
666 01/13/2012 16:15:12 - fp_03.dwg

672 01/13/2012 16:15:17 - fp_09.dwg
Exception calling "PutFiles" with "6" argument(s): "Unable to read data from the transport connection: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond."
At C:\Shares\Scripts\FTP\uts\Old\UTS_All_dwgs_outgoing_5.ps1:149 char:21
+ $ftp.PutFiles <<<< ($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
ERROR 01/13/2012 16:15:18 - Retrying File:fp_10.dwg
673 01/13/2012 16:15:21 - fp_10.dwg

805 01/13/2012 16:16:12 - GLZ_WL3_F05_ELEV.dwg
Exception calling "PutFiles" with "6" argument(s): "Unable to read data from the transport connection: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond."
At C:\Shares\Scripts\FTP\uts\Old\UTS_All_dwgs_outgoing_5.ps1:149 char:21
+ $ftp.PutFiles <<<< ($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
ERROR 01/13/2012 16:16:13 - Retrying File:GLZ_WU1_F01_ELEV.dwg
806 01/13/2012 16:16:16 - GLZ_WU1_F01_ELEV.dwg

851 01/13/2012 16:16:37 - ka_05_Corrugated Glazing.dwg
Exception calling "PutFiles" with "6" argument(s): "Invalid FTP protocol reply: 200 Command OK."
At C:\Shares\Scripts\FTP\uts\Old\UTS_All_dwgs_outgoing_5.ps1:149 char:21
+ $ftp.PutFiles <<<< ($XferDir,$remoteFileDir, $FileName, [AlexPilotti.FTPS.Client.EPatternStyle]::Wildcard, $false, $Null);
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
ERROR 01/13/2012 16:16:37 - Retrying File:ka_05_Ground Floor Glazing.dwg
852 01/13/2012 16:16:40 - ka_05_Ground Floor Glazing.dwg


Mar 22, 2012 at 2:55 PM
Edited Mar 24, 2012 at 12:18 PM

Hi Alex:

Thanks for the great job with the FTP client. It saved me a lot of time.

Delving into the code, I've noticed that when running GetFile(string remoteFileName) it never calls Close() on the Stream, and therefore streamClosedCallback() is never executed. I solved this problem by rewriting the GetFile methods.

        public FTPStream GetFilestream(string remoteFileName)
        {
            // Open the data connection, issue RETR, and hand back the read-only stream
            SetupDataConnection();
            RetrCmd(remoteFileName);
            return EndStreamCommand(FTPStream.EAllowedOperation.Read);
        }

        public byte[] GetFile(string remoteFileName)
        {
            var s = GetFilestream(remoteFileName);
            var buffer = new byte[16 * 1024];
            var ms = new MemoryStream();
            int read;
            while ((read = s.Read(buffer, 0, buffer.Length)) > 0)
            {
                ms.Write(buffer, 0, read);
            }
            // Closing the stream is what makes streamClosedCallback() run
            s.Close();
            return ms.ToArray();
        }

In the other GetFile methods I call GetFilestream instead of GetFile.

I hope this helps you to find the final solution.
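
For what it's worth, once the rewritten client is compiled, calling it from PowerShell could look roughly like this (the file and path names are just placeholders):

# The rewritten GetFile returns the remote file's contents as a byte array
$bytes = $ftp.GetFile("drawings/example.dwg")
[System.IO.File]::WriteAllBytes("C:\Downloads\example.dwg", $bytes)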