Uploading multiple files from Azure Blob Storage to an SFTP server repeatedly fails with socket timeout exceptions:
Renci.SshNet.Common.SshConnectionException:
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=8.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at Renci.SshNet.Session.WaitOnHandle (Renci.SshNet, Version=2024.0.0.0, Culture=neutral, PublicKeyToken=1cee9f8bde3db106)
The failures stem from connection timeouts, even though the SFTP client closes the connection after each upload attempt.
The current approach is:
1. Retrieving multiple files from a blob using a prefix.
2. Iterating through each file, downloading it into a memory stream.
3. Establishing an SFTP connection and uploading the file.
public async Task<bool> DownloadListOfFiles(ILogger log, string prefix, int? segmentSize)
{
    bool isSuccess = true;
    try
    {
        // Call the listing operation and return pages of the specified size.
        var resultSegment = containerClient
            .GetBlobsByHierarchyAsync(prefix: prefix, delimiter: "/")
            .AsPages(default, segmentSize);

        // Enumerate the blobs returned for each page.
        await foreach (Page<BlobHierarchyItem> blobPage in resultSegment)
        {
            // A hierarchical listing may return both virtual directories and blobs.
            foreach (BlobHierarchyItem blobHierarchyItem in blobPage.Values)
            {
                if (blobHierarchyItem.IsPrefix)
                {
                    // Recurse with the prefix to traverse the virtual directory.
                    await DownloadListOfFiles(log, blobHierarchyItem.Prefix, null);
                }
                else
                {
                    var blobClient = containerClient.GetBlobClient(blobHierarchyItem.Blob.Name);
                    string fileName = blobHierarchyItem.Blob.Name;
                    // Note: no space after the slash, or the remote path is wrong.
                    var remoteFilePath = $"SFTPPath/{fileName}";

                    // Download the blob into memory and push it to the SFTP server.
                    using (var stream = new MemoryStream())
                    {
                        await blobClient.DownloadToAsync(stream);
                        await PostToSFTP(stream.ToArray(), log, remoteFilePath);
                    }
                }
            }
        }
    }
    catch (RequestFailedException e)
    {
        log.LogError(e, "Blob listing or download failed for prefix {Prefix}", prefix);
        isSuccess = false;
    }
    return isSuccess;
}
private async Task PostToSFTP(byte[] fileBytes, ILogger log, string remoteFilePath)
{
    // Create an SFTP client using the host, port, username, and password.
    using (var sftpClient = new SftpClient(SFTPHost, SFTPPort, SFTPUserName, SFTPPassword))
    {
        try
        {
            // Configure timeouts before connecting so they apply to the whole session.
            sftpClient.KeepAliveInterval = TimeSpan.FromMinutes(1);
            sftpClient.OperationTimeout = TimeSpan.FromMinutes(1);

            // Connect to the server.
            sftpClient.Connect();

            using (Stream stream = new MemoryStream(fileBytes))
            {
                // Upload the file to the remote path, overwriting any existing file.
                await Task.Run(() => sftpClient.UploadFile(stream, remoteFilePath, true));
            }
        }
        catch (Exception ex)
        {
            // Log and swallow so the caller can continue with the next file.
            log.LogError(ex, "SFTP upload failed for {RemoteFilePath}", remoteFilePath);
        }
        finally
        {
            // Disposing the client also closes the connection; a single
            // Disconnect here is enough (the original called it three times).
            if (sftpClient.IsConnected)
            {
                sftpClient.Disconnect();
            }
        }
    }
}
Uploading 60 small files from blob storage to the SFTP server takes about 5 minutes on average, far longer than expected, and the run is interrupted by multiple socket timeout exceptions, which further hurts reliability.
2 Answers
Solution: After experimenting with several approaches, most of which still produced timeout exceptions, I found one that both succeeded and reduced overall execution time. The key difference is that it skips downloading each file into an intermediate memory stream: the blob's content is streamed directly to the SFTP server. This resolved the timeout issues and noticeably improved throughput.
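A minimal sketch of that direct-streaming approach, assuming the `containerClient` from the question and an already-connected `sftpClient`; `blobName` and `remoteFilePath` are placeholders:

```csharp
// Stream the blob straight to the SFTP server, skipping the intermediate
// MemoryStream and the byte[] copy used in the original code.
var blobClient = containerClient.GetBlobClient(blobName);

// OpenReadAsync returns a readable stream over the blob's content;
// SSH.NET reads from it and writes directly to the remote path.
using (Stream blobStream = await blobClient.OpenReadAsync())
{
    sftpClient.UploadFile(blobStream, remoteFilePath, canOverride: true);
}
```

This avoids buffering each whole file in memory, which matters little for 60 small files but keeps memory flat if file sizes grow.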
Do not create and connect a new SftpClient instance for each file. Create and connect once at the start of the process, then reuse that instance for every upload. Connection setup (TCP handshake, key exchange, authentication) dominates the cost of uploading a small file, so reusing the session is much quicker.
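A sketch of the reuse pattern, assuming the host/credential fields from the question and a hypothetical `blobNames` collection gathered from the listing step:

```csharp
// Connect once, reuse the client for the whole batch, disconnect at the end.
using (var sftpClient = new SftpClient(SFTPHost, SFTPPort, SFTPUserName, SFTPPassword))
{
    sftpClient.OperationTimeout = TimeSpan.FromMinutes(1);
    sftpClient.Connect();                      // one connection for all files

    foreach (string blobName in blobNames)     // hypothetical list from the blob listing
    {
        var blobClient = containerClient.GetBlobClient(blobName);
        using (Stream blobStream = await blobClient.OpenReadAsync())
        {
            sftpClient.UploadFile(blobStream, $"SFTPPath/{blobName}", canOverride: true);
        }
    }

    sftpClient.Disconnect();                   // one teardown instead of sixty
}
```

Combined with the direct streaming from the first answer, this removes both per-file connection overhead and per-file buffering.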