I'm currently using the C# driver (v2.17.1) to connect to a local MongoDB database, then loop through every document and update a field inside each document.
One iteration might take 2 to 5 seconds.
This works up until around 1.4k to 1.5k documents.
After that, I get an exception like the one below:
```
MongoDB.Driver.MongoCursorNotFoundException: Cursor 4299221987179083891 not found on server localhost:27017 using connection 1465.
   at MongoDB.Driver.Core.Operations.AsyncCursor`1.ExecuteGetMoreCommand(IChannelHandle channel, CancellationToken cancellationToken)
   at MongoDB.Driver.Core.Operations.AsyncCursor`1.GetNextBatch(CancellationToken cancellationToken)
   at MongoDB.Driver.Core.Operations.AsyncCursor`1.MoveNext(CancellationToken cancellationToken)
   at MongoDB.Driver.Core.Operations.AsyncCursorEnumerator`1.MoveNext()
```
My C# code looks like below:

```csharp
string connectionString = "mongodb://localhost:27017/";
MongoClient dbClient = new MongoClient(connectionString);
string dbName = "mongodbname";
string collectionName = "mongodbcollectionname";
var db_sample = dbClient.GetDatabase(dbName);
var collection = db_sample.GetCollection<DataModel>(collectionName);

foreach (DataModel cdm in collection.AsQueryable())
{
    // Iterate over and update each document.
}
```
Any help is much appreciated !
Thanks in advance.
2 Answers
This issue can occur when the server takes longer than expected to process the request and the cursor times out. This can happen when iterating over a large number of documents, as you mentioned.
To address this issue, you can iterate over the documents in smaller batches. One way to achieve this is with the `Find` method, using a filter (if needed) and a `Limit` to fetch one batch of documents at a time inside a loop. You process each document in the batch and keep track of the total count. If the returned batch is smaller than the requested batch size, you have reached the end of the collection and can break out of the loop; otherwise, use `Skip` to move to the next batch and continue the iteration. By processing the documents in smaller batches, you reduce the load on the server and avoid the cursor timeout you are experiencing. Adjust the batch size as needed based on the performance of your system.
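A sketch of that batching approach, assuming your `DataModel` class has an `Id` property mapped to `_id` (the field name and value in the update are placeholders for whatever you actually change):

```csharp
using MongoDB.Driver;

var client = new MongoClient("mongodb://localhost:27017/");
var collection = client.GetDatabase("mongodbname")
                       .GetCollection<DataModel>("mongodbcollectionname");

const int batchSize = 100;  // tune this based on how long each iteration takes
int skip = 0;

while (true)
{
    // Each Find call opens a short-lived cursor, so no single cursor
    // has to stay alive for the whole multi-hour run.
    var batch = collection.Find(FilterDefinition<DataModel>.Empty)
                          .Skip(skip)
                          .Limit(batchSize)
                          .ToList();

    foreach (var cdm in batch)
    {
        var filter = Builders<DataModel>.Filter.Eq(d => d.Id, cdm.Id);
        var update = Builders<DataModel>.Update.Set("fieldName", "newValue");
        collection.UpdateOne(filter, update);
    }

    if (batch.Count < batchSize)
        break;  // last (partial) batch: end of collection reached

    skip += batchSize;
}
```

Note that `Skip` gets progressively slower on large collections, since the server still scans the skipped documents; paginating on a sorted `_id` range scales better, but the `Skip`/`Limit` version above matches the approach described here.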
I hope this helps! Let me know if you have any further questions.
The server has a timeout value for cursors, `cursorTimeoutMillis`. You can read about it here. The docs say that by default it should be 10 minutes plus the time before the next `getMore` happens (when you iterate data on the client, most of the time you work through a batch, and only when the batch is exhausted do you call `getMore` to fetch a new one). It's a bit suspicious that it takes 40-50 minutes in your case, but try setting this value to, say, 2 hours and see what happens. Here you can see how to set it. However, I would not recommend using such a long-lived cursor: if something happens to the server during this long operation, a new server election on the server side won't affect your cursor, i.e. the cursor always communicates with the initial server, which may go down at some point.