I have a very small test database with 5 documents of 4 fields each. I want to update these documents while importing a JSON file: add new fields and replace old values with new ones.
Previously, I was able to do this when importing a CSV file.
Code for CSV file:
import time
import pandas as pd

def update_and_add_with_csv(self, data, key):
    """Update all documents in the collection using a CSV file
    (add new columns and change old values). Uses pandas."""
    df = pd.read_csv(data, low_memory=False)
    rows = df.to_dict('records')
    try:
        startTime = time.time()
        for row in rows:
            # Upsert each row, matching on the key column
            self.collection.update_one({key: row.get(key)}, {'$set': row}, upsert=True)
        endTime = time.time()
        totalTime = str('{:>.3f}'.format(endTime - startTime))
    except Exception as e:
        print(e)
How can this be done with JSON?
The JSON file looks like this:
2 Answers
Yes, it works in much the same way. This might be useful to someone.
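For reference, a minimal sketch of the JSON version, assuming the file contains a top-level array of objects (the function name and the open/json.load handling are my additions, not from the original post):

import json
import time

def update_and_add_with_json(self, data, key):
    """Update all documents in the collection using a JSON file
    (add new fields and change old values), mirroring the CSV version."""
    # Assumes the file contains a top-level array of objects,
    # e.g. [{"<key>": ..., "field": "value"}, ...]
    with open(data, encoding='utf-8') as f:
        rows = json.load(f)
    try:
        startTime = time.time()
        for row in rows:
            # Upsert each object, matching on the key field
            self.collection.update_one({key: row.get(key)}, {'$set': row}, upsert=True)
        endTime = time.time()
        totalTime = str('{:>.3f}'.format(endTime - startTime))
    except Exception as e:
        print(e)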
I think the best way to do this is to not update these documents but replace them.
I’m assuming your date fields can be used as unique identifiers.
I'm not sure how your JSON file is formatted, but if it is formatted the same way as your schema, something like the sketch below should work. It also makes it easier to add fields dynamically and takes advantage of MongoDB's schemaless design.
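A rough sketch of that replace approach, assuming the file holds a top-level array of objects and a field such as date serves as the unique key (the method and parameter names here are illustrative):

import json

def replace_with_json(self, data, key='date'):
    """Replace each matching document wholesale instead of $set-updating it.
    key names the field assumed to be the unique identifier."""
    with open(data, encoding='utf-8') as f:
        docs = json.load(f)
    for doc in docs:
        # replace_one swaps the whole stored document for the incoming one,
        # so new fields appear automatically and absent fields are dropped
        self.collection.replace_one({key: doc.get(key)}, doc, upsert=True)

With upsert=True, a document that is not found is inserted. The difference from update_one with $set is that fields missing from the incoming JSON are removed from the stored document rather than kept.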