I'm attempting to import a large CSV dataset into MongoDB using MongoDB Compass. The data originally came from BigQuery via GDELT and was dumped into 40+ CSV files.
Over half of the files fail to import: they get partway through and then simply stop. Compass reports the error "Interior hyphen", and there appears to be no documentation of what this means or why it happens.
At import, a few CSV columns are specified as numeric, but everything else is treated as a string and specified as such in the CSV.
There are documented MongoDB issues when collection names contain a hyphen, but that is not the case here. Has anyone hit this issue and solved it?
2 Answers
While I haven't been able to find an answer, there is a workaround if you need one. For whatever reason, importing these files through pymongo from pandas dataframes does not hit the error. The workaround that gets the data where it needs to be looks like this:
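A minimal sketch of that pipeline, assuming pandas, pymongo, and a local mongod; the folder, database, and collection names are placeholders, and the dtype columns are just GDELT-style examples:

```python
import glob
import os

import pandas as pd
from pymongo import MongoClient

CSV_DIR = "failed_csvs"  # hypothetical folder holding the CSVs that Compass rejected
client = MongoClient("mongodb://localhost:27017")
collection = client["gdelt"]["events"]  # placeholder database/collection names

# Read categorical code columns as strings so a value like "050"
# is preserved as '050' instead of being coerced to the integer 50.
dtypes = {"EventCode": str, "EventBaseCode": str, "EventRootCode": str}

for path in glob.glob(os.path.join(CSV_DIR, "*.csv")):
    df = pd.read_csv(path, dtype=dtypes)
    # to_dict("records") yields one dict per row, ready for insert_many.
    records = df.to_dict("records")
    if records:
        collection.insert_many(records)
```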
This globs the CSVs that failed (which sit in a single folder) and then pins down the datatypes; without the explicit dtypes it throws errors, because categorical variables like the event code are ambiguous. I import them as strings, so a code such as 050 comes in as '050' rather than the integer 50.
You must ensure you select the proper data type during import. MongoDB has both a Timestamp type and a BSON-native Date type. Timestamp is a 64-bit value used mostly for internal purposes; Date is the type you normally want. The error
interior hyphen
will be returned if you select the Timestamp option for a field that actually holds a Date.

[Screenshot: MongoDB Compass field type options]
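If you're writing documents yourself rather than through Compass, the distinction looks like this in pymongo (a sketch with placeholder database and collection names):

```python
from datetime import datetime, timezone

from bson.timestamp import Timestamp
from pymongo import MongoClient

coll = MongoClient("mongodb://localhost:27017")["gdelt"]["type_demo"]

# BSON Date: the type you almost always want for event times.
coll.insert_one({"kind": "date", "when": datetime(2021, 3, 1, tzinfo=timezone.utc)})

# BSON Timestamp: a 64-bit (seconds, increment) pair used internally
# by MongoDB (e.g., in the replication oplog), not for application data.
coll.insert_one({"kind": "timestamp", "when": Timestamp(1614556800, 1)})
```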