
I’ve tried to use the Table Data Import Wizard in MySQL Workbench to load CSV files into MySQL, but my files appear to be too large to import: it gets stuck for hours and eventually fails (see picture). I’ve even tried phpMyAdmin via XAMPP, but it times out even though I’ve raised the execution time limit from seconds to hours!

I need to bring more than 1,048,576 rows of data into MySQL Workbench, which is the row limit in Excel. Does anyone have ideas for another, faster way to import the data?

(I’ve tried the LOAD DATA INFILE option too, but it hasn’t worked; not sure if I’m doing it wrong, so I’m open to suggestions.)

Thanks in advance.

[screenshot of the failed import]

2 Answers


  1. Chosen as BEST ANSWER

    Thanks for your input. I followed your instructions and ran the commands in the terminal, but it comes up with the error below:

    Is there any way to get around this at all?

    Thanks

    [screenshot of the error message]


  2. Given the large file size, and the resources that graphical interfaces like Workbench require (which some machines can’t spare), my suggestion is to close Workbench and all other unnecessary applications, especially GUI ones, and use your machine’s full power to import the data directly from the terminal. Open your terminal (Command Prompt on Windows) and type:

    mysql -u [USERNAME] -p
    

    Replace [USERNAME] with an actual MySQL username, e.g.:

    mysql -u root -p
    

    After pressing Enter you’ll be asked for that MySQL user’s password. Type the password and press Enter again.
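    One common snag worth noting here (an assumption about your setup, since I can’t see the screenshot): on MySQL 8.0 and later, LOAD DATA LOCAL INFILE is disabled by default on both the client and the server, and the import fails with a “Loading local data is disabled” error. A sketch of how to enable it on the client side when connecting:

```shell
# Start the client with the local-infile capability enabled
# (needed for LOAD DATA LOCAL INFILE on MySQL 8.0+, where it is off by default)
mysql --local-infile=1 -u root -p
```

    Once connected, local_infile may also need to be enabled on the server side with `SET GLOBAL local_infile = 1;` (this requires a user with sufficient privileges).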

    Then navigate to the database that you want to insert the data into:

    use [DATABASE];
    

    Replace [DATABASE] with the actual database name.
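    Note that LOAD DATA inserts into an existing table, so the target table must be created first with columns matching the CSV. A minimal sketch, with hypothetical table and column names chosen purely for illustration:

```sql
-- Hypothetical table; adjust the names and types to match your CSV columns
CREATE TABLE sales (
    id INT,
    amount DECIMAL(10,2),
    sale_date DATE
);
```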

    Then you can import the data with a single command:

    load data local infile '[PATH_TO_CSV_FILE]' into table [TABLE_NAME] fields terminated by ',' lines terminated by '\n' ignore 1 rows;
    

    Don’t forget to replace [PATH_TO_CSV_FILE] and [TABLE_NAME] with the actual values.
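    As a filled-in illustration (the file path and table name below are hypothetical), the command would look like:

```sql
-- Import a CSV with a header row into the (hypothetical) sales table
LOAD DATA LOCAL INFILE '/home/user/sales.csv'
INTO TABLE sales
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
```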

    After the command completes you’ll see a message like this:

    Query OK, 1,048,576 rows affected (x.xx sec)
    Records: 1,048,576 Deleted: 0 Skipped: 0 Warnings: 0
    

    After that you can see the changes in Workbench, phpMyAdmin, or by selecting from the database directly.

    To log out, simply type exit or quit.

    Meanings of the command

    • load data: loads the data from the CSV file
    • local: the file is on the local (client) machine
    • into table: inserts into the target table
    • fields terminated by ',': marks the end of each field in the CSV file
    • lines terminated by '\n': marks the end of each line in the CSV file
    • ignore 1 rows: skips the first row of the CSV file (use 1 if the file
      has a header row, otherwise 0)

    Note:

    • This communicates with the MySQL server directly.
    • You won’t have any GUI.
    • There is no extra software or resource overhead in the middle of the process, so it will run as fast as possible.
    • Consider taking a backup of your database before doing this if you’re not familiar with the command line.
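    A backup can be taken with mysqldump before importing (the database name mydb below is hypothetical):

```shell
# Dump the whole database to a file before the import
mysqldump -u root -p mydb > mydb_backup.sql

# Restore it later if something goes wrong
mysql -u root -p mydb < mydb_backup.sql
```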

    We have published a full course on MySQL, available on our website and on YouTube. We also have a video covering this specific case, which you can watch at:

    https://www.youtube.com/watch?v=phXIGogwuzY

    Hope it helped.
    Keep it up DataMaster!
