
I want to create a dev environment of my website on the same server, but I have a 7 GB database containing 479 tables, and I want to make a copy of that database to a new DB.

I have tried this with phpMyAdmin's Operations >> "Copy database to" functionality, but every time it fails and returns the error:

Error in processing request Error code: 500 Error text: Internal Error. 

Please let me know if there is any other method/solution to copy this database to a new database from cPanel. Please advise.

9 Answers


  1. Create an export of your database. This should be easily done through the phpMyAdmin interface. Once you have downloaded the DB export, you need to create a new DB into which you will put your exported data. This, too, should be easily done through the phpMyAdmin user interface.

    To upload it, you cannot use Import -> "Browse your computer" because it has a 2 MB limit. One solution is to use Import -> "Select from the web server upload directory /var/lib/phpMyAdmin/upload/". Upload your exported file to this directory; after that, it should be listed in the dropdown next to that option.
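    Note that the upload directory phpMyAdmin reads from is controlled by its UploadDir setting; if it is not already configured on your installation, a line like the following in config.inc.php enables it (the path below is just an example):

    $cfg['UploadDir'] = '/var/lib/phpMyAdmin/upload';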


    If this fails too, you can use the command line import.

    mysql -u user -p db_name < /path/to/file.sql
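    For completeness, assuming you also have shell access on the source server, the matching command-line export would look like this (user and database names are placeholders):

    mysqldump -u user -p live_db_name > /path/to/file.sql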
    
  2. The easiest way is to try exporting the data from phpMyAdmin. It will create a backup of your data.

    But sometimes, transferring a large amount of data via import/export results in errors.

    You can try mysqldump to back up the data as well (a minimal example is sketched below).

    I found a few links for you here and here.

    This is the mysqldump database backup documentation.
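    For example, a minimal mysqldump backup of a large InnoDB database could look like this (a sketch; user and database names are placeholders, and --single-transaction keeps the dump consistent without locking the tables):

    mysqldump -u <user> -p --single-transaction <db_name> > backup.sql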

    Hope it helps. 😀

  3. I doubt that phpMyAdmin will handle databases of that size (PHP upload/download limits, memory constraints, script execution time).
    If you have access to the console, I would recommend doing the export/import via the MySQL command line:

    Export:

        $ mysqldump -u <user> -p<pass> <liveDatabase> | gzip > export.sql.gz
    

    And Import:

        $ gunzip < export.sql.gz | mysql -u <user> -p<pass> <devDatabase>
    

    after you have created the new dev database in e.g. PHPMyAdmin or via command line.
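    If you need to create that dev database from the command line as well, a statement like this works (the database name and character set are only examples):

        $ mysql -u <user> -p -e "CREATE DATABASE devDatabase CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;"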

    Otherwise, if you only have access to an Apache/PHP environment, I would look for an export utility that splits the export into smaller chunks. MySQLDumper comes to mind, but it’s a few years old and AFAIK it is no longer actively maintained and is not compatible with PHP 7+.
    But I think there is at least a pull request out there that makes it work with PHP7 (untested).

    Edit based on your comment:

    If the export already exists and the error occurs on import, you could try increasing the limits of your PHP environment, either via entries in .htaccess, by changing php.ini, or via ini_set, whichever is available in your environment. The relevant settings, e.g. for setting them via .htaccess, are (keep in mind this only works for Apache environments with mod_php and can also be restricted by your hoster):

          php_value max_execution_time 3600
          php_value post_max_size 8000M
          php_value upload_max_filesize 8000M
          php_value max_input_time 3600
    

    This may or may not work, depending on x32/x64 issues and/or your hoster's restrictions.
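    If .htaccess overrides are ignored in your environment (e.g. PHP running as FPM/CGI), the same limits can be raised directly in php.ini instead (values mirror the ones above; the file location depends on your host):

          max_execution_time = 3600
          post_max_size = 8000M
          upload_max_filesize = 8000M
          max_input_time = 3600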
    Additionally, you need to adjust the phpMyAdmin setting ExecTimeLimit, usually found in config.default.php of your phpMyAdmin installation:
    Replace

          $cfg['ExecTimeLimit'] = 300;
    

    with

          $cfg['ExecTimeLimit'] = 0;
    

    And finally, you probably need to adjust your MySQL config to allow larger packets and get rid of the ‘lost connection’ error:
    In the [mysqld] section of my.ini:

          max_allowed_packet=256M
    
  4. NOTE: I have just read your comment, and as I understand it, you don’t have access to the command line. Please check Solution Two; this will definitely work.

    The only solution that will work for you (which worked for me on a 12 GB database) is to do it directly from the command line:

    Solution One

    mysql -u root -p

    -- Set network buffer length to a large byte number
    set global net_buffer_length=1000000;

    -- Set maximum allowed packet size to a large byte number
    set global max_allowed_packet=1000000000;

    -- Disable foreign key checking to avoid delays, errors and unwanted behavior
    SET foreign_key_checks = 0;

    -- Import your SQL dump file
    source file.sql

    -- Remember to re-enable foreign key checks when the import is complete!
    SET foreign_key_checks = 1;


    If you have root access, you can create a bash script:

    #!/bin/sh 
    
    # store start date to a variable
    imeron=`date`
    
    echo "Import started: OK"
    dumpfile="/home/bob/bobiras.sql"
    
    ddl="set names utf8; "
    ddl="$ddl set global net_buffer_length=1000000;"
    ddl="$ddl set global max_allowed_packet=1000000000; "
    ddl="$ddl SET foreign_key_checks = 0; "
    ddl="$ddl SET UNIQUE_CHECKS = 0; "
    ddl="$ddl SET AUTOCOMMIT = 0; "
    # if your dump file does not create a database, select one
    ddl="$ddl USE jetdb; "
    ddl="$ddl source $dumpfile; "
    ddl="$ddl SET foreign_key_checks = 1; "
    ddl="$ddl SET UNIQUE_CHECKS = 1; "
    ddl="$ddl SET AUTOCOMMIT = 1; "
    ddl="$ddl COMMIT ; "
    
    echo "Import started: OK"
    
    time mysql -h 127.0.0.1 -u root -proot -e "$ddl"
    
    # store end date to a variable
    imeron2=`date`
    
    echo "Start import:$imeron"
    echo "End import:$imeron2"
    

    Source

    Solution Two

    Also, there is another option which is very good for those who are on shared hosting and don’t have command line access. This solution worked for me on 4-5GB files:

    1. MySQL Dumper: Download (you will be able to back up/restore SQL files directly from MySQL Dumper; you don’t need phpMyAdmin anymore).
    2. Big Dump: Download (it restores from a compressed file or a plain SQL file; for big imports you need to edit the BigDump PHP file and change $linespersession = 3000; to $linespersession = 30000; as shown in the sketch below).
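    As a rough sketch, the settings near the top of bigdump.php that usually need editing look like this (variable names as in recent BigDump versions, so verify against your copy; all values are placeholders):

    $db_server   = 'localhost';
    $db_name     = '<new_db_name>';
    $db_username = '<db_user>';
    $db_password = '<db_password>';
    $linespersession = 30000;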

    Solution Three:

    This solution definitely works; it is slow, but it works.

    Download the trial version (32- or 64-bit) of Navicat for MySQL, version 12.

    Install it and run it as a trial.

    After that, add your computer's IP (your public internet IP, not your local IP) to Remote MySQL in cPanel (on the new database/hosting). You can use a wildcard IP in cPanel to allow MySQL access from any IP.

    Go to Navicat MySQL: click on Connection and enter a connection name.

    Next, in “Hostname/IP”, add your hosting IP address (don’t use localhost).
    Leave the port as it is (if your host defined a different port, put that one here).

    Add your database username and password.

    Click Test Connection; if it’s successful, click “OK”.

    Now, on the main screen, you will see all the databases for that user in the left-hand column.

    Double-click the database into which you want to import the SQL file:

    The icon color of the database will change and you will see “Tables/Views/Functions etc.”.

    Now right-click the database and select “Execute SQL file” (http://prntscr.com/gs6ef1).
    Choose the file, choose “continue on error” if you want to, and finally run it. It takes some time depending on your network connection speed and computer performance.

  5. You can use mysqldump as follows:

    mysqldump --user=<user> --password=<password> --default-character-set=utf8 <database_name> > <backup_file>.sql
    

    You can also make use of my shell script, which I actually wrote long back for creating backups of a MySQL database on a regular basis using a cron job.

    #!/bin/sh
    now="$(date +'%d_%m_%Y_%H_%M_%S')"
    filename="db_backup_$now".gz
    # adjust this placeholder to your backup directory
    backupfolder="<path-to-backup-folder>"
    fullpathbackupfile="$backupfolder/$filename"
    logfile="$backupfolder/"backup_log_"$(date +'%Y_%m')".txt
    echo "mysqldump started at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
    mysqldump --user=<user> --password=<password> --default-character-set=utf8 <db_name> | gzip > "$fullpathbackupfile"
    echo "mysqldump finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
    chown <user> "$fullpathbackupfile"
    chown <user> "$logfile"
    echo "file permission changed" >> "$logfile"
    # delete backups older than two days
    find "$backupfolder" -name 'db_backup_*' -mtime +2 -exec rm {} \;
    echo "old files deleted" >> "$logfile"
    echo "operation finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
    echo "*****************" >> "$logfile"
    exit 0
    

    I have already written an article on scheduling MySQL database backups on cPanel or Linux.
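    For reference, a crontab entry that runs such a script every night at 2 AM might look like this (the script path is only an example):

    0 2 * * * /home/<user>/scripts/db_backup.sh > /dev/null 2>&1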

  6. Here’s how I handled that problem when I faced it… Unfortunately, this only works for macOS.

    • Download Sequel Pro – Completely free, and it has worked really well for me for over a year now.
    • Remotely connect to your server’s database. You will probably need to add your IP address to the “Remote MySQL” section in cPanel. If you don’t have the credentials, you can probably get them from your website’s config file.
    • Once you’re in the server, you can select all of your tables, secondary-click, and select Export > As SQL Dump. You probably won’t need to edit any of the settings. Click “Export”.
    • Log in to your local server’s database, and select “Query” from the top menu.
    • Drag and drop the file that was downloaded from the export and it will automatically setup the database from the sql dump.

    I hope this helps. It’s a bit of a workaround, but it has worked really well for me, especially when PMA has failed.

  7. Since the requirements include PHPMyAdmin, my suggestion is to:

    1. select the database you need
    2. go to the “Export” tab
    3. click the “Custom – display all possible options” radio button
    4. in the “Save output to a file” radio button options, select “gzipped” for “Compression:”
    5. Remove the “Display comments” tick (to save some space)
    6. Finish the export

    Then try to import the generated file into the new database you have (if you have sufficient resources, this should be possible).

    Note: My previous experience shows that using compression allows larger DB export/import operations, but I have not tested what the upper limit is in shared hosting environments (I assume yours is one, based on your comment about cPanel).

    Edit: When your export file is created, select the new database (assuming it is already created), go to the “Import” tab, select the file created from the export and start the import process.

  8. Limited to phpMyAdmin? Don’t do it all at once

    Large data sets shouldn’t be dumped (unless it’s for a backup); instead, export the database without data, then copy one table at a time (DB to DB directly).

    Export/Import Schema

    First, export only the database schema via phpMyAdmin (uncheck data in the export options). Then import that into a new database name.

    Alternatively, once you’ve created the DB, you could use statements like the one below to recreate each table. The catch with this method is that you’re likely to lose constraints, stored procedures, and the like.

    CREATE TABLE `devDB`.`table_name` LIKE `prodDB`.`table_name`;

    Copy data, one table at a time.

    Use a good editor to create the 479 INSERT statements you need. Start with a list of table names and use good old find-and-replace (or generate them with the query sketched below).

    INSERT INTO `devDB`.`table_name` SELECT * FROM `prodDB`.`table_name`;

    This may choke, depending on your environment. If it does, drop and recreate the dev database (or empty all tables via phpMyAdmin). Then, run the INSERT commands a few tables at a time.
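    If hand-editing 479 statements is too tedious, a query against information_schema can generate them for you (a sketch; prodDB and devDB are placeholders for your live and dev database names). Run it, then copy the generated rows back into the SQL tab:

    SELECT CONCAT('INSERT INTO `devDB`.`', table_name, '` SELECT * FROM `prodDB`.`', table_name, '`;')
    FROM information_schema.tables
    WHERE table_schema = 'prodDB'
      AND table_type = 'BASE TABLE';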

    Database Administration requires CLI

    The real problem you’re facing here is that you’re trying to do database administration without access to the command-line interface. There are significant complications in migrating large sets of data efficiently, most of which can only be solved using tools like mysqldump.

  9. If you have your database on your local server, you can export it and use BigDump to insert it into the new database on the remote server: BigDump
