Tuesday, February 8, 2011

How do I split the output from mysqldump into smaller files?

I need to move entire tables from one MySQL database to another. I don't have full access to the second one, only phpMyAdmin access. I can only upload (compressed) sql files smaller than 2MB. But the compressed output from a mysqldump of the first database's tables is larger than 10MB.

Is there a way to split the output from mysqldump into smaller files? I cannot use split(1) since I cannot cat(1) the files back on the remote server.

Or is there another solution I have missed?

Edit

The --extended-insert=FALSE option to mysqldump suggested by the first poster yields a .sql file that split(1) can then cut into importable pieces, provided it is called with a suitable --lines value. By trial and error I found that bzip2 compresses these .sql files by a factor of about 20, so I needed to figure out roughly how many lines of SQL correspond to 40MB of uncompressed text.
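A hedged sketch of that splitting step (the file name `dump.sql` and GNU coreutils are assumptions; `--line-bytes` caps each piece at a byte size without breaking a line, which spares the trial and error over `--lines`):

```shell
# dump.sql: output of mysqldump --extended-insert=FALSE
# 40MB uncompressed ~= 2MB after bzip2 (the ~20x factor observed above)
split --line-bytes=40m dump.sql dump_part_
for f in dump_part_*; do
    bzip2 "$f"      # yields dump_part_aa.bz2, dump_part_ab.bz2, ...
done
```

Because every piece ends on a complete line, each one contains only whole INSERT statements and can be uploaded and imported on its own (once the schema is in place).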

  • First dump the schema (it surely fits in 2MB, no?)

    mysqldump -d --all-databases
    

    and restore it.

    Afterwards dump only the data, as separate INSERT statements, so you can split the files and restore them without having to concatenate them on the remote server:

    mysqldump --all-databases --extended-insert=FALSE --no-create-info=TRUE
    
  • Try csplit(1) to cut up the output into the individual tables based on regular expressions (matching the table boundary I would think).

    From jj33
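    For the csplit route, a minimal sketch (the comment pattern is an assumption about what mysqldump emits before each table's data; check your dump and adjust the regex to match):

```shell
# cut dump.sql into one file per table at mysqldump's per-table comment;
# -z drops the empty piece before the first match, {*} repeats to EOF
csplit -z dump.sql '/^-- Dumping data for table/' '{*}'
```

    The pieces come out as xx00, xx01, ..., and concatenating them in order reproduces the original dump.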
  • You can dump individual tables with mysqldump by running mysqldump database table1 table2 ... tableN

    If none of the tables are too large, that will be enough. Otherwise, you'll have to start splitting the data in the larger tables.

    From skoob
  • You say that you don't have access to the second server. But if you have shell access to the first server, where the tables are, you can split your dump by table:

    for T in $(mysql [connecting_options] -N -B -e 'SHOW TABLES FROM dbname'); do
        echo "$T"
        mysqldump [connecting_options] dbname "$T" \
            | gzip -c > "dbname_$T.dump.gz"
    done

    This will create a gzip file for each table.
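    As a sanity check before uploading, a small sketch that flags any piece over the cap (the `*.dump.gz` glob matches the loop above; the 2MB figure is the question's upload limit):

```shell
# warn about any compressed piece that exceeds the 2MB upload limit
limit=$((2 * 1024 * 1024))
for f in *.dump.gz; do
    size=$(wc -c < "$f")
    if [ "$size" -gt "$limit" ]; then
        echo "$f: $size bytes, over the ${limit}-byte limit" >&2
    fi
done
```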

    Another way of splitting the output of mysqldump in separate files is using the --tab option.

    mysqldump [connecting options] --tab=directory_name dbname 

    where directory_name is the name of an empty directory that the MySQL server itself can write to (the server, not the client, writes the data files, so this needs the FILE privilege and a server-local path). The command creates a .sql file for each table, containing the CREATE TABLE statement, and a .txt file containing the data, to be restored with LOAD DATA INFILE. I am not sure phpMyAdmin can handle these files under your particular restriction, though.

  • Using phpMyAdmin for anything that modifies the database is a disaster. It is not possible to restore arbitrary backups through phpMyAdmin; it will potentially fail in unpredictable and perhaps silent ways.

    Without proper access, you cannot do anything useful. If your database is reachable only through phpMyAdmin, you may as well not have a database at all. Sorry.

    Get shell access on the box. Get access to do stuff from the command line without any arbitrary (time, file size) limits. Then you might be in an environment where you can get useful work done. Just don't accept anything less.

    From MarkR
  • You don't need ssh access to either of your servers. The mysql and mysqldump clients are enough: you can dump your database and import it again in one step.

    On your PC, you can do something like:

    $ mysqldump -u originaluser -poriginalpassword -h originalhost originaldatabase | mysql -u newuser -pnewpassword -h newhost newdatabase

    and you're done. :-)

    Hope this helps.

    From davi
  • Check out SQLDumpSplitter 2; I just used it to split a 40MB dump successfully. You can get it at the link below:

    http://websitebuildersresource.com/2008/11/26/great-mysql-mysqldump-split-utility/

    Hope this helps.

    Thanks.

    Nick

    nickgs.com

    From direct
