How to Export and Import MySQL Databases via SSH
Last Updated: 2025-01-01
Overview
Exporting and importing MySQL databases via SSH is faster and more reliable than using phpMyAdmin, especially for large databases. The mysqldump utility creates a SQL dump file, and the mysql client imports it.
Exporting a Database (mysqldump)
Basic export
mysqldump -u username -p database_name > backup.sql
You’ll be prompted for your MySQL password. This creates a backup.sql file containing all tables, data, and structure.
Export with common recommended flags
mysqldump -u username -p --single-transaction --quick --lock-tables=false database_name > backup.sql
| Flag | Purpose |
|---|---|
| --single-transaction | Consistent snapshot without locking (InnoDB) |
| --quick | Retrieves rows one at a time, reducing memory usage |
| --lock-tables=false | Avoids locking tables during the dump |
Export and compress simultaneously
For large databases, pipe directly to gzip:
mysqldump -u username -p database_name | gzip > backup.sql.gz
This saves significant disk space and time.
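Before deleting the uncompressed original, it is worth confirming the archive is intact. A minimal sketch; the printf line is a stand-in for a real dump so the example is self-contained:

```shell
# Stand-in contents; in practice this file comes from mysqldump
printf 'CREATE TABLE t (id INT);\nINSERT INTO t VALUES (1);\n' > backup.sql
gzip -kf backup.sql                          # -k keeps backup.sql alongside backup.sql.gz
gzip -t backup.sql.gz && echo "archive OK"   # -t tests integrity without extracting
```

If `gzip -t` exits non-zero, the archive is corrupt and the export should be repeated.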
Export a specific table
mysqldump -u username -p database_name table_name > table_backup.sql
Export only the structure (no data)
mysqldump -u username -p --no-data database_name > structure.sql
Export only the data (no structure)
mysqldump -u username -p --no-create-info database_name > data.sql
Importing a Database
Basic import
mysql -u username -p database_name < backup.sql
Important: The target database must already exist. Create it first if needed:
mysql -u username -p -e "CREATE DATABASE database_name;"
Import a gzip-compressed dump
gunzip < backup.sql.gz | mysql -u username -p database_name
Or equivalently:
zcat backup.sql.gz | mysql -u username -p database_name
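Both forms decompress to an identical byte stream. A quick way to convince yourself, using a stand-in dump file rather than a real one:

```shell
# Build a small stand-in dump and compress it
printf 'SELECT 1;\n' > backup.sql
gzip -kf backup.sql
gunzip < backup.sql.gz > via_gunzip.sql
zcat backup.sql.gz > via_zcat.sql            # zcat is named gzcat on some BSD/macOS systems
cmp via_gunzip.sql via_zcat.sql && echo "streams identical"
```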
Import with progress monitoring (for large files)
Install pv (pipe viewer), e.g. apt install pv on Debian/Ubuntu, for a progress bar:
pv backup.sql | mysql -u username -p database_name
For compressed files:
pv backup.sql.gz | gunzip | mysql -u username -p database_name
Handling Large Databases
Large databases (1 GB+) require extra care:
Use screen for long imports
screen -S db-import
mysql -u username -p database_name < large-backup.sql
# Press Ctrl+A, then D to detach; the import continues in the background
# Reattach later with: screen -r db-import
Increase timeout settings
If large imports time out or fail with packet errors, raise the client's packet limit:
mysql -u username -p --max_allowed_packet=512M database_name < backup.sql
Split large dumps
Export individual tables if the full dump is too large to handle:
mysqldump -u username -p database_name table1 table2 > partial.sql
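If table-by-table exports are still unwieldy, a single oversized dump file can also be split into fixed-size pieces and streamed back together at import time. A sketch using a generated stand-in file; in practice you would pipe the concatenation into mysql rather than a file:

```shell
# Generate a stand-in dump so the example is self-contained
seq 1 5000 | sed 's/.*/INSERT INTO t VALUES (&);/' > large-backup.sql
split -b 16k large-backup.sql chunk_    # pieces named chunk_aa, chunk_ab, ...
cat chunk_* > reassembled.sql           # concatenating the pieces restores the original
cmp large-backup.sql reassembled.sql && echo "chunks match"
# A real import would be: cat chunk_* | mysql -u username -p database_name
```

This is handy when individual files must stay under a transfer or filesystem size limit.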
Common Flags Reference
mysqldump flags
| Flag | Description |
|---|---|
| --single-transaction | Consistent dump without locking (InnoDB) |
| --quick | Memory-efficient row-by-row retrieval |
| --routines | Include stored procedures and functions |
| --triggers | Include triggers (enabled by default) |
| --all-databases | Dump all databases |
| --no-data | Structure only, no data |
| --no-create-info | Data only, no structure |
| --add-drop-table | Add DROP TABLE before each CREATE (default) |
mysql flags
| Flag | Description |
|---|---|
| -u | MySQL username |
| -p | Prompt for password |
| -h | MySQL host (default: localhost) |
| -P | MySQL port (default: 3306) |
| -e "QUERY" | Execute a query without entering the MySQL shell |
Tips
- Always create a backup before importing; overwriting data is irreversible.
- Use --single-transaction for InnoDB tables to avoid locking your live site during export.
- Compress dumps with gzip to reduce file size by 70-90%.
- Use screen for imports that take more than a few minutes to protect against SSH disconnections.
- Store backups outside your web root for security.
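When moving dumps between servers (for example with scp or rsync over SSH), a checksum recorded before transfer lets you verify the copy on the other side. A minimal sketch with a stand-in dump; sha256sum is the GNU coreutils tool, named shasum -a 256 on macOS:

```shell
# Stand-in for a real dump file
printf 'SELECT 1;\n' > backup.sql
sha256sum backup.sql > backup.sql.sha256   # record the hash next to the dump
sha256sum -c backup.sql.sha256             # re-run this on the destination to verify
```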
Tags:
ssh
linux
mysql
database
backup