Currently, every day before I start work, I complete the following procedure:
- ssh to the production server
- gzip our daily database dump file
- scp the gzipped dump file over to my computer
- gunzip the dump file
- dropdb mydatabase
- createdb mydatabase
- psql mydatabase < dump.sql
Is it possible (I'm sure it is) to automate this process on Mac OS X, so that it is done by the time I get to work in the morning? If so, what is the quickest and easiest way?
-
Yeah - script and schedule the gzip/scp (or better yet, make it rsync) portion from your server. If the server is also Mac OS, I'm not sure of the detailed steps, but that's the outline of what you'd want.
Then, if you know it will always be done by a certain time, just schedule a script on your computer to gunzip/dropdb/createdb/psql. If it's not always done by the same time, then you could do something fancy with checking file status, modified times, etc.
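As a rough sketch of that kind of check (the dump path and the 60-minute window below are assumptions, not anything from your setup), a local script could refuse to import unless the file looks fresh:

#!/bin/sh
# Sketch: only run the local import if the dump file looks recent.
DUMP="$HOME/dump.sql.gz"

if [ -n "$(find "$DUMP" -mmin -60 2>/dev/null)" ]; then
    echo "Dump is recent, importing..."
    # gunzip / dropdb / createdb / psql steps would go here
else
    echo "No fresh dump found, skipping import." >&2
    exit 1
fi

BSD find on the Mac supports -mmin, so the test prints the path only if the file was modified within the last hour.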
What parts of the above are you unsure of, or feel you need more detail on? Are you new to scripting, or not sure how to schedule things with cron or what-have-you?
Does Postgres support any sort of log-shipping or mirroring natively? If so, that might be a better all-in-one solution.
Russ : I'm not sure if Postgres does log shipping or mirroring natively. I couldn't find a crontab on the Mac; we use crontabs on our servers and that would seem like an ideal solution, but I'm not sure how to begin scripting that.
From mfinni
-
To allow ssh login to your production server without typing the password, you'll need to set up non-interactive authentication if it's not already in place.
http://www.walkernews.net/2008/06/06/how-to-setup-non-interactive-ssh-login/
http://troy.jdmz.net/rsync/index.html
Then it's basically a matter of adding the commands for each step to a shell script and setting cron to run that shell script every morning.
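In rough outline (user@some.server.net is a placeholder for your production account), the key-based setup the links above describe looks something like this:

# On your Mac: generate a key pair (press Enter at the passphrase prompts
# if you want fully unattended logins).
ssh-keygen -t rsa

# Append the public key to the server account's authorized_keys.
cat ~/.ssh/id_rsa.pub | ssh user@some.server.net 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'

# Test: this should run without asking for a password.
ssh user@some.server.net hostname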
Russ : As I mentioned on the other answer, I couldn't find an `/etc/crontab` file on the Mac. Is it in a different location?
Loopo : "crontab -e" should hopefully create one for you.
From Loopo
-
First you will need to set up your ssh server so that it accepts RSA keys instead of a password. This is documented elsewhere. Then you will need to give the account you plan to use the appropriate DSA/RSA key, which will allow that account to log in automatically.
* ssh to the production server
This is not necessary. You can execute commands on the remote server directly by passing them as arguments to ssh. For example, the next two steps you mention...
* gzip our daily database dump file
* scp the gzipped dump file over to my computer
...would look similar to this:
ssh user@some.server.net 'gzip -c dump > dump.gz'
scp user@some.server.net:/path/to/dump/dump.gz /path/to/destination/dump.gz
Place each command just mentioned, along with the rest of the commands you are manually typing, into a single file. The commands must be in the same sequence that you would normally type them in. We'll call it
dump-script.sh
for this example. Change the ownership and permissions as needed, so that the file will execute.
WARNING:
There is a very high possibility of data loss occurring if you do not perform error checking in your script. Until you add error handling to the script, there is no guarantee that it will work correctly 100% of the time. With the command to drop your database, it is quite easy under the right circumstances to lose all of the data on the machine that is receiving the dump. You will most likely want to expand and embellish your script to allow for error detection, so that it will only DROP DATABASE and import the new data when the transfer is successful. If you are not comfortable with this, do not run it, nor create the crontab entries.
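A minimal sketch of such a script, using set -e as a crude first pass at error handling (the hostname, paths and database name are placeholders; a real script should check each step far more carefully):

#!/bin/sh
# dump-script.sh -- rough sketch of the nightly refresh; adjust paths and hosts.
set -e   # abort at the first command that fails

REMOTE=user@some.server.net
REMOTE_DUMP=/path/to/dump/dump.sql
LOCAL_DUMP=/path/to/destination/dump.sql

# Compress on the server, then pull the file over.
ssh "$REMOTE" "gzip -c $REMOTE_DUMP > $REMOTE_DUMP.gz"
scp "$REMOTE:$REMOTE_DUMP.gz" "$LOCAL_DUMP.gz"

# Unpack locally. Because of set -e, the database is only dropped if every
# step above succeeded.
gunzip -f "$LOCAL_DUMP.gz"

dropdb mydatabase
createdb mydatabase
psql mydatabase < "$LOCAL_DUMP"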
Once you have your script debugged and you can confirm that it will handle errors correctly, you can consider adding it to a regular cron schedule. While logged in as the account that will perform these actions, type:
crontab -e
...which should get you an editor. Create an entry like this:
# Nightly Database Dump, at Midnight
0 0 * * * /path/to/my/dump-script.sh
...then type
:wq
in that order, and press Enter (assuming that
vi
is your default text editor).
A better solution: If you have network access to the PostgreSQL database, why not just dump the database over the network using a remote connection? This is much cleaner, and can be combined with a little bit of scripting to make it a direct 'pipe' of data from one to the other.
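For instance, assuming the production server accepts remote PostgreSQL connections and the password is handled via ~/.pgpass (both assumptions about your setup), the whole refresh could collapse to roughly:

# Recreate the local copy and stream the dump straight over the network.
dropdb mydatabase
createdb mydatabase
pg_dump -h some.server.net -U user mydatabase | psql mydatabase

This avoids the intermediate dump file and the gzip/scp/gunzip steps entirely.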
From Avery Payne