I'd like to periodically archive database dumps to S3 in my own AWS account, eventually moving them to Glacier. Is there a way to dump the PostgreSQL database to the filesystem from inside a dyno (from where I could then ship the file to AWS)? psql and pg_dump don't seem to be available on dynos, and I don't know how to run pgbackups from within a dyno.
The best approach I've come up with so far is to use the pgbackups add-on (which I was already using), then pull the latest backup from S3 daily and re-upload it to my own bucket. With this add-on enabled, Heroku exposes a PGBACKUPS_URL environment variable. The rest goes roughly like this:
# boto and requests are required; AWS access credentials are in the settings file
import requests
import settings
from boto.s3.connection import S3Connection

url = settings.PGBACKUPS_URL + "/latest_backup"
dumpfile = "./db_dump"

# get latest backup info
r = requests.get(url)
dump_url = r.json()["public_url"]
dump_timestamp = r.json()["finished_at"].replace("/", "-")
dump_name = "db_dumps/" + dump_timestamp

# stream the dump to a local file
r = requests.get(dump_url, stream=True)
if r.status_code == 200:
    with open(dumpfile, 'wb') as f:
        for chunk in r.iter_content():
            f.write(chunk)

# re-upload the dump to our own S3 bucket
conn = S3Connection(settings.AWS_ACCESS_KEY_ID, settings.AWS_SECRET_ACCESS_KEY)
bucket = conn.get_bucket(settings.AWS_DB_DUMPS_BUCKET)
key = bucket.new_key(dump_name)
key.set_contents_from_filename(dumpfile)
I'm not yet sure whether a backup can also be triggered via PGBACKUPS_URL.
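For the Glacier part of the question: once the dumps land in your own bucket, you don't need any code on the dyno at all, since an S3 lifecycle rule can transition them automatically. A minimal sketch, assuming boto3 rather than the older boto used above, and assuming the "db_dumps/" prefix and a 30-day threshold (both are just illustrative choices):

```python
# Sketch: S3 lifecycle rule that moves objects under db_dumps/ to Glacier
# after 30 days. Prefix, rule ID, and day count are assumptions.
GLACIER_AFTER_DAYS = 30

lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-db-dumps",
            "Filter": {"Prefix": "db_dumps/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": GLACIER_AFTER_DAYS, "StorageClass": "GLACIER"}
            ],
        }
    ]
}

def apply_lifecycle(bucket_name):
    # Requires boto3 and AWS credentials in the environment;
    # applies the rule to the given bucket.
    import boto3
    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket_name,
        LifecycleConfiguration=lifecycle_config,
    )
```

This only needs to be run once per bucket; after that, S3 handles the archival on its own schedule.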