
@psamaan
Created December 9, 2014 22:03
This PHP script recursively copies all files from a given FTP location into an AWS Glacier vault, each file becoming one Glacier archive. Follow these instructions first for installing dependencies: http://aws.amazon.com/developers/getting-started/php/
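In practice the dependency step comes down to pulling in the AWS SDK with Composer; a minimal sketch, assuming Composer is already installed (the version constraint below targets SDK version 2, which provides the `Aws\Glacier\Model\MultipartUpload\UploadBuilder` class this script uses):

```shell
# Install version 2 of the AWS SDK for PHP into ./vendor;
# the script's `require 'vendor/autoload.php'` then picks it up.
composer require "aws/aws-sdk-php:2.*"
```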
<?php
// Include the SDK using the Composer autoloader
require 'vendor/autoload.php';

use Aws\Glacier\GlacierClient;
use Aws\Glacier\Model\MultipartUpload\UploadBuilder;

$client = GlacierClient::factory(array(
    'key'    => 'key_here',
    'secret' => 'secret_here',
    'region' => 'region_here', // (e.g., us-west-2)
));

$vaultName     = 'glacier_archive_here';
$ftp_server    = 'destination_server_here';
$ftp_user_name = 'ftp_username_here';
$ftp_user_pass = 'ftp_password_here';

function callbackDir($ftppath, $filename)
{
    global $client, $vaultName;
    $localFile = $filename;
    echo "copy started for $filename\n";

    // Fetch the file from the FTP server; skip the upload if the copy fails
    if (copy($ftppath . $filename, $localFile)) {
        echo "copied $filename to localdisk\n";
    } else {
        echo "could not copy $filename - skipping\n";
        return;
    }

    // Upload the file to Glacier in 4 MB parts, 3 parts in parallel
    $uploader = UploadBuilder::newInstance()
        ->setClient($client)
        ->setSource($localFile)
        ->setVaultName($vaultName)
        ->setPartSize(4 * 1024 * 1024)
        ->setConcurrency(3)
        ->build();

    echo "uploading $localFile\n";
    try {
        $result = $uploader->upload();
        $archiveId = $result->get('archiveId');
        echo "$localFile uploaded to archive. Archive ID is $archiveId\n";
    } catch (\Aws\Common\Exception\MultipartUploadException $e) {
        // If the upload fails, get the state of the upload and retry once
        echo "error while uploading to Glacier - ";
        $state = $e->getState();
        try {
            echo "retrying upload - ";
            $resumedUploader = UploadBuilder::newInstance()
                ->setClient($client)
                ->setSource($localFile)
                ->setVaultName($vaultName)
                ->resumeFrom($state)
                ->build();
            $resumeResult = $resumedUploader->upload();
            $resumeId = $resumeResult->get('archiveId');
            echo "$localFile uploaded after successful resume. Archive ID is $resumeId\n";
        } catch (\Aws\Common\Exception\MultipartUploadException $e) {
            echo "retry failed. $localFile not uploaded\n";
        }
    }

    // Delete the local copy
    echo "deleting $localFile from localdisk\n";
    unlink($localFile);
}

function walkDir($dir, $fx)
{
    $dir = $dir . '/';
    if (($dh = opendir($dir))) {
        while (($file = readdir($dh)) !== false) {
            if ($file == '.' || $file == '..') {
                continue;
            }
            if (is_dir("$dir$file")) {
                walkDir("$dir$file", $fx);
            } else {
                $fx($dir, $file);
            }
        }
        closedir($dh);
    }
}

walkDir("ftp://{$ftp_user_name}:{$ftp_user_pass}@{$ftp_server}", 'callbackDir');
@psamaan

psamaan commented Nov 9, 2015

The filename must be added to the archive metadata, which this script doesn't do yet. This is crucial because otherwise you won't know which archive in Glacier used to have which name on the FTP server.
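A sketch of what that fix could look like: Glacier stores an optional archive description (`x-amz-archive-description`) with each archive, and, assuming the SDK 2 Glacier `UploadBuilder` exposes a `setArchiveDescription()` setter for it, the FTP path could be recorded when the uploader is built:

```php
// Hypothetical variant of the builder call in callbackDir():
// record the original FTP path in the archive description so the
// vault inventory later shows which archive came from which file.
$uploader = UploadBuilder::newInstance()
    ->setClient($client)
    ->setSource($localFile)
    ->setVaultName($vaultName)
    ->setArchiveDescription($ftppath . $filename) // method name assumed from SDK 2 docs
    ->setPartSize(4 * 1024 * 1024)
    ->setConcurrency(3)
    ->build();
```

The description then comes back in the output of a vault inventory retrieval job alongside each archive ID.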
