Glacier Tools
- shell script that uploads an archive whole or in parts (multipart upload)
Glacier Cheat Sheet
Not recommended: use S3 with a Glacier storage class instead, as it's easier, cheaper, and has more functionality (see "S3 for Glacier" below)
- list vaults:
aws glacier list-vaults --account-id 123456789012
- upload an archive to an existing vault (less than 5 GB):
aws glacier upload-archive --vault-name awsexamplevault --account-id 123456789012 --body archive.zip
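Recent versions of the CLI compute the required checksum for upload-archive automatically, but the multipart commands (initiate-multipart-upload, upload-multipart-part, complete-multipart-upload) expect a SHA-256 tree hash over 1 MiB chunks. A minimal Python sketch of that algorithm (the function name `tree_hash` is mine, not an AWS API):

```python
import hashlib

CHUNK = 1024 * 1024  # Glacier tree hashes are built from 1 MiB leaf chunks


def tree_hash(data: bytes) -> str:
    """SHA-256 tree hash as expected by the Glacier upload APIs."""
    # Leaf level: SHA-256 of each 1 MiB chunk (empty input hashes as one empty chunk)
    hashes = [hashlib.sha256(data[i:i + CHUNK]).digest()
              for i in range(0, len(data), CHUNK)] or [hashlib.sha256(b"").digest()]
    # Combine adjacent pairs until a single root hash remains;
    # an odd hash at the end of a level is carried up unchanged
    while len(hashes) > 1:
        paired = [hashlib.sha256(hashes[i] + hashes[i + 1]).digest()
                  for i in range(0, len(hashes) - 1, 2)]
        if len(hashes) % 2:
            paired.append(hashes[-1])
        hashes = paired
    return hashes[0].hex()


print(tree_hash(b"hello"))
```

For data under 1 MiB the tree hash is just the plain SHA-256 of the data, which makes the function easy to sanity-check.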
S3
making a bucket publicly accessible
- Navigate to the Permissions tab
- If "Block all public access" under "Block public access (bucket settings)" is on, edit and switch it to off
- While still in the Permissions tab, edit the Bucket Policy
- add the following statement to the policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": [
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::<PUT_BUCKET_NAME_HERE>/*"
      ]
    }
  ]
}
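Instead of hand-editing the bucket-name placeholder, the policy can be rendered programmatically. A small Python sketch (the helper name `public_read_policy` and the bucket name `my-bucket` are mine):

```python
import json


def public_read_policy(bucket: str) -> str:
    """Render the public-read bucket policy above for a given bucket name."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject"],
            # Resource ARN covers every object in the bucket
            "Resource": [f"arn:aws:s3:::{bucket}/*"],
        }],
    }
    return json.dumps(policy, indent=2)


print(public_read_policy("my-bucket"))  # "my-bucket" is a placeholder
```

The output can be pasted into the Bucket Policy editor or passed to `aws s3api put-bucket-policy --policy`.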
- enable static website hosting in the Properties tab ← is this required?
S3 for Glacier
- upload file to S3 deep storage class (Glacier):
aws s3 cp foo.txt s3://bucketname/foo.txt --storage-class DEEP_ARCHIVE
- upload folder to S3 deep storage class (Glacier):
aws s3 cp folder s3://bucketname/foldername --storage-class DEEP_ARCHIVE --recursive
- this creates foldername in the bucket and uploads the folder's files into it
- to download, first restore the object:
aws s3api restore-object --bucket bucketname --key sample.txt --restore-request '{"Days":5,"GlacierJobParameters":{"Tier":"Standard"}}'
- then check the restore status:
aws s3api head-object --bucket bucketname --key sample.txt
- after the file is restored, copy it to the local machine:
aws s3 cp s3://bucketname/sample.txt sample.txt
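head-object does not wait for the restore: its output contains a Restore field that reports progress. The field is absent before a restore is requested, shows ongoing-request="true" while in progress, and ongoing-request="false" plus an expiry-date once the temporary copy is ready. A minimal Python sketch of checking that field (the name `restore_done` is mine; the sample strings follow the documented format, no live AWS call is made):

```python
def restore_done(restore_field: "str | None") -> bool:
    """True once head-object's Restore field reports the restore finished.

    None means no restore was requested; ongoing-request="true" means it
    is still in progress; ongoing-request="false" means the temporary
    copy is ready to download.
    """
    return restore_field is not None and 'ongoing-request="false"' in restore_field


# Sample values in the format head-object returns (not live AWS output):
print(restore_done(None))                     # no restore requested
print(restore_done('ongoing-request="true"'))  # restore still running
print(restore_done('ongoing-request="false", expiry-date="Sat, 01 Jan 2028 00:00:00 GMT"'))
```

Polling head-object until this returns true, then running the `aws s3 cp` above, completes the retrieval.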