===== Glacier Tools =====
* [[https://github.com/numblr/glaciertools|shell script]] that uploads archives whole or in parts
==== Glacier Cheat Sheet ====
//not recommended. Use S3 with a Glacier storage class instead (see [[#s3_for_glacier|S3 for Glacier]] below): it's easier, cheaper, and has more functionality//
* list vaults: ''aws glacier list-vaults --account-id 123456789012''
* upload archive to existing vault (less than 5G): ''aws glacier upload-archive --vault-name awsexamplevault --account-id 123456789012 --body archive.zip''
* [[https://docs.aws.amazon.com/cli/latest/userguide/cli-services-glacier.html#cli-services-glacier-initiate|upload archives > 5G to existing vault]]
* [[https://docs.aws.amazon.com/amazonglacier/latest/dev/downloading-an-archive-using-cli.html|download archive]]
* [[https://jmanteau.fr/posts/delete-a-vault-in-aws-glacier/|delete contents in vault]]
* [[https://gist.github.com/veuncent/ac21ae8131f24d3971a621fac0d95be5|similar information with comments]]
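The list/upload commands above can be wrapped in small shell helpers. This is just a sketch: ''list_vaults'' and ''upload_small_archive'' are made-up helper names, and the account ID and vault name are the example placeholders from the commands above.

```shell
# Sketch: helpers around the Glacier CLI commands above.
# Function names are illustrative; substitute your own account ID / vault.

list_vaults() {
  local account_id="$1"
  aws glacier list-vaults --account-id "$account_id"
}

# Only for archives under 5 GB -- larger archives need multipart upload.
upload_small_archive() {
  local account_id="$1" vault="$2" file="$3"
  aws glacier upload-archive \
    --vault-name "$vault" \
    --account-id "$account_id" \
    --body "$file"
}

# Example (not run here):
#   list_vaults 123456789012
#   upload_small_archive 123456789012 awsexamplevault archive.zip
```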
==== S3 ====
=== making a bucket publicly accessible ===
// [[https://saturncloud.io/blog/how-to-make-an-s3-bucket-public/|source]]//
* Navigate to ''Permissions'' --> ''Bucket Policy'' in bucket
  * add the following statement to the policy (replace ''DOC-EXAMPLE-BUCKET'' with your bucket name)
<code json>
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": [
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::DOC-EXAMPLE-BUCKET/*"
      ]
    }
  ]
}
</code>
* enable static website hosting in the ''Properties'' tab
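The same policy can be applied from the CLI instead of the console, using ''aws s3api put-bucket-policy''. A minimal sketch, assuming the JSON statement above is saved as a local file (''make_bucket_public'' and the file/bucket names are placeholders):

```shell
# Sketch: apply a public-read bucket policy from the CLI.
# Assumes the policy JSON shown above is saved to a local file.

make_bucket_public() {
  local bucket="$1" policy_file="$2"
  aws s3api put-bucket-policy --bucket "$bucket" --policy "file://$policy_file"
}

# Example (not run here):
#   make_bucket_public my-bucket policy.json
```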
==== S3 for Glacier ====
* upload file to the S3 Deep Archive storage class (Glacier): ''aws s3 cp foo.txt s3://bucketname/foo.txt --storage-class DEEP_ARCHIVE''
* upload folder to the S3 Deep Archive storage class (Glacier): ''aws s3 cp folder s3://bucketname/foldername --storage-class DEEP_ARCHIVE --recursive''
  * this creates the folder and uploads the files into it
* to download, first request a restore: ''aws s3api restore-object --bucket DOC-EXAMPLE-BUCKET --key sample.txt --restore-request '{"Days":5,"GlacierJobParameters":{"Tier":"Standard"}}' ''
* then check status: ''aws s3api head-object --bucket DOC-EXAMPLE-BUCKET --key dir1/example.obj''
* after the file is restored, copy it to the local machine: ''aws s3 cp s3://mybucket/test.txt test2.txt''
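The three restore steps above can be sketched as shell helpers. The function names are made up, and the bucket/key values are the placeholders from the commands above:

```shell
# Sketch of the restore-then-download flow for DEEP_ARCHIVE objects.
# Function names are illustrative; bucket/key are placeholders.

# Step 1: request a temporary restore (default: 5 days, Standard tier).
request_restore() {
  local bucket="$1" key="$2" days="${3:-5}"
  aws s3api restore-object --bucket "$bucket" --key "$key" \
    --restore-request "{\"Days\":$days,\"GlacierJobParameters\":{\"Tier\":\"Standard\"}}"
}

# Step 2: poll until the Restore header reports ongoing-request="false".
restore_status() {
  local bucket="$1" key="$2"
  aws s3api head-object --bucket "$bucket" --key "$key"
}

# Step 3: once restored, download to the local machine.
download_restored() {
  local bucket="$1" key="$2" dest="$3"
  aws s3 cp "s3://$bucket/$key" "$dest"
}

# Example (not run here):
#   request_restore DOC-EXAMPLE-BUCKET sample.txt
#   restore_status DOC-EXAMPLE-BUCKET sample.txt
#   download_restored DOC-EXAMPLE-BUCKET sample.txt ./sample.txt
```

Note that a Standard-tier restore from Deep Archive typically takes hours, so step 2 is a repeated manual check rather than something to run immediately after step 1.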