  * [[https://jmanteau.fr/posts/delete-a-vault-in-aws-glacier/|delete contents in vault]] (a CLI sketch of this workflow follows below)
    * [[https://gist.github.com/veuncent/ac21ae8131f24d3971a621fac0d95be5|similar information with comments]]
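A minimal sketch of the workflow the linked post describes, using the AWS CLI; ''myvault'' and the job/archive ids are placeholders, and ''--account-id -'' means the current account:
<code>
# empty a Glacier vault before deleting it -- sketch only, ids are placeholders

# 1. request an inventory of the vault (the job takes several hours)
aws glacier initiate-job --account-id - --vault-name myvault \
    --job-parameters '{"Type": "inventory-retrieval"}'

# 2. once the job completes, download the list of archives it produced
aws glacier get-job-output --account-id - --vault-name myvault \
    --job-id <JOB_ID> inventory.json

# 3. delete every ArchiveId listed in inventory.json, then the empty vault
aws glacier delete-archive --account-id - --vault-name myvault \
    --archive-id <ARCHIVE_ID>
aws glacier delete-vault --account-id - --vault-name myvault
</code>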

==== S3 ====
=== making a bucket publicly accessible ===
//[[https://saturncloud.io/blog/how-to-make-an-s3-bucket-public/|source]]//
  * Navigate to ''Permissions'' --> ''Bucket Policy'' in the bucket
  * add the following statement to the policy
<code>
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::<PUT_BUCKET_NAME_HERE>/*"
            ]
        }
    ]
}
</code>
  * enable static website hosting in the ''Properties'' tab
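The same can be done from the AWS CLI. A minimal sketch, assuming the policy above is saved as ''policy.json''; ''mybucket'' is a placeholder:
<code>
# sketch only -- mybucket and policy.json are placeholders

# new buckets block public policies by default, so lift that block first
aws s3api delete-public-access-block --bucket mybucket

# attach the public-read policy from above
aws s3api put-bucket-policy --bucket mybucket --policy file://policy.json

# enable static website hosting with index.html as the index document
aws s3 website s3://mybucket/ --index-document index.html
</code>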
  
==== S3 for Glacier ====
  * upload file to S3 deep storage class (Glacier): <code>aws s3 cp foo.txt s3://bucketname/foo.txt --storage-class DEEP_ARCHIVE</code>
  * upload folder to S3 deep storage class (Glacier): <code>aws s3 cp folder s3://bucketname/foldername --storage-class DEEP_ARCHIVE --recursive</code>
    * this creates the folder in the bucket and uploads the files into it
  * to download, first restore the object: ''aws s3api restore-object --bucket bucketname --key sample.txt --restore-request '{"Days":5,"GlacierJobParameters":{"Tier":"Standard"}}' ''
  * then check the restore status: ''aws s3api head-object --bucket bucketname --key sample.txt''
  * after the file is restored, copy it to the local machine: <code>aws s3 cp s3://bucketname/sample.txt sample.txt</code>
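Restores from Deep Archive are not instant (the Standard tier typically takes up to 12 hours), so the status check has to be repeated. A minimal polling sketch, assuming the same placeholder ''bucketname''/''sample.txt'' as above:
<code>
#!/usr/bin/env bash
# sketch only -- bucket and key are placeholders
set -euo pipefail

BUCKET=bucketname
KEY=sample.txt

# kick off the restore request
aws s3api restore-object --bucket "$BUCKET" --key "$KEY" \
    --restore-request '{"Days":5,"GlacierJobParameters":{"Tier":"Standard"}}'

# poll head-object until the Restore field reports ongoing-request="false"
until aws s3api head-object --bucket "$BUCKET" --key "$KEY" \
        --query Restore --output text | grep -q 'ongoing-request="false"'; do
    echo "restore still in progress, checking again in 10 minutes"
    sleep 600
done

# the temporary restored copy is readable now; download it
aws s3 cp "s3://$BUCKET/$KEY" "$KEY"
</code>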