I have several TB of borg backups, uploaded to Backblaze B2. I could immediately see how many resources I was using, how many API calls, and so on. It's very easy to see and predict the next bill, and I can see exactly which bucket uses the most resources and which is growing over time.

Because I'm cheap, I want to upload those files to AWS Glacier, which theoretically costs a quarter of what B2 charges for storage, but whose API calls are extremely expensive. So I want to know the details: I wouldn't like to get a bill with $5 in storage and $500 in API calls.

I uploaded a backup, but nowhere in AWS can I see how many resources I'm using, how much I'm going to pay, how many API calls I've made, how much user XYZ spent, and so on.

It looks like it's designed around an approach of "just use our product freely, don't worry about pricing; that's a problem for your company's finance department".

In the AWS console I found "S3 Storage Lens", but it says I need to delegate access to someone else for some reason. I tried to create another user in my one-user org, but after wasting two hours I couldn't find a way to grant those permissions.

I tried to create a dashboard in AWS Cost Explorer, but all the indicators are null or zero.

So, how can I see how many API calls and how much storage I'm using, to predict the final bill? Or is the only way to pray, wait for the end of the month, and hope everything is itemized in detail there?

1 point

UPDATE: after a few days, the bill at https://us-east-1.console.aws.amazon.com/billing/home?region=us-east-1#/bills is populated in much more detail. Now it's much clearer.

With rclone, my test of sending 131 files / 2,500 MB, configured to skip upload chunking and to skip HEAD requests on Glacier, produced:

  • 110 PutObject requests to Glacier Deep Archive
  • 5 InitiateMultipartUpload requests
  • 5 CompleteMultipartUpload requests
  • 5 UploadPart requests
  • 192 PUT, COPY, POST, or LIST requests
  • 111 GET and all other requests

I think I can now safely upload everything, and it shouldn't be too expensive.
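As a sanity check, here is a rough cost estimate for that test, using the request rates quoted elsewhere in this thread ($0.05 per 1,000 PUT-class requests to Deep Archive; the GET-class rate is my assumption, so verify both against the current S3 price list):

```python
# Rough cost of the rclone test above (131 files / ~2,500 MB).
# Rates are assumed list prices and may be outdated.
PUT_CLASS_RATE = 0.05 / 1000    # PUT/COPY/POST/LIST to Deep Archive, per request
GET_CLASS_RATE = 0.0004 / 1000  # GET and all other requests, per request (assumed)

# Sum every PUT-class line item from the bill; if some of these billing
# lines overlap, this overestimates, which is fine for a worst case.
put_class = 110 + 5 + 5 + 5 + 192
get_class = 111

cost = put_class * PUT_CLASS_RATE + get_class * GET_CLASS_RATE
print(f"${cost:.4f}")  # about a cent and a half
```

Even counting everything twice-ish, the test stays under two cents, which matches the "shouldn't be too expensive" conclusion.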

6 points
Deleted by creator
5 points

S3 Glacier definitely bills API calls, and they are relatively expensive.

From this page, which feels designed to be confusing and unreadable on mobile: PUT, COPY, POST, and LIST requests cost $0.05 per 1,000 calls. So if you upload 100k files each week, that's $5 per week. The same applies if the program asks for details on each file to check whether it has been updated.
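The arithmetic behind that figure, with the rate hedged as an assumption from that pricing page:

```python
# Assumed rate: $0.05 per 1,000 PUT/COPY/POST/LIST requests
# (check the current S3 pricing page before relying on it).
rate_per_request = 0.05 / 1000
weekly_uploads = 100_000

weekly_cost = weekly_uploads * rate_per_request
print(f"${weekly_cost:.2f} per week")  # $5.00 per week
```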

This user, for example, missed that part and got an expensive lesson: https://noellh.com/blog/rclone-to-s3-glacier/

18 points

As many others have said, AWS has a pricing calculator that lets you estimate your likely costs.

As a rough calculation in the tool for us-east-2 (Ohio): if you PUT (a paid action) 1,000 objects per month of 1,024 MB each (1 TB), and lifecycle-transition all 1,000 objects each month into Glacier Deep Archive (another paid action), you'll pay around $1.11 USD per month. You pay nothing to transfer the data in from the internet.
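That estimate can be roughly reproduced by hand. The per-request and per-GB rates below are my assumed us-east-2 list prices, not figures from this thread, so treat the result as illustrative:

```python
# Rough re-creation of the calculator estimate (assumed list prices).
objects = 1_000
size_gb = 1.024                 # 1,024 MB per object
put_rate = 0.005 / 1000         # PUT into S3 Standard, per request (assumed)
transition_rate = 0.05 / 1000   # lifecycle transition to Deep Archive, per request (assumed)
storage_rate = 0.00099          # Deep Archive storage, per GB-month (assumed)

monthly = (objects * put_rate
           + objects * transition_rate
           + objects * size_gb * storage_rate)
print(f"~${monthly:.2f}/month")  # ~$1.07/month
```

That lands close to the $1.11 figure; the small gap is plausibly the brief Standard-tier storage before the lifecycle transition kicks in.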

Glacier Deep Archive is what I use for my backups. I have a 2N+C backup strategy, so I only ever intend to restore from these backups if both of my two local copies of my data are unavailable (e.g. a house fire). In that instance, I will pay a price for retrieval and endure a waiting period.

2 points
Deleted by creator
3 points

1,000 × 1,024 MB is.

1 point
Deleted by creator
2 points

It says 1,000 × 1,024.

5 points

The idea of Glacier is for you to keep this data for disaster recovery or very deep archive, where the chances are very low that you will ever use it. Otherwise Glacier doesn't make sense: the price for data retrieval is very high, and recovering your information can take a couple of hours.

My suggestion is to stay away from it. Other than that, you can go to AWS Cost Explorer and see your costs and an estimate of how high your bill might be at the end of the month, based on your current and past consumption.

3 points

But wouldn't that suit OP's use case of storing Borg backups? That's how I use this storage tier: just in case my local copies aren't recoverable.

0 points

Ultimately he needs to make that decision; I'm just saying what Glacier is usually used for.

3 points

Exactly this. It is meant as a last resort, for when all of your other backups have failed. It's literally stored on tape, cold, meaning it's not even powered on. Great for archives and last-resort backups, but I would only use it as backup number 3 or higher.

36 points
Deleted by creator