Dash Platform storage cost per GB?

I'm not going to Discord just to ask this. To be honest, I thought it was a basic enough question that many people would want to know the answer.
 
I'm not sure the math has been worked out for that. @qwizzie asked much the same over in Discord; I don't think he got an answer either.

That's a little disappointing. I had a simple database in mind which I know will go through about 1 GB of data every two months, so I was wondering how viable that would be.

I would also like to know how much queries would cost against that data. For example, if I were to pull 3000 rows per hour.
 
You need to go through the fees and hunt for the storage fees specifically in here: https://github.com/dashpay/platform/pull/1971/commits/c3b4156312ff1b847e28a3411ef2269f59237059

I found this storage fee figure: 73253660 credits per byte.

Maybe you can find other snippets of storage fee information in the above-mentioned pull request.
Paul from Research (and from TUI) seems to be the one to ask about this, as he is the one behind the latest fee changes.

Fixed Dash-to-Credits internal conversion rate:
1 DASH = 100,000,000,000 credits (100 billion credits)
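Taking the 73253660-credits-per-byte figure at face value and combining it with the conversion rate above, a back-of-the-envelope estimate looks like the following. This is a sketch only: it assumes that figure is a flat one-time fee per byte, whereas the actual Platform fee schedule may work differently (per-epoch charges, refunds on deletion, etc.).

```python
# Rough Dash Platform storage cost estimate.
# ASSUMPTION: 73,253,660 credits per byte (from the linked PR) is a flat
# one-time storage fee; the real fee model may differ.

CREDITS_PER_BYTE = 73_253_660        # figure quoted from the pull request
CREDITS_PER_DASH = 100_000_000_000   # fixed internal rate: 1 DASH = 100B credits

def storage_cost_dash(num_bytes: int) -> float:
    """Estimated cost in DASH to store `num_bytes` under the assumption above."""
    return num_bytes * CREDITS_PER_BYTE / CREDITS_PER_DASH

print(storage_cost_dash(1_000_000_000))  # 1 GB -> 732536.6 DASH
print(storage_cost_dash(1_000))          # 1 KB -> about 0.73 DASH
```

If that per-byte figure is right, 1 GB would cost on the order of 700,000 DASH, which supports the point below that Platform is priced for KB-scale metadata rather than bulk data storage.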

See also this: https://chatgpt.com/share/9fda7c6d-214d-4212-b7c9-eb4203b1c611 (I'm not sure how accurate ChatGPT is when calculating the storage fee). I don't think Platform can be used to simply store 1 GB of data each month; it will be KBs at most, I think, as it is intended for storing metadata such as contracts and documents. If I remember correctly, storage fees can be refunded / credited to the document owner / identity once that data is manually deleted.
 
Thanks for that.

I don't get it, though: what's the point of developing such a database if an app can only work with a few KB at a time? Okay, a sidechain to store some data, but all that dev time for that? What's more, during the HPMN proposal Sam had the audacity to lecture me with "surely you want it to be cheap?" I suppose the answer is smart contracts, but right now it just looks like over-engineering to me.
 
For significant storage of information, the requirements for evonodes will increase greatly. Will the recommended volume on evonodes be tens of terabytes?
 
If you want to achieve massive use of the platform, it must offer a lot at a low cost. The alternative is to offer little at a high cost, and we all know how that would end.

I don't know how much interest there will be in storing only minor data. A kind of Reddit on the Dash network would be very encouraging, but that requires huge amounts of storage, terabytes of it, to be available at an affordable cost.

I still do not see this experiment as a primary added value for the network, but if it manages to attract attention and, above all, users, then perhaps over time it will be possible to recover all the money and time put into it, something I doubt today.

Just as I once failed to see the potential of projects like Alphabet, Meta, and others, I hope something similar happens here and Platform achieves the level of use that many have in mind.

Regards and good health.
 
For significant storage of information, the requirements for evonodes will increase greatly. Will the recommended volume on evonodes be tens of terabytes?

I think this is the conundrum Platform finds itself in: way overpriced per byte, "but that's not the use case". So then what was the point of building out queries over a few KB of data? I mean, there are going to be usernames, presumably millions of them, so what exactly is the expense of that compute for?
 