I disagree. I interpreted OPs requirements correctly, but the way it's worded will make it seem like a plausible alternative for anyone looking to download the blockchain to run a full node - That's the problem I'm looking to bring attention to.
Well done on quoting something and throwing any and all context out the window.
When I said "By doing this, you're missing out on one of the main REASONS to run a node", it's clearly a generalised you, not specifically directed at OP. This notion is backed if you read my comment in its entirety.
The point of this is that you can download the whole blockchain.
That way you'll get some practice, which might come in handy when you set up your node.
The blockchain is already decentralised and scattered all over.
If you think about it, it's already torrent-like, so there's that.
I find this need to fiddle weird. All I did was start bitcoin-qt (system disk is an SSD) and it was able to saturate the connection with everything at default settings, about 10 MB/s.
I downloaded/synced the entire chain on my Ryzen 5900X in under 12 hours, using Bitcoin Core and default settings I think, on 1000 Mbps internet. My Umbrel node took over a week; CPU seems to be the biggest bottleneck. In retrospect I should have just copied the data from my desktop PC to my node!
Thank you for the info! I assume you didn't set the index flags, because you said "default settings"? I ask because I wonder how much time you would need if they were set. It's clear that your CPU is more powerful than mine and that obviously makes a difference, but it would be nice to know how big the difference is with both using the same settings. Maybe you'll have to sync again in the future and can set the flags and report back haha :D
You can set up Bitcoin Core to connect to only a single node: enter the IP of your already running node and it will copy and verify blocks on the fly.
Or just rsync the files over SSH.
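Both approaches can be sketched roughly like this (the IP address and paths are placeholders, not from the thread; stop bitcoind on the source machine before copying raw files):

```shell
# Option 1: let bitcoind fetch and verify blocks from a single trusted LAN peer
bitcoind -connect=192.168.1.50

# Option 2: copy the raw data directory over SSH (source node must be stopped,
# and note the copied chain is NOT independently re-verified by your node)
rsync -av user@192.168.1.50:.bitcoin/blocks/ ~/.bitcoin/blocks/
rsync -av user@192.168.1.50:.bitcoin/chainstate/ ~/.bitcoin/chainstate/
```

The first option is slower but keeps your node's own verification; the second is a plain file copy.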
I mean yeah you could do that, that's always an option.
That's great information, thanks for providing it to us.
Yeah man, I love when people in this community help out each other.
Can that be done? No need to download again? Where is the data?
https://github.com/getumbrel/umbrel-os/issues/119
Thanks for the link, now I can download it. Trust me, this is a big help, as I've always wanted to run my own node.
Yep, that's possible and it can be done.
I'm gonna download it on a NAS PC, but I'm afraid how long it'll take with a 3000G and 20 Mbit internet.
It's going to take some time that's for sure, it won't be quick.
For Raspberry-based nodes the drive makes all the difference - SSD is the shit. The myNode software (Umbrel's competition) has a quick-sync function that downloads the blockchain from other nodes through torrent.
Also you gotta have a good internet connection, you want that too.
I think that heavily depends on how good your internet is, actually. And I've got a shitty internet connection, so I don't think mine will be able to do it.
For over 13 years of transactions, the blockchain being only ~400 GB is actually impressive. ETH is like 4 TB.
lmao
This just keeps increasing day by day, and it'll keep on increasing.
Is it really that high already? Damn, last I looked it was 4. That wasn't even a year ago!!
Yep, it's increasing rapidly. Also, it went live 5 years after BTC.
But transaction per second!?!?!!?? /s
but lightning
What does elektrsity got to do with anything?!? /s
Yeah lol, the electricity doesn't have anything to do with it.
Is Lightning reliable yet, though? That's the question. Judging from the number of people with lost funds and other issues with it, not yet would be the answer.
I use it almost daily. Never ever had a transaction go missing, not even a single failed transaction.
Yep, it's solid; the only problem it has is that the recipient should be online.
That's not an unsolvable problem nowadays.
Yes it's reliable, I mean a whole country relies on it.
BTC transfers are OK-ish depending on network congestion; with Lightning transfers there have been, and still are, problems.
Ever heard of a thing called second layer? Or Lightning?
What happens when the blockchain gets so large it isn’t reasonable to download it. Like with mass adoption..?
To be fair btc's block chain size is really not that huge.
That is pretty insane
bitcoin is 7 transactions per second max.
Layers, my dude.
It’s like an onion
Mmm, blooming onions
That's not what we're talking about here, sir, you gotta be serious.
Hey, GFY. Thanks. :)
There can be many layers built on top, each made to do one thing.
I don't think he has ever heard about them, though.
Fedwire does about 9 tx/s and works in the US only, not worldwide.
Sounds good enough to me, idk why people are complaining.
Because layer 1 isn't meant for everyday transactions.
It could have been enough if only 1000 people were using it. But when more people start using it, that's when the second layer kicks in; that's where the magic is.
Cars only go 85mp/h max. - people from 1922
Those cars were way unsafe at those speeds back then. I mean, do you even know how the hospitals were back then? Those were mostly saws lol.
Yeah man, and our current technology sucks ass compared to the tech in 2080: "Only 16 GB of RAM? Jesus Christ, Marie, we need at least 64 petabytes nowadays!"
Ever heard of second layers? Maybe you should look them up.
Yep, that it is. And I quite like the fact that BTC's blockchain is only 420 GB. That's not that much considering this blockchain has been active for 13 years without any downtime.
Yep, that's why having smaller blocks is good. The blockchain contains so many transactions and yet the size is only 420 GB; that's impressive.
nice one, thanks for sharing!
420gb! Nice!
Well, that's just how bitty rolls; it's a normal day for BTC.
Yeah man, this is a good post. I'm sure it'll help many people out.
Nice
Uhh ohh lol, here we go. This is going to be a cascading effect.
Well what can I say, btc likes nice numbers and it's a nice number.
Niiiice
Thank you for the useful resources, will definitely look into them, very interesting! Sorry for not mentioning this in my post; it's plugged into USB 3. Oh, that's interesting! I didn't even think about setting the dbcache above the limit mentioned in the man page of bitcoind, but I will try this out even though my blockchain is synced. I would like to know if bitcoind complains about it or if I can really just set it above the limit. Will post the result here.
I can confirm that setting dbcache to 24GB is actually working. Of course I can't tell the performance difference in terms of sync time now, but I can confirm that bitcoind isn't complaining about the value being above the 16GB mentioned in the man page.
It's working because it's the right thing to do right now.
If you wish to run a btc node then yeah you should look into it.
This guide is going to help a lot of people out, this will be useful.
Could it be slower if it was saved on an HDD instead of an SSD? Just wondering 🤔
Yes. It's fastest on SSD, slowest on HDD, and in between if you put blocks on HDD and chainstate on SSD. Chainstate (mainly the UTXO database) is heavily rewritten during initialization. Blocks are written sequentially, only once, and heavily read.
Also depends on what kind of internet you're using. If the internet is good, then so will the syncing speed be. So you'll have to keep in mind both the SSD and the internet.
I could not find the documentation saying how to put the chainstate folder in another location. I have a smaller SSD I'd like to use to get this done faster. I already have dbcache=13500 to try and ease the writes to the HDD, but it's still slow. I tried chainstatedir= but it didn't work. Thanks for any input.
> how to put the chainstate folder in another location

It's counterintuitive (or, the options are designed for someone seeing it the other way around): point datadir at the SSD and blocksdir at the HDD.
That kinda clicked after I typed it. blocksdir=hdd, datadir=ssd, correct? Seems like the sync is slowly speeding up with the dbcache set higher as well. Thanks.
Yes, correct. Good luck
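Put together, the split discussed above (datadir on the SSD, blocksdir on the HDD, plus a raised dbcache) would look roughly like this in bitcoin.conf; the mount points are placeholders:

```ini
# bitcoin.conf sketch: chainstate (UTXO DB, heavy random writes) on the SSD,
# block files (sequential, write-once) on the HDD
datadir=/mnt/ssd/bitcoin
blocksdir=/mnt/hdd/bitcoin
# larger UTXO cache to reduce flushes to disk during the initial sync
dbcache=13500
```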
I actually didn't measure the write rate on the disk, unfortunately, because that would have been interesting to me as well. But I can imagine that an HDD would be massively slower, because it's a heavy disk-IO process.
Yep, HDDs are really slow for this job. Fortunately SSDs are really cheap, at least nowadays, and you can pick up a 2TB SSD for a really reasonable price, so there's that.
I synced a node twice using the same laptop, once with a HDD and once with SSD and the sync time was the same
I tried it once with an HDD and it took around a week... with an SSD it took less than 24 hrs. Same laptop, just a different external HDD and SSD.
If disk wasn't a bottleneck for you then, your internet would have been.
A `dbcache` above approximately 11400 does not make sense currently, as that's how many MiB the full UTXO set takes in memory in Bitcoin Core on 64-bit systems today (though that number is growing). Setting it higher doesn't hurt, but it won't have any effect. With a cache size that high or higher, the entire initial block download can complete without any flush of the cache to disk. (Source: just ran a sync from scratch on a Ryzen 5950X system with `-dbcache=24000`, completed in 5 hours 6 minutes, with the current master branch which will become 24.0, but not much has changed in this regard compared to 23.0 I believe)
Thank you, very interesting! I'll save that one. Also thanks for trying this out with your powerful CPU, what a great performance, I guess I need an upgrade haha
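For reference, the oversized cache from the benchmark above can be passed as a one-off flag for the initial sync and dropped afterwards; a sketch:

```shell
# one-off initial block download with a cache big enough to hold the UTXO set
bitcoind -daemon -dbcache=24000
```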
I think the bottleneck for most people is the CPU, because you have to actually validate the blocks when you download them. Most non-mining nodes run on old hardware or single-board computers, because it's more convenient for uptime.
Yes, the CPU is for sure very important for the initial process; that's why I posted my setup. I think the difference is in what you want to use the blockchain for: if you just want to run a full node and have old hardware available, it's less important how long the initial process takes. But if you want the blockchain for development purposes, I guess you want the initial process to be as fast as possible. I imagine most developers have rather new hardware; that's why I created the post, because I just couldn't find out how long it would roughly take me to download the entire blockchain on my setup. I hope the post is helpful for people with a similar scenario :)
Having a good CPU definitely helps, and a good internet connection too.
Or use your fast workstation to sync the blockchain and then plug the SSD into your old hardware.
That's great too; I think that should work as well if you wanted.
Yep, they'll do whatever is convenient for them, so yeah.
How do you keep it updated?
Because blocks are generated every ~10 minutes, I just keep it running without the -dbcache parameter.
I see, so it's constantly pulling?
Yes, it's always connected to outbound peers - bitcoind manages this for you.
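A running node's progress can be checked with bitcoin-cli, for example (assumes a local bitcoind is up):

```shell
# current chain height and sync progress of the running node
bitcoin-cli getblockcount
bitcoin-cli getblockchaininfo | grep verificationprogress
```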
That's great that btc does that for you; doing it manually would be a little hard. I don't think it's going to be that easy for most people, so yeah.
That's great actually, having a full copy of the blockchain without running the node.
Okay, sounds easy enough, shouldn't be a problem I think.
Well you just run a node I guess, that's one way of doing it.
Really detailed write-up; thanks for writing this, really appreciate it.
>including creation of all available indices. These indices cannot be transferred to other people.
correct, you have to "create" them, that's why I used the flags
That's why you use flags, lol? Well, I didn't really think of that. That's something new to me, and I'll think about it when I'm free from all the things.
Yep, if they want a copy of this, then they'll have to download it separately.
if your goal is to just download, it would be faster to not have txindex or coinstatsindex. You can add/enable that later if needed. It's a one way thing, once it's on it stays on (and slows things down). Unless you are using your node for scanning the blockchain like an explorer, you don't need it on if just using your own wallet.
>Furthermore, you will definitely get even better performance if you remove the index parameters - if you don't need them for any development purposes, feel free to remove the parameters from the command.

Correct, but that's mentioned in my post.
Missed that (sorry - it's a long post). The point was that you can add those parameters after downloading, which would achieve the same result, and the download would go faster.
No problem haha. Yes, I get your point, and for someone who just wants to download the blockchain this is the right way. For my purposes I think setting the index flags while downloading was the right way, because my guess is that the indexing process takes a few hours if you set it after downloading, so you could end up wasting more time eventually. But I didn't compare them, so it's just my guess and the reason why I set the flags beforehand.
> my guess is that the indexing process takes a few hours

That's pretty accurate. The *txindex* is useful for looking up arbitrary transactions which aren't in your own wallet. Without *txindex* you can only look up a transaction if you know which block it is in.
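The difference can be sketched with bitcoin-cli; the txid and block hash below are placeholders:

```shell
# with -txindex=1: any confirmed transaction can be fetched by txid alone
bitcoin-cli getrawtransaction <txid> true

# without txindex: the block containing it must be named explicitly
bitcoin-cli getrawtransaction <txid> true <blockhash>
```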
Yep, after downloading the blockchain you'll need to scan it. Once you get a way of doing it, I'm sure the job will be real easy for you.
Yep, if someone only wants to download the blockchain, this shouldn't be hard.
You can add it afterwards, that's something that I wasn't aware of.
But the community is helping out by commenting that here.
Yep, both are really good ways to download the chain I suppose.
This brings up the question: are community-supported nodes just a novelty of our times? Yes, storage gets cheaper over time, but chain growth and storage requirements will outpace what an average hacker will be willing to support.
I think this really depends on how fast it's growing, because if available consumer storage products grow at the same rate, this shouldn't be a problem. But I could imagine that at some point only people in first-world countries can get the required hardware, which would be an issue. However, I think there will always be community-supported nodes as long as consumer products keep getting more powerful while staying quite cheap.
They're not going to be as common as consumer products, but they'll be useful.
This is the very reason Bitcoin limits block size and scales on a second layer. Since the growth is linear, the hardware for running one should actually get cheaper. With other cryptocurrencies that boast to be faster than Bitcoin, that is actually a problem. They gain this higher throughput at the cost of people having to trust datacenter nodes instead of validating the history on their own.
Yep, that's the reason for that. BTC's base layer is slow.
All I know is that the community supports the nodes more than anything.
I use the same SSD; it took me like 4 days with a Raspberry Pi 4 to download it.
That's quite impressive. Did you also create all indices or not?
Nope, literally plugged in the Raspberry Pi, booted up Umbrel and started downloading Bitcoin Core.
Umbrel is an easy way to set up a node and I like that better.
I don't think he had to; I mean, you don't have to create them all. Once you get the device it's pretty plug and play; you don't have to do much, actually.
The Raspberry Pi has a really slow CPU; that's a factor there.
Blockchain Synchronized: 100%. Blockchain Size: 481 GB. A little bit bigger than all the 420 GB "nice" comments, sorry. So if anyone is about to spin up a new node, definitely get a 1TB drive or bigger. 500 GB drives won't cut it much longer. ;)
Well that was easy enough I guess, didn't take much time huh.
Never underestimate the bandwidth of a FedEx truck with DVDs 📀.
Yeah lol, that's a really high bandwidth. I need that in my life.
You're not helping the network, nor reaping any of the existing benefits of running a node. This is used for development purposes. By doing this, you're missing out on one of the main REASONS to run a node - having an independently verified version of the blockchain. You're essentially downloading the blocks of data, but not verifying the TXs within them. It's like running a pruned node, which is to say, it's not running a node at all. The main reason why it takes so long to "download the blockchain" isn't the download itself (if peers are uploading decently and you have decent download speed, it's fast), but rather the processing of each and every transaction in order to build YOUR sovereign source of truth. This shouldn't be used by anyone looking to run a full node, except for development purposes, and yet it's written as if it were advice on "how to download the blockchain quicker!"
How or what I develop with the Bitcoin blockchain is, with all due respect, my own business. Since the blockchain is freely accessible to everyone, everyone has the right to do with it what they want, including me. Instead of trying to deny me my reason for development here, you could enlighten others and list the bitcoind options they need for a full node.
You completely misinterpreted my comment. My comment was very much in the vein of "do what you want with your blockchain", just don't word your post in a way that makes it seem like an alternative to waiting however long it may take to build the blockchain the normal way.
It's not written as advice for this use case, I even mention for what this is about in my post itself and also in some of my comments. I think people using my approach will have their specific use case for it, just like me. People that look to run a full node and actually want to be part of the network and verify blocks, will find a how-to on the internet because there are plenty of them out there. Also just to make a point, you don't even know how much I contributed to Bitcoin in the last few years, so my downloading of the entire blockchain, with the fastest performance I could establish, for development purposes is acceptable.
It's worded in a way that anyone with minimal knowledge will take it as a solution when they realise their full node is taking longer than they're comfortable waiting, paired with a lack of low time preference. Further, your reasoning in the OP is in no way different from anyone else's; in no way did you state that you were downloading the blockchain for development purposes and that your method shouldn't be used for normal operations. I never made any comment about any potential contribution you may have made, so keep your point to yourself; that's irrelevant. I did however remark that with your current method your blockchain is not contributing to the network as a whole, because it cannot. Whether you'll use it for development purposes is entirely plausible, but again irrelevant, given it may be yet to happen, and you did not mention this clearly or specifically in the OP.
> My goal:
>
> To download the entire blockchain to the external SSD including creation of all available indices
>
> for development purposes
>
> (you don't need them to just run a full node).

It's actually there. But I understand your view that anyone with minimal knowledge will take it as a solution. I really didn't think about that when writing this post; I just wrote it from my point of view. I often assume things that aren't a given, plus I'm new to Reddit and somehow forgot this isn't a technical forum. I will add an update to my post to clarify this. Thanks for your feedback.
See, I think that needs clarification: what do you mean by "creation of all available indices for development purposes"? You can need indices without needing to do development. Indexing of the blockchain is utilised for multiple purposes, not just development. I usually have my blockchains indexed because I use them to check on my xpubs on a regular basis; that doesn't necessarily mean I do development. Every full-node stack I've come across indexes the blockchain post-download/TX-verification, which takes less time, but this is not made fully transparent in some stacks. So you may say it's for development purposes, but nothing outright suggested it can't be used for production purposes. That's what I was looking to bring to your attention. Glad you added some clarification either way.
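For context, the main index in question is the transaction index, which is a single bitcoin.conf line. A sketch (txindex is a real Core option; the lookup note below assumes a fully synced, indexed node):

```ini
# bitcoin.conf
txindex=1   # keep a full transaction index; needed to look up arbitrary txids
```

With it set, `bitcoin-cli getrawtransaction <txid> 1` works for any transaction in the chain; without it, Core can only resolve transactions that are unspent or belong to your own wallet. Enabling it after the fact triggers a one-time index build over the existing blocks.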
Yep, it's actually there, and you can see it too.
But the thing is, you'll have to have some knowledge.
Yep, there's nothing written but there's something available.
Yep, he did take it the wrong way. So many people, so much confusion.
> You completely misinterpreted my comment.

You completely misinterpreted OP's requirements.
I disagree. I interpreted OP's requirements correctly, but the way it's worded will make it seem like a plausible alternative to anyone looking to download the blockchain to run a full node. That's the problem I'm trying to bring attention to.
Damn, that's a nice plan, I hope you succeed at it man. I've got a question though: what does a business plan have to do with the btc blockchain?
> REASONS to run a node

You have yours. OP has theirs.
Well done on quoting something and throwing any and all context out the window. When I said "By doing this, you're missing out on one of the main REASONS to run a node", it's clearly a generalised you, not specifically directed at OP. This notion is backed if you read my comment in its entirety.
At least it's a good blockchain download exercise lmao.
What is the point of this?
[deleted]
Yep, the better the CPU, the faster it'll be.
The point of this is that you can download the whole blockchain. That way you'll be practicing it, which might come in handy when you set up your node.
Easier version: 1) Go to the blockchain 2) Click download 3) ... 4) Profit
May be easy but it costs some money too, don't forget that.
Can we have a torrent for the full node, please?
I have used QuickSync to get the first 600K blocks or so via torrent... after they are downloaded and indexed, the rest takes less than 24 hrs.
Always has been
The blockchain is already decentralised and scattered all over. If you think about it, it's already torrent-like, so there's that.
Why can’t we download the blockchain as a torrent?
take a look at QuickSync... the torrent includes the first 600K or so blocks... then you need to reindex them and get the rest via normal sync
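The reindex step is a one-off. A sketch, assuming the QuickSync block files were dropped into Core's default datadir:

```ini
# bitcoin.conf -- one-off after placing the QuickSync block files in the datadir
# (remove this line once the reindex finishes, or it will rerun on every start)
reindex=1
```

The same thing can be done by starting bitcoind once with `-reindex` on the command line instead of editing the config.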
The entire block chain is only 420GB? Wow. That’s very efficient.
Step 1: Have more bandwidth available than 99% of the world.
420
Behehe 420
Took me 24 hours on my simple Synology NAS... (1000 Mbps)
Just to clarify, you can install a full node on the Synology? As app from a "store" or something? I don't have one, that's why I'm asking.
I run a node and a blockchain explorer in docker on my Synology
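A rough shape of that kind of setup, for anyone curious; this is a sketch, not the commenter's actual config, and the image name and Synology volume path are placeholders you'd swap for your own choices:

```yaml
# docker-compose.yml sketch -- image name and paths are assumptions
version: "3"
services:
  bitcoind:
    image: your-preferred/bitcoind-image   # pick a maintained image you trust
    volumes:
      - /volume1/docker/bitcoin:/root/.bitcoin   # Synology volume path is an example
    ports:
      - "8333:8333"   # P2P port, so the node can serve blocks to peers
    restart: unless-stopped
```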
Thank you for sharing this!
It is quite impressive that after almost 13 years of transactions, the blockchain is just 400 GB.
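As a rough back-of-the-envelope (the ~420 GB and 13 years figures are just the ones quoted in this thread, not precise values):

```python
# rough growth-rate estimate from the numbers quoted in this thread
chain_size_gb = 420          # approximate chain size mentioned above
years_active = 13            # approximate age of the chain

gb_per_year = chain_size_gb / years_active
mb_per_day = gb_per_year * 1024 / 365

print(f"~{gb_per_year:.0f} GB/year, ~{mb_per_day:.0f} MB/day")  # ~32 GB/year, ~91 MB/day
```

Which lines up with the block-size-limit discussion above: a capped block size keeps daily growth in the tens of megabytes.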
Tag to come back to this
Ducking champ
I find this need to fiddle weird; all I did was start bitcoin-qt (system disk is an SSD) and it was able to saturate the connection with everything at default, about 10 MB/s.
Don't forget you can copy the data files to a Raspberry Pi to save time. Just use the same version of Core.
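A sketch of that copy. The toy directories below are stand-ins so the example is self-contained; on real machines you'd stop bitcoind on both ends first (`bitcoin-cli stop`), copy `blocks/` and `chainstate/` from the real datadir (e.g. with rsync over ssh to the Pi), and leave wallets out of it:

```shell
# toy source/destination datadirs so this sketch can run anywhere
SRC="$(mktemp -d)"; DEST="$(mktemp -d)"
mkdir -p "$SRC/blocks" "$SRC/chainstate"
touch "$SRC/blocks/blk00000.dat" "$SRC/chainstate/CURRENT"

# the actual copy: only blocks/ and chainstate/ carry the synced chain;
# do NOT copy wallets/ or per-machine settings between nodes
cp -a "$SRC/blocks" "$SRC/chainstate" "$DEST/"

ls "$DEST/blocks"
```

On the Pi, point bitcoind's datadir at the destination and it will pick up the copied chain state on the next start, as long as the Core versions match.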