Reddit Posts
Bitcoin Filters Work By Default, and That's a Good Thing | To Filter Spam From Your Bitcoin Core Node, set “permitbaremultisig=0” & “datacarrier=0” in your Bitcoin.conf File | Use "blocksonly=1" to turn off your mempool entirely
On Kraken. Do I need to provide ID in order to send BTC, trade to XMR and withdraw?
450 MiB RAM Bare Minimum (With Custom Settings) bitcoind config?
IMF paper: Assessing Macrofinancial Risks from Crypto Assets - Discussion/Thoughts?
Chromebook or another device good for DEFI?
Which mobile phone is best for multiple crypto wallets?
Utopia Messenger provides 100% security on your communication + ChatGPT assistant.
About Bitcoin Cash - Questions - Relevant feedback is welcome
Newly Developed Crypto Exchange Platform for Sale
Mac OS Compromised with Atomic Hack
Connecting Bitcoin Core to Sparrow Remotely - Unable to make it work. Help!
best hardware wallet ever? also the most secure PC ever? amazing
Another post on how to secure a seed passphrase
Bitcoin Core takes ages to download and verify blockchain
You need to be a multi-millionaire to simply have a chance of being a validator on the Binance Smart Chain network. And it gets worse from there. Seriously, how did we ever accept this? [SERIOUS]
Any tips etc on how to use my day to day PC as a pruned node with the blockchain data stored on an external SSD?
How to turn your Raspberry Pi, or any computer for that matter, into a little money-making machine trading Bitcoin with open-source software
Question: Are there cryptos that can be mined with low-resource home computers?
I had high hopes for Solana. Sad to see how devs are forced to leave the ecosystem now.
Be your own bank! Self-host your Bitcoin full node with $200 or less
Revisited: What TPS does Algorand need to be Sustainable?
People need to realize that market caps have nothing to do with the quality of a crypto project
Vertu's web3 phone costs USD41,000 (*only* 28eth)
(Joke) Based on the last few months of my life I'm 99% sure we will see prices skyrocket mid of next month
Noob - I have collected VERY minor crypto before (BAT/BTC/Doge), IT dept "gifted" me THIRTY (30) HP t630 Thin Client
Can I run a full node in my old 2009 toshiba notebook?
Is it considered safe to generate Bitcoin wallet keys using Electrum on Tails OS?
Is the UTXO index in RAM or is it a file system object?
Unreal Death - New game on BNB Chain - We are launching TODAY, 5pm utc, Please join our community, be a part of UD team! Game is already! Huge Potential Token
Crypto-Sceptic here. I want to share a huge global stock watchlist with crypto community. Was wondering if members of this community have similar watchlists for hundreds of crypto currencies?
Unreal Death - New game on BNB Chain - We are launching on Aug 26, 5pm UTC, Please join our community, be a part of UD team!
In light of the only true Bitcoin's (BitcoinSV) recent achievement of surpassing a Blockchain size of 5,000 Gb, I thought I would help you guys get started on running a full node! (Satire, not factually correct)
Solana (SOL) Launches First-Ever Crypto Mobile Phone Saga: Why is This Crucial?
[Polygon partnership][MATIC] Portus Network - Connecting blockchains to the world [Not a memecoin]
Bitcoin Core: Blockchain fully synced in 11 years
Buy the new Razer Blade 17 and pay with Crypto to get 3% off!
Buy the new Razer Blade 17 and pay with Bitcoin to get 3% off!
A NEAR Protocol thesis and why I think it'll be one of the biggest L1s in 2022 and beyond
Any advice for a complete beginner looking to potentially mine crypto.
Is this a good mining rig for the next 2-3 years?
Anyone want to exchange crypto with a value equivalent to an Apple MacBook Pro (13.3-inch, M1, 2020) 8GB RAM, 512GB SSD?
Why can't a good hacker or hardware engineer, recover the seed words from my hardware wallet, if I lose it? (Looking for someone who understands the technical side)
Help me buy a PC and get something back
A good deal of the top 20 coins on the market are VERY overhyped.
What mining software should I use?
Solana is the McDonalds ice cream machine of the crypto world.
Mentions
I wish people bought bitcoin instead of RAM. Someone please scam ram altman to buy bitcoin
Could make some nice profit out of RAM these days.
It would be the same even if u paid for the RAM in bitcoin
You are 100% right to call that out. Using the word 'Keys' in a Bitcoin sub was a massive unforced error on my part. To be crystal clear: I meant Infrastructure Credentials (API Keys, SSH Configs, Passwords), NOT Seed Phrases. NEVER paste a Seed Phrase or Private Key into a browser. I would report that post too. Regarding the 'IRC/Pastebin' comparison—you aren't wrong about the utility, but the Architecture is different. Pastebin/IRC: Writes data to a database on a hard drive. If the server is seized, the history exists. This Tool: Runs in volatile RAM. Logs are piped to /dev/null at the OS level. If the power is cut, the data doesn't just delete; it ceases to have ever existed. It’s a tool for metadata minimization, not wallet management. Thanks for keeping the standard high, seriously.
My friends were shocked over the price of RAM now. I explained well currencies around the world are fucked. Add that to the demand on RAM and you should be paying 4x + what NewEgg/Amazon/whoever used to sell it at 5 years ago.
Yea but the actual action of trading the gold for goods and services only uses energy with your arm muscles. Crypto currency is sucking up energy via mining AND transactions. It’s also causing GPU and RAM prices to skyrocket. Look at RAM currently… that has to be factored in as well.
The fact that someone used AI to make this... bro. I just want to buy RAM
Depends when. Late 80s? Plenty of PCs didn't even have 1MB of RAM. But even later, just *loading* a large executable from a floppy to RAM was slow and loud.
> 10,000 TPS via Parallel Execution: Monad is the first EVM-compatible chain to implement optimistic parallel execution.

You are missing the REAL BIG PICTURE. That is not the story. Any chain can crank up massive TPS if they crank the hardware requirements up the wazoo. According to [https://docs.monad.xyz/node-ops/hardware-requirements](https://docs.monad.xyz/node-ops/hardware-requirements), their node only needs 32GB+ of RAM + 300 Mbps bandwidth. That is a very low requirement compared to many "fast" EVM chains. Take the two Cosmos shitcoin EVM chains, Sei and Injective, for example. Both want validators to run 1 Gbps+ bandwidth and 128 GB RAM. And Monad claims to be faster than both. If the claims are valid, Monad is really ***crazily optimized*** compared to the shit EVMs we have on today's market. Only Ethereum's ZkEVM roadmap aims for a lower hardware footprint to scale. But even then, only the verifiers run low-end hardware while the proof builders run more demanding hardware, e.g., ZkSync says builders need two 5090 GPUs. This Monad shit is probably the most optimized EVM for the next year or two, until ETH carries out its full ZkEVM plan. MegaETH doesn't count because it uses more demanding hardware.

> This is a very low percentage, leading to concerns about centralization and a market heavily reliant on a small circulating supply.

Unfortunately, this is what happens with newer chains. I haven't seen anyone with a better idea for solving it. If you give more supply to the airdrop, the airdrop jeets will rekt your chart and paint a Mount Everest. If you give more supply for sale, then the crowd would scream "ExTrAcToR! WhY dO yOu NeEd tO RaIsE So MuCh MoNeY!"

> Fully Diluted Valuation (FDV) suggests that all future success is already priced in.

Crypto valuations for L1s are hard to make sense of most of the time, especially for new L1s with no history of social Lindyness. It is mainly relative hopium to existing valuations. It is monolithic in design. Instinctively, you want to rank its FDV against Sui in the short term and Solana in the long term.
I use MX Linux live distro to do offline signing. The live distro doesn't have any persistence and I run it completely in RAM, so it doesn't leave any trace on my computer.
It will take some time, but 0.08 percent per hour is very slow. You should identify what the bottleneck is. Is it a slow connection or an insufficient number of peers sending you data? Have you configured it to run on multiple threads and make use of the RAM you have? Is the chainstate stored on an SSD or on an HDD? The connection and the chainstate location especially are important for catching up.
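A hedged aside (my addition, not the commenter's): the stock RPC commands are enough to check those bottlenecks on a running node. A minimal Python sketch, assuming bitcoin-cli is installed and can reach your bitcoind:

```python
# Quick diagnostic sketch (my addition): ask the running node where sync time
# is going. Assumes bitcoin-cli is installed and can reach your bitcoind.
import json
import subprocess

def rpc(method):
    # Call bitcoin-cli and parse its JSON output.
    out = subprocess.run(["bitcoin-cli", method], capture_output=True, text=True, check=True)
    return json.loads(out.stdout)

info = rpc("getblockchaininfo")
peers = rpc("getpeerinfo")

print(f"verification progress: {info['verificationprogress']:.4%}")
print(f"connected peers:       {len(peers)}")
# Very few peers points at a connectivity problem; plenty of peers but slow
# progress usually means the chainstate disk (HDD) or a small dbcache is the bottleneck.
```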
You can speed it up quite a bit by increasing the size of the cache to slightly less than your RAM size. It's an option in the settings.
In your bitcoin.conf file, set dbcache=8000 if you have 8 GB of spare RAM. That will speed it up significantly.
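A rough sizing sketch for that dbcache suggestion; the headroom number is a placeholder, not a recommendation, and it assumes the third-party psutil package is installed:

```python
# Back-of-the-envelope dbcache sizing (my addition, values are placeholders).
# Assumes the third-party psutil package is installed.
import psutil

available_mb = psutil.virtual_memory().available // (1024 * 1024)
headroom_mb = 2048                                  # leave ~2 GB for the OS and other apps
dbcache_mb = max(450, available_mb - headroom_mb)   # 450 MB is Bitcoin Core's default

print("suggested bitcoin.conf line:")
print(f"dbcache={dbcache_mb}")
# A large dbcache mainly helps during the initial block download; after that
# you can drop it back to the default.
```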
Linux pc, pretty good internet / RAM and SSD
Just shipped a tiny little menu-bar crypto tracker for macOS — **Crypto Live Price**. If you’re like me and don’t want a giant exchange app eating RAM just to check BTC/ETH prices, this thing might help. It sits quietly in your Mac’s status bar and shows live prices, 24h change, and whatever coins you care about. No login, no tracking, no “AI assistant popup nonsense”. Just prices.

**What it does:**

* Real-time crypto prices right in the menu bar
* Add/remove your own coin list
* Tiled or scrolling display modes
* Tiny footprint (around 6 MB) and basically zero noise

Not trying to replace TradingView — it’s just a lightweight “glance and move on” tool. If you want something minimal and not bloated, give it a look. Mac App Store link: [App Store](https://apps.apple.com/us/app/crypto-live-price/id6742907089)
Yeah they can't make enough of it for the AI build-out. If I didn't have money tied up in worthless crypto I would be investing in RAM manufacturers.
Also, RAM does something actually useful 😂
If only alt prices went up half as much as RAM prices increased in the last 3 months
Alright, time to close this shit ass chrome browser with reddit, charts and all the bullshit. Not only eating 5gb of my RAM but plenty of GBs of my sanity as well.
A node costs mainly in storage if you want to run a full node. A pruned relay node can get away with very modest requirements, but it also has limited use, as it does not keep the history and you cannot use it to look up past transactions.

A minimum specification for a full node:
- Some CPU
- 16GB RAM
- 1 TB of storage

Minimum recommended configuration:
- Dual core CPU
- 32GB RAM
- 2 TB of storage

A pruned node can get away with maybe 25 GB of storage. You really want at least 32GB RAM, at least during the initial download of the blockchain or if you need to reindex the blockchain.
You want 8GB of RAM and at least 1.5TB SSD storage. You can do this on a Raspberry Pi 4B or Pi 5, so CPU is not a huge issue
That sounds very familiar — I had the same issues when I tried to run a node on a Raspberry Pi. The hardware just couldn’t keep up during the initial sync, and it would crash or get stuck around the same years you mentioned. In my case, moving to a stronger machine with more RAM and a faster SSD fixed everything. Once the node finished syncing, it’s been rock-solid ever since. Sometimes it’s not the software — it’s just that the hardware needs a bit more power to handle the full chain.
The node itself uses very little power — just a modest old desktop motherboard with 16 GB RAM and a 2 TB NVMe SSD running 24/7. It’s roughly 10–15 W, so basically the same as a small light bulb. The Bitaxe units are super efficient too — each one draws about 15–20 W. Even running two of them 24/7 barely makes a dent in my electricity bill. I haven’t found a block yet, so no sats mined so far — but that’s not the goal. I’m doing it to support decentralization and learn how the system really works from the ground up.
Yes, I’m running it on a dedicated old desktop motherboard — nothing fancy, just solid hardware I repurposed for the node. It has 16 GB of RAM and a 2 TB NVMe SSD, so it keeps the entire blockchain locally and runs smoothly 24/7. I like having it isolated from my main PC. I’m using Bitcoin Core v28.1, not Knots or v30. It’s stable and works perfectly for my setup with two Bitaxe Gamma 601 units mining directly against my node. Once v30 matures a bit more, I’ll probably test it in parallel — but for now, 28.1 has been rock-solid.
I can't remember off the top of my head but if I remember correctly, the chipset in the Safe 5 is ironically a bit slower than the Safe 3 but has more RAM. If you ever plan on moving to a multisig at some point, you become RAM constrained when making larger transactions with numerous UTXOs. Lopp has done testing on the older ones if you want to look it up but you can see some HWWs taking a few seconds vs others that take 45 minutes or completely crash and can't process larger multisig transactions. You might want to spend more to future proof yourself. It's also nice having a full screen vs a few buttons to click on.
Cloud providers don’t nuke nodes because random bytes land in RAM; they act on abuse (malware distribution, open hosting of illicit data, or DDoS traffic) or legal orders. They don’t inspect tenant memory, and a Bitcoin node parsing arbitrary data isn’t executing it. Practical mitigations: keep your node P2P-only (no public block file hosting), enable disk encryption, cap egress with maxuploadtarget, consider -blocksonly or a pruned node, rate-limit 8333 and rotate peers/Tor if targeted. We use Cloudflare Magic Transit for DDoS and AWS GuardDuty for abuse signaling; DreamFactory helps gate internal APIs for node metrics and ops dashboards with RBAC. On “tinted chain” risk, institutions mostly hold via custodians/ETFs; a worst case is creation/redemption pauses, not instant mass dumping. Real telltales would be ETF AP statements, widening futures basis, and borrow spikes before a cascade. In short, focus on network hygiene and traffic controls; the legal risk is about serving or transmitting bad content, not passively storing or relaying the chain.
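For reference, a minimal sketch of the bitcoin.conf options named in that comment (blocksonly, pruning, maxuploadtarget), emitted from Python so the values stay clearly labeled as placeholders; merge them into your real config by hand:

```python
# Minimal sketch of the options named above, emitted as bitcoin.conf lines.
# The numbers are placeholders, not recommendations; merge by hand into your real config.
from pathlib import Path

conf_lines = [
    "blocksonly=1",          # relay blocks only, not loose transactions
    "prune=550",             # pruned node: keep roughly 550 MB of block files
    "maxuploadtarget=5000",  # cap upload traffic to about 5000 MiB per day
]

example_path = Path("bitcoin.conf.example")  # hypothetical filename for illustration
example_path.write_text("\n".join(conf_lines) + "\n")
print(example_path.read_text())
```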
This is a bit misleading. It’s not just about spam. It’s also about bloat and malicious data. What happens when there is unencrypted malware code in RAM? Will AWS, Azure et al. shut your nodes down? Lock your account? There is a DDoS component, and we just don’t know how the network will react, since it is impossible to test. There are compliance issues: what happens to price action if large institutional players have to exit because Bitcoin is labeled a “tinted chain”? We already saw a 50% retracement when a fork of Bitcoin implemented this same code. What sort of liquidation cascade would this cause? The strange part is that no one is even asking for this.
One key point that often gets overlooked is validator accessibility. Solana has much higher hardware requirements, which limits who can realistically participate. To run a Solana validator, you need extremely high bandwidth (close to 1 GB/sec sustained), fast storage, and up to 128 GB of RAM. These costs create a higher barrier to entry and concentrate validator power in the hands of large, well-funded players. Ethereum, by contrast, has intentionally kept validation more accessible. With Ethereum’s Proof of Stake, all you need is 32 ETH, a reliable (but not extreme) internet connection, and modest hardware specs. You don’t need enterprise-grade RAM or bandwidth. This lowers the cost of entry and allows a much wider pool of participants to run validators, making Ethereum’s Layer 1 more decentralized in practice. It’s true that Solana achieves higher throughput with its Proof of History mechanism, which requires validators to maintain a constant internal clock to timestamp transactions. This design is what enables Solana’s speed, but it also drives its steep hardware requirements. Ethereum solves scaling differently—rather than pushing all throughput onto Layer 1, it embraces Layer 2 rollups. This lets Ethereum keep its base layer highly decentralized and secure, while Layer 2s handle the bulk of transaction volume at scale.
I use an old laptop that I have no other use for. It's like circa 2014, but it's more than enough for a node with 16GB RAM and a 1TB SSD. (Might need to upgrade that SSD soonish, though.)
I have built my own Bitcoin node using an old Dell OptiPlex SFF, installed a 1TB (should’ve 2TB) SSD, added RAM, installed Ubuntu Server and installed all necessary software to run a node. It is like my own home lab. Pi-hole and Homebridge are all running there, too!
We will see! I am pushing the RAM hard already, 3.5ish GB usage. I shouldn’t have cheaped out lol, ah well
Security Guardrails for UIGent

* Data isolation: Each service runs in isolated Docker containers with dedicated Redis databases (0/1) and PostgreSQL schemas, preventing cross-contamination between users' data flows.
* Credential management: API keys and tokens are never stored in generated code - they're injected at runtime through environment variables managed by the orchestrator service, with MinIO providing encrypted artifact storage.
* Read-only normalization: Forge only generates read-safe SDK wrappers and tool catalogs - it never directly executes against user APIs, that's handled by the sandboxed Atlas workers with resource limits (3GB RAM, timeout controls).
* Deterministic outputs: Content-addressable spec IDs ensure reproducible, auditable transformations - users can verify exactly what code was generated from their API docs.
* Zero-persistence execution: Atlas workers execute in ephemeral environments with ARQ job TTLs, ensuring sensitive data isn't retained beyond the immediate workflow execution.
No video sorry, but cold storage means "offline". No need for a hardware wallet to achieve cold storage. If you write down your seed phrase and then delete your wallet, you now have your btc in cold storage (your private keys hopefully no longer exist on an internet-connected machine). Your wallet was online at some point so there is still a risk, and the data could be recoverable from your hard drive unless you use software to wipe the free space. A step better is using a dedicated device (like an old laptop) with a fresh install of the operating system and disabled from being able to connect to the internet. My preferred method for long term cold storage would be to install Tails OS on a USB (Tails comes with Electrum pre-installed, it's a Linux system that boots from USB and runs only in RAM). Create your wallet, make your seed backups, export a public master key, shut down. Boot up Tails again to check that you can successfully restore your wallet from your seed. You can use the public master key to receive btc to your wallet and keep track of your balances (watch-only wallet safe to keep on your normal PC).
It’s good to understand the history of why Bitcoin was even created. That helps frame everything else. The way I think about bitcoin is that society needs money in order to allow trade to flourish (because barter is insanely cumbersome and obviously doesn’t work at scale). The more we can trade, the more we can specialize, the more we can produce for less time, energy, and materials. The more that happens the wealthier we can all become. Money is simply the asset in an economy that is the most desirable for its specific characteristics that are uniquely valuable when it comes to money. Any asset is valued relative to how well it can accomplish a set of needs (hard drive space and RAM for a computer, nutrition and flavor for a meal, speed and comfort for a car, and so on). Money is valued based on how well it accomplishes a totally separate set of needs that relates to allowing trade to flourish. The most effective money in a society is divisible, verifiable, portable, durable, scalable, fungible, and most importantly, scarce (which is the difficulty of bringing in new supply relative to the existing stock). A lot of things have been money in different societies, and the need for scarcity meant that you’d want a money that society lacked the capability of making more of cheaply (otherwise everyone would spend their time making more money/monetary assets rather than making more goods and services to consume). The problem was that different societies had different technical capability and one society could cheaply create more monetary media of another society and then wreck their economy by flooding their market with more monetary media rather than goods and services. Gold historically was the apex monetary asset because NOBODY could make more or mine more very easily. However, gold wasn’t portable or divisible enough, and fiat paper made up for this shortfall, then eventually you have centrally controlled gold, money, central banks, and all of the trusted third party moral hazards that have led to decades of inflation. Bitcoin solves for many of these problems that gold had, that paper money attempted to solve, but had too large of an incentive to cheaply make more monetary media that governments couldn’t resist. Bitcoin is essentially a verifiable, perfectly scarce, divisible, portable at the speed of light, fungible, monetary assets that beats gold in every way except that it is still relatively new and doesn’t have the long term track record that gold has. That last part just takes time. If you have time, use that to your advantage and get in at the ground floor of the digitization of monetary assets, in the one asset that most perfectly meets these monetary properties.
Kicking the can down the road is what every business does. It is especially useful, since CPUs, RAM and network speed do exactly the same. YouTube serves petabytes every day, all by kicking the can down the road since 2006.
It IS LITERALLY CRYPTO. Hell, what do you think sha256 is, lol. Don't come to a party to talk about RAM or memory.... (IT Crowd)
tldr; The Solana Seeker phone, preordered 18 months ago for $450, has arrived. The total cost, including import taxes and accessories, was $570. The phone features a MediaTek Dimensity 7300 processor, 8GB RAM, 128GB storage, and a 108MP main camera. It integrates a secure crypto wallet and a dApp store for Solana-based apps. The setup included creating a SOL wallet and claiming a Seeker Genesis Token. The phone feels well-built, though the side thumb scanner feels outdated. Future updates on potential airdrops will be shared.
You don't need to be very technical to build your own Raspberry Pi Node, there are plenty of instructions, for example https://docs.raspiblitz.org/docs/setup/get-hardware You need an official Pi USB-C power supply, a Pi 4B (8GB RAM) or Pi 5 (8GB RAM), a Pi 5/5 case, and a 2TB SSD (you can use an AliExpress USB3.1 NVME external enclosure) and a 32-64GB MicroSD. That's about it, you can use a smaller MicroSD, you can use a smaller SSD/NVME but I would say it must be over 1TB size. You then flash a user friendly Pi Node like Umbrel (free, easy web user interface, but they might have pivoted to a more expensive x86 / Intel platform now), myNodeBTC (well maintained, lots of features, but their premium version costs USD$50 and is required if you want the features of Tor and easy on click software updates) and Raspiblitz. Raspiblitz is less user friendly, most technical, but the most features under the hood, it is the only one that supports Fulcrum as an alternative to ElectRS (this is the backend you connect your SparrowWallet to) but you need to manually edit the install script, and log in to your Pi node over SSH to run the bash script. You really don't need to buy a pre-made node, you can go to Jaycar for the Pi, or buy it online from an Aussie electronic store like coreelectronics.com.au or overseas store like Adafruit, or Arrow, or AliExpress. The SSD/NVME you can buy from any computer store
Yep. Oisy.com for an on-chain wallet, something like oc.app or dmail.ai or officex.app for on-chain sites, there are tons more, some dapps choose a hybrid approach to counter any shortfalls the IC currently has (database latency in the case of Odin.fun) - ingress is high compared to AWS ($5/GB/yr Vs AWS being free to upload) but egress is substantially lower than AWS so it suits relatively static sites where massive amounts of user-generated data aren't being uploaded for now, at least until nodes get upgraded with blob storage (currently storage is in RAM) which would reduce storage costs by ~99%
Don't blame Windows 95, blame the lack of RAM lol
Ok, ended up checking, and it seems on average it has been using about 73 MB of RAM and very little CPU power.
Oh interesting, I had no idea. I set it up maybe a month ago in an LXC. From what I remember it used very little. I can check it out once I'm back home and tell you the average. When I ran Bitcoin Knots through Umbrel it did use more RAM, but that was the same with Bitcoin Core.
I tried running Bitcoin Knots a couple of months ago. It had memory leaks and was using about twice as much RAM as Core. I ended up switching back to Core because of that issue. Any idea if the memory leak was fixed in Knots?
Doesn't take much power as it is centralised! It only has 23 nodes, all run by big corporations - and the hardware requirements are massive! The hardware requirements for Hedera nodes are quite specific and depend on whether you are running a consensus node or a mirror node. It's important to note that you can't just run a consensus node; they are currently permissioned and operated by the Hedera Governing Council members. However, anyone can run a mirror node.

Consensus Node Requirements

The requirements for a consensus node are very high-end and are designed for enterprise-grade performance and security. These are not for a typical home setup.

* CPU: A high-performance, multi-core processor (e.g., Intel Xeon or AMD EPYC) with a minimum of 24 cores/48 threads is required. There are also specific performance benchmarks (Geekbench, Passmark) that must be met.
* Memory (RAM): A large amount of ECC Registered DDR4 RAM is needed, with a minimum of 256GB and a recommendation of 320GB or more.
* Storage: A substantial and very fast storage solution is essential. The requirements include at least 5TB of usable NVMe SSD storage with high sequential and random read/write speeds (e.g., 2,000-6,200 MB/s sequential read). The use of RAID arrays (e.g., RAID 1 for the OS, RAID 0 or 10 for data) is recommended for redundancy and performance.
* Network: A sustained, unmetered 1 Gbps internet connection is required to handle the high volume of traffic. The node must also be deployed in an isolated DMZ network with specific ports open.
You could use a Raspiblitz on a Pi4b (8GB RAM) with a 2TB SSD and it will have Fulcrum and BTC RPC Explorer included (Raspiblitz includes an optional bash script to install Fulcrum)
You would download all known addresses (UTXOs) that had a non-zero balance, or at least activity (received and spent money), during a given time period when the user knows they were using the wallet. It is large but not excessive, less than 100 GB? The full blockchain size is 670GB, but we need only addresses, not other details, and we can filter by known time range; addresses can even be trimmed from 25 bytes to let’s say 10 bytes per address - it will still be very unique. Computationally it does not even need to be in GPU or even in RAM, it can stay on disk. Compute all valid permutations of words -> calculate ~29mln (479 millions / 16) seeds to BIP derivation paths (this is computationally intense!) -> get perhaps 200mln candidate addresses at common derivation paths -> 10 GB of candidate addresses (if trimmed to 10 bytes per address). Now you need to lookup/join a 10GB dataset against a 100GB dataset on disk, which is doable on a PC efficiently in lots of ways.
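A quick sketch of the arithmetic in that comment, taking its figures (479 million orderings, the 1-in-16 BIP-39 checksum survival rate, a handful of derivation paths) as given rather than verified:

```python
# Reproducing the rough arithmetic above. All figures are the commenter's
# assumptions (479M word orderings, 1-in-16 BIP-39 checksum survival,
# a handful of common derivation paths), not verified numbers.
candidate_orderings = 479_000_000
valid_seeds = candidate_orderings // 16        # only 1 in 16 orderings has a valid checksum
paths_per_seed = 7                             # illustrative: a few BIP-44/49/84 paths and indexes
candidate_addresses = valid_seeds * paths_per_seed

print(f"seeds to derive:      {valid_seeds:,}")          # ~29.9 million
print(f"candidate addresses:  {candidate_addresses:,}")  # ~210 million, i.e. "perhaps 200mln"
# The commenter budgets roughly 10 GB to store these candidates once trimmed
# to ~10 bytes each, then joins them against the known-address set on disk.
```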
This isn’t fearmongering — it’s basic opsec. Thanks ChatGPT. Though I'm curious to hear what humans think about this. So even if the wallet is encrypted, once you unlock it, is there potential for software like this to intercept it from RAM?
I'd been trying to say that all this time. Also, he seems to have totally underestimated RandomX. The best you can achieve is about 200 kH/s on an AMD Epyc 9xxx series. This CPU costs about as much as a small car, $20,000. So to achieve his desired hashrate, ~5 GH/s, he needs about 25,000 of these CPUs. This means at least 12,500 dual-CPU servers; let's assume Gigabyte MZ72-HB0, brand new going for around $5,000 each. So for CPUs alone we are talking about 500 million USD, motherboards 62.5 million USD, and adding small components such as RAM, we have over 600 million USD. Then we need a massive amount of powerful PSUs and probably a nuclear power station nearby to feed all this datacenter. To this we need to add the setup of those 12,500 machines, eventual hardware failure and management; $1 billion wouldn't be enough. Whereas the Qubic marketcap is at about 250 million USD, not enough to buy even the CPUs. He probably thought to rent servers; however, datacenters with these CPUs have AUP terms, so while he was counting on having those machines mine XMR, the datacenter would most likely disconnect them for an AUP violation, as the CPU usage is way too high.
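The same estimate, worked through in a few lines; every figure below is the commenter's assumption, not a verified price or hash rate:

```python
# The same estimate, worked through. Every figure is the commenter's
# assumption (hash rates and prices are not verified).
target_hashrate = 5e9        # ~5 GH/s of RandomX, the rate he supposedly needs
per_cpu_hashrate = 200e3     # ~200 kH/s per top-end EPYC 9xxx
cpu_price = 20_000           # USD per CPU
board_price = 5_000          # USD per dual-socket Gigabyte MZ72-HB0

cpus_needed = target_hashrate / per_cpu_hashrate   # 25,000 CPUs
servers_needed = cpus_needed / 2                   # 12,500 dual-CPU servers

print(f"CPUs needed:    {cpus_needed:,.0f}")
print(f"servers needed: {servers_needed:,.0f}")
print(f"CPU cost:       ${cpus_needed * cpu_price / 1e6:,.0f}M")
print(f"board cost:     ${servers_needed * board_price / 1e6:,.1f}M")
# RAM, PSUs, hosting and power push the total well past these figures.
```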
Yes, I did it a few months ago and it's been quite easy. The best is to follow https://raspibolt.org . I have a 2 TB SSD and I boot from the SSD, so no need for an SD card after the SSD is bootable. When downloading the blockchain the temp went a little high, but now that it's in sync it's just very standard. I started with Bitcoin Core and easily moved to Knots. It's super stable, never had any problem. I have the Raspberry Pi 5 with 8GB RAM
Other projects to be funded by the treasury which I and Cointelegraph forgot to mention:

* **Mithril Enhancements**: Reduced bootstrap times and lightweight client support
* **Nested Transactions**: Technical foundations for advanced smart contracts and interoperability
* **Performance Optimizations**: Faster sync times, lower RAM usage, and reduced operational costs
* **Project Acropolis**: A modular re-architecture of the Cardano node

Source: [https://iohk.io/en/newsroom/from-roadmap-to-reality-cardano-community-approves-ioe-roadmap-proposal-unlocking-a-new-era-of-decentralized-delivery](https://iohk.io/en/newsroom/from-roadmap-to-reality-cardano-community-approves-ioe-roadmap-proposal-unlocking-a-new-era-of-decentralized-delivery)
I tried it on a Raspberry Pi 4 and it was painfully slow. That was just bitcoin core by itself. If you plan on running all that extra software on it, be prepared for a laggy interface. I also had an issue where the SATA to USB adapter knocked out the USB ports, even though I pasted heatsinks all over the chips on the board. I run Umbrel with Bitcoin core, Electrs, Mempool, NGINX proxy server, Pi-Hole DNS, all on an old HP Elitedesk mini PC. It will randomly spike to 25% cpu when doing something, but idles around 5-6%. Another thing to keep in mind, these PCs have variable speed fans, so they're quiet when idling. The Raspberry Pi 4 I bought didn't have a variable speed fan, so it was loud all the time. Something to keep in mind. Some info about my setup. I bought an HP Elitedesk 800 G3 on ebay with Core i5-6500 (3.2Ghz 4 core), 8GB RAM, no disk or power supply for $50. I added extra 8GB RAM ($15), new power supply ($12), and a cooling fan ($14), total cost was less than $100. I already had a 2TB Sata SSD laying around. Still under $200 if you need to purchase a disk drive. After realizing that Umbrel runs on x86 systems, I've completely abandoned Raspberry Pi as a platform for bitcoin nodes. There's better and cheaper options out there, unless you already have a Pi laying around that you want to repurpose.
2. The only way for that to happen would be IRL. And let's say someone did manage to log in to my Linux laptop: I use TailsOS, so everything is erased on shutdown, nothing is stored. Even if they managed to catch my laptop while it's turned on and I was in the middle of using Sparrow, they only have access to my public key. Even if they managed to catch my laptop, turned on, with the SD card in, they only have access to my public key; the signed/unsigned TX cannot be tampered with. The RPi device is also stateless and 100% air-gapped; if they get to that, nothing is stored.
3. 1 slot
4. Yes, it's in the dev. notes, on the GitHub, and I've used it myself. Everything is stored in RAM, and it's gone when power is off
6. Check the GitHub, you can see the (public) contributors and code there. It's open source. The email on their website is for donations and community outreach.
Yes, it's easy for bots to snipe the keys from the low end using a tool like this: [https://github.com/albertobsd/keyhunt](https://github.com/albertobsd/keyhunt). Publishing your transaction to the mempool reveals the public key, which can then be used to recover the private key. If you get a low-end key you need to get it mined without going through the public mempool. Using that keyhunt tool you can attack the high-end keys where the public key is revealed, like 135 bits with 13.5 Bitcoin. Using 32GB RAM and 16 cores I can get 1 EH/s.
Yes, you are right. If all you need is to dump coins to an address for the next few decades, you don't need a dedicated hardware wallet. Hardware wallets are *specifically* made to make spending more convenient. Just make sure that your fresh Linux install is properly airgapped. Physically disconnect any networking/radio device from your system, and make sure to run a live, RAM-only session from an official distribution. Hand-stamp the mnemonics on metal (don't use a printer), preferably use multisig, jot down the public addresses, xpubs (and output descriptors if you're using multisig) and you'll have one of the safest setups out there.
Monero but need more RAM and a little more savvy
Was this written by an AI with only 64KB of RAM?
I would boot via a DVD medium, READ ONLY, if I was not able to remove the USB boot medium physically (see logs...). Choose a Linux ISO that stores in RAM, and remove the RAM to flush the whole setup after you have finished.... (of course, wifi / bluetooth and HDD removed as a minimum...)
Yes, let me give you my setup. I have an HP EliteDesk 800 G3, added extra RAM up to 16GB, replaced the SSD with a Samsung EVO 860 2TB, and downloaded and installed umbrelOS.iso. Next, I installed Knots, mempool and Electrum and waaaaited for the blockchain sync, like 2 weeks (depending on your Internet speed). My connection is behind a VPN and I use only the onion protocol (Tor). I installed Sparrow wallet on my desktop machine and I use only my node for mempool transaction filtering and broadcasting transactions
Partly because it's the first, and mainly because everything else is a backwards step from Bitcoin. For all the hundreds of alt chains, none have been an improvement on Bitcoin. They all tweak one or two aspects of Bitcoin in a way which makes them trash. ETH made looping scripts, as if Bitcoin's not-looping script language is a flaw. ETH has a bloated blockchain. It's impossible to initialize an ETH node, so the node network became centralized. LTC and DOGE have shorter block intervals. This makes them less secure. Again, a backwards step, not an improvement. A dozen or more centralized smart chains are owned by corporations, as if Bitcoin's decentralization is a flaw.

There's a frequently asked question: What if someone creates a new blockchain, or a new digital currency that renders Bitcoin obsolete? The answer is, this can happen. But instead of a new Satoshi actually making a better crypto, the space is full of opportunists greedy for self-enrichment, including a user community focused entirely on profit and completely ignoring Bitcoin's purpose.

Dogecoin couldn't have appeared before Bitcoin. It was cloned from Bitcoin's freely available source code. Ethereum couldn't have appeared before Bitcoin. Its founder started with Bitcoin's design and made "clever" changes. Solana couldn't have appeared before Bitcoin, because in 2008, there were no PCs with 256GB of RAM.
>GTA 7 In 2040? You clearly have no idea what you're talking about. On a serious note, I believe the size will eventually get bigger. Not sure when though. Many people including myself have PTSD from [the Blocksize War](https://youtu.be/6YtS5ZNuuTw) and it'll take a lot of effort to do it right but it eventually gets done. Now, it isn't only about the cost of a drive, bigger blocks will need more CPU, RAM and faster internet connection. So the cost of running a node will get higher. But once we see the ability to run a node on cheap android (or hopefully Linux) phones, the size increase will make much more sense. Note to the bcash shills reading this, thinking why not using their shitty fork instead. Because the vast majority of the network will have to agree on the increase. Not a scammer Roger posting a tweet about the hardfork. That's not how decentralized system works. And yes, your dying shitcoin will be obsolete.
Full bullshit Bitcoin ASICs are ultra-specialized chips that ONLY do SHA-256 hashing. That’s it. They can’t do addition, multiplication, handle RAM, or literally anything else. It’s like comparing an electric can opener to a Swiss Army knife and saying “look, the can opener is 1000x faster at opening cans!” Yeah, but that’s all it does. A modern supercomputer can execute billions of different instructions per second, run complex scientific calculations, physics simulations, machine learning… The entire Bitcoin network wouldn’t even be able to properly emulate a single CPU core due to network latency. So yeah, technically the Bitcoin network processes more TeraFLOPS… but for ONE cryptographic operation only. It’s like saying a factory that only makes bottle caps is “more productive” than a car factory because it outputs more units per hour. This impressive-looking metric means absolutely nothing in practice. It’s just basic crypto marketing to impress people who don’t understand the tech. This comparison is completely bogus IMO.
Plebbit differs from Nostr in that Nostr is federated (using instances), whereas Plebbit is P2P (fully decentralized). Plebbit uses IPFS, which is more similar to BitTorrent, which is pure P2P as well. The issue with federations is that their instances are not easy to set up, most users don’t have an incentive to do so, and even if they did, they are not censorship resistant at all, because they work like regularly centralized websites. Your Nostr/Lemmy/Mastodon instance can get DDOS’d, deplatformed by the SSL certificate provider, deplatformed by the datacenter, deplatformed by the domain name registrar. The instance admin can get personally doxxed and harassed, they can get personally sued for hosting something a user posted, etc. And instances can block each other. Whereas running a node on Plebbit is as easy as opening up one of its desktop clients, which automatically run the custom IPFS node in the background, and seed all the protocol data automatically (similarly to how a BitTorrent client seeds torrents). It runs on a raspberry pi, on 4GB of RAM and consumer internet. It scales like torrents, i.e. the more users connect p2p, the faster the network gets. And most importantly, nobody can stop you or block you from connecting to another user, because there’s nobody in between. This means nobody can stop you from connecting to a subplebbit (subreddit clone). If you run your own community, you’re always reachable by any user on plebbit.
I like your setup, I would probably do the same as you. You can work with a 2-laptop system (the air-gapped one should be set up in RAM only, HDD removed physically, WiFi and Bluetooth card too); you can remove the RAM physically when done or flush it at your convenience. I would use a DVD medium, read only, for booting (read only - no logs) if the USB could not be removed after booting, though
Crazy to think my GPU can do nearly 1 billion keys per second, but that would still take years. Using https://github.com/brichard19/BitCrack To attack the addresses with the public key revealed you need a lot of RAM, but this works: https://github.com/albertobsd/keyhunt using the baby-step giant-step algorithm, I believe
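Since baby-step giant-step keeps coming up, here is a toy sketch of the idea over ordinary modular exponentiation (not secp256k1, and nowhere near real key sizes); it only illustrates why the search cost scales with the square root of the keyspace:

```python
# Toy baby-step giant-step over plain modular exponentiation, only to show why
# the search cost scales with sqrt(keyspace). Real tools like keyhunt work on
# the secp256k1 curve group at vastly larger sizes.
from math import isqrt

def bsgs(g, h, p):
    """Return x such that pow(g, x, p) == h, searching ~sqrt(p) group elements."""
    m = isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}      # baby steps: g^j for j < m
    g_inv_m = pow(g, -m, p)                         # g^(-m), Python 3.8+ modular inverse
    gamma = h
    for i in range(m):                              # giant steps: h * g^(-im)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * g_inv_m) % p
    return None

p, g = 10007, 5          # tiny prime group for the demo
secret = 1234
print(bsgs(g, pow(g, secret, p), p))   # prints 1234
```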
Seg faults could be a hardware issue. Do a memtest86 test of your RAM and at least check SMART values on that SSD. Maybe it's failing and having to reallocate sectors. Is it an internal SSD or external? Sometimes a lousy USB cable for an external disk can cause issues. I'm also assuming the site you went to is bitcoincore.org, not anywhere else.
> The internet (as a whole) doesn’t need that kind of web3.

You don't know what the Internet - or humans - will need in the future. Back when the Internet was an underground geeky thing, nobody knew they would need it. But now everybody does. I'm sure

> for blockchain technology to actually scale to a point where it can host most of the internet nodes have to store petabytes of data in RAM which will make it no less centralized than something like AWS.

The internet is full of garbage competing for views and ad clicks. This has been a problem for years, and will become worse with AI. Not only do we not need the entire internet, but we'd need a system to economically disincentivize garbage and encourage unique quality content. Some kind of "proof of value". I wonder what tech could achieve that? :) And if it's not that, it will be something else. Some technologies are simply too fundamentally innovative to not find a major use case.
Solo mining is not really helping the network, or yourself. The chances of ever hitting a block are far too small, on the thinnest edge of nonexistent. Better to join a pool that is ethically in line with your goals, so you have a chance of getting your voice heard and helping to diversify the mining and direction of the network.

The network needs:
- transparency in transaction policies
- transparency in miner voting procedures
- miners taking part in the voting
- well connected Bitcoin nodes with sufficient RAM and CPU for relaying transactions and blocks
- well connected Bitcoin nodes with storage for the whole blockchain to seed new nodes joining the network
- public Electrum servers
- more users running their own Electrum servers, shielding their wallet privacy

It is quite sad that Bitcoin Core has killed the idea of SPV. While I understand the reasoning, SPV bloom filters have great potential compared to Electrum. I would much rather have seen the above list include the option to run an SPV-enabled node (bloom filters + txindex), but today that is not a viable option for BTC.
I understand the importance of LLMs here, but if a problem is quantitative by nature (numbers) it should not be addressed with a natural language model in general. The data that I have so far is all numbers (117k rows) and 25 columns. The only way LLMs can be used here is to finetune one with historical discussions of legit people talking about markets and crypto on different days, along with the actual market move the next day or next 24 hours etc.... But this requires really good discussion data or news data (Reddit API, Google API, Twitter API) with a lot of cleaning, and a shit ton of RAM in the cloud
I don’t see those ever being a problem. Bandwidth, storage, and ram just get cheaper and bigger. I remember fighting with the order I loaded drivers on my computer to get everything to fit into 640K of RAM. Our bandwidth was 1200kbps. For you kids under 40 years old, that ‘k’ is not a typo. We maxed out in kilobytes. Today we have blown past that, and blown past megabytes, and we are cruising past gigabytes for memory and bandwidth. Storage is in terabytes.
The HP machine I’m looking at has a Ryzen 5 and 16GB RAM, so it sounds like it should be in good shape. Any reason for not having a home server on the same machine? The plug-and-play setup Umbrel offers for sale is for that purpose specifically.
Set up Umbrel as a dedicated Bitcoin server with Bitcoin/Lightning-only stuff. Don't mix it up with other home server things; use a different machine for those. Security-wise it's better to reduce the attack surface, that's why. Other than that, Umbrel works quite well and it's really stable. Go with a 2TB SSD for the Bitcoin things though. Make sure to have at least 8 gigs of RAM, 16 is still better. Also make sure your CPU is at least a generation 6 CPU. I have been testing different Bitcoin node suites for some years already. If you have further questions feel free to head back here.
Well, you can set up a pool for 505 ADA, 24GB RAM and 250GB Storage, just have to make sure you can cover the fixed fee of 340 ADA per epoch (5days) - pretty low costs if you can get sufficient delegators.
Found a refurbished Lenovo ThinkCentre M93p Tiny Desktop PC i7 4765T 16GB RAM 256GB SSD for $150
Get a used Lenovo Tiny or a Dell OptiPlex Micro with a CPU of at least generation 6 (the smallest would be a 6100T), 8 gigs of RAM (16 is better but not necessary), and a 2 TB SSD. This will make you future proof for at least 5 years and will cost you just around 200 bucks. Umbrel will run very well on this machine, no matter which Bitcoin services (Lightning, Electrum Server, LNbits, your own mempool.space, BTCPayServer, or anything else) you run on top.
PoS still requires work - just not brute-force hashing. Ethereum validators use electricity and computing resources (CPU, RAM, network bandwidth) to perform cryptographic operations, validate blocks, and maintain uptime - so energy is consumed, and technically, some physical "work" is done. Whether you walk 10 miles or drive 10 miles, you still accomplish the same task: getting from point A to point B. Walking takes much more physical effort, while driving is far more efficient. Similarly, both Proof-of-Work and Proof-of-Stake aim to securely validate transactions and reach consensus - but PoW consumes far more energy to do so, while PoS achieves the same result with significantly less physical effort.
Device: Linux running in RAM only / WiFi - Bluetooth card removed / no hard drive. Connection: Proton claims the connection has no logs; otherwise, how can I check whether it has been tampered with or not?
Hey! I totally get the Umbrel storage struggle For your Raspberry Pi 4 (8GB RAM, nice!), try RaspiBlitz. It’s beginner-friendly, has a web interface, and lets you sync the blockchain to your 2TB SSD easily. Just flash it to your microSD and follow the setup. myNode is another simple option. Make sure your SSD is ext4 formatted! Check r/raspiblitz for tips. Good luck, you got this!
I got you on a ChatGPT answer if you want. This was good for me too, lots to learn. Public Electrum servers are able to return your balance in milliseconds because they don’t rescan every block on-demand like a vanilla Bitcoin Core wallet does. Instead, they maintain a continuously-updated, pre-built index of every UTXO and transaction history, stored in a high-performance database on fast hardware with tuned caches. Here are the key factors: ⸻ 1. Specialized indexing software • ElectrumX, Esplora/Electrs, and Fulcrum each build and maintain a full “address → UTXO/tx history” index as new blocks arrive. Your wallet’s balance lookup then becomes a simple database query—no full-chain scan needed.  • By contrast, when you point a wallet at a bare Bitcoin Core node (even with txindex=1), the wallet’s RPC rescan must walk every block output and check each script against your keys, which is inherently O(chain-size) and slow. ⸻ 2. High-I/O, low-latency storage • Public servers run on SSDs or NVMe drives (often in RAID), delivering thousands of IOPS so their indexer can write new blocks and serve random reads at lightning speed. Even a modest ElectrumX instance “is I/O-bound … SSD’s are definitely recommended” —and enterprise hosts use NVMe for even higher throughput. • Example (AWS testbed): • Data disk: 1 TB gp2 (3,000 IOPS) for the Electrum index • Bitcoin data: 600 GB gp2 (1,800 IOPS) for bitcoind’s block files • Result: balance queries over a cold cache complete in under 50 ms  ⸻ 3. Sufficient RAM for caching • ElectrumX is typically run with 2 GB+ of cache (CACHE_MB = 2048), and LevelDB’s own block cache (DB_CACHE ≈ 1,200–1,800 MB), so most lookups hit memory rather than disk.   • Even a single-user Electrs instance recommends 16 GB RAM to keep its embedded database hot.  ⸻ 4. Tuned software settings • ElectrumX config tweaks commonly used on public servers: COST_SOFT_LIMIT = 0 COST_HARD_LIMIT = 0 CACHE_MB = 2048 This disables internal rate-limiting and maximizes in-memory caching.  • Fulcrum sets txhash_cache=2000 to keep recent transaction lookups in RAM.  • Esplora (Blockstream’s server) uses a “constant-time caching” schema so addresses above a threshold get fully cached.  ⸻ 5. CPU and concurrency • Indexing a new block is parallelized, and query handling is asynchronous. Single-core speeds matter less once the index is built, but public hosts often use multi-core Xeons or equivalent to absorb spikes in demand.  • Your gaming-PC CPU may be fast, but if it’s paired with a spinning disk or limited DB cache, your wallet’s RPC rescan still bottlenecks on I/O and single-threaded script-matching. ⸻ How to speed up your local setup 1. Use an Electrum-style indexer locally (e.g. run ElectrumX, Electrs or Fulcrum against your node) instead of pointing Electrum directly at bitcoind. 2. Move your data directory to an SSD/NVMe, and give your indexer its own fast volume. 3. Increase DB cache in your server config (CACHE_MB, DB_CACHE) to keep more of the index in RAM. 4. Ensure your Bitcoin Core is started with -txindex=1 (if using ElectrumX/Fulcrum) or -blockfilterindex=1 (with descriptor wallets) so the indexer can pull historic data without re-scanning blocks itself. By adopting the same hardware profile (SSD + ≥16 GB RAM + decent single-core CPU) and software tuning that public hosts use, your local Electrum server will likewise return balance and history queries in milliseconds instead of minutes.
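A toy illustration of the core point in that answer (a prebuilt address-to-history index turns a balance query into a dictionary lookup); this is not ElectrumX code, just a sketch with made-up data:

```python
# Toy version of the indexing idea (not real ElectrumX code, data is made up):
# build an address -> history map once, then balance queries are dict lookups
# instead of an O(chain-size) rescan.
from collections import defaultdict

blocks = [
    [("addr_a", 5), ("addr_b", 2)],   # block 0 outputs: (address, amount)
    [("addr_a", 1), ("addr_c", 7)],   # block 1
    [("addr_b", 3)],                  # block 2
]

index = defaultdict(list)
for height, block in enumerate(blocks):        # done incrementally as blocks arrive
    for address, amount in block:
        index[address].append((height, amount))

def balance(address):
    return sum(amount for _, amount in index[address])

print(balance("addr_a"))   # 6, without touching the rest of the "chain"
```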
Solana actually requires 256 GB of RAM...
It’s not "less RAM = better tech". It’s that Ethereum scales without needing high-end hardware. Solana’s RAM-heavy design makes it hard for regular users to run nodes — which hurts decentralization. Ethereum keeps hardware light and achieves massive scale by going modular: rollups handle execution, EigenDA handles data — all secured by Ethereum’s base layer. That’s a superior technical design.
Recommended settings for running an algorand node: CPU 8 vCPU, RAM 16GB of RAM, Storage 100 GB NVMe SSD Network Bandwidth 1 Gbps
Can someone explain? Is this irony? Why does less RAM make Ethereum better tech than Solana? Solana has more RAM, which is better, no?
Start it with `--dbcache=10000` to allocate a 10 GB cache. The optimal value depends on how much RAM you have. A larger cache can significantly improve download speed—but only after the cache has been filled.
I recommend buying your own rig. Much cheaper. I went on eBay and bought an HP Elitedesk 800 G3 Mini with an i5 6th gen quad core 3.2Ghz. Came with 8GB RAM and no disk or power supply for $55. I added 8GB of RAM ($12), bought a power supply ($13), a new cooling fan ($13), and used a 2TB SATA SSD I already had. It runs fantastic. I do Bitcoin Node, Electrs, mempool, and Pi-Hole for DNS-level ad filtering. I generated my own SSL cert, opened some port forwards, use an nginx reverse proxy and UFW firewall, and got a free DNS name from changeip and attached it to my ISP router. I'm able to connect apps like BlueWallet on iPhone and Sparrow Wallet on a Windows PC to my own node for broadcasting transactions, and it's all secured with SSL certs and TLS 1.2/1.3. It's a bit complicated to set up, but RaspiBlitz has good guides on how to set this part up. Since Umbrel runs in a Docker container, anytime I get an update, it wipes out the reverse proxy, firewall, and SSL cert. But it only takes me 10 minutes to set it up again after an update. Since I generate my own SSL certs, I consider this a safe method, because I generate a new cert key each time Umbrel is updated. If what I just typed has your head spinning, you may want to buy the premade node from Umbrel. I'm able to do this method because I'm an IT and network admin. Most people would struggle to do this on their own, and reinstalling all proxies, firewalls, and certs after an update may be a headache you don't want to deal with.
There are 10 000 000 000 possible passwords. In Atomic Wallet your password is run through a password-hashing KDF (e.g. scrypt, PBKDF2 or similar) that uses a random salt and an intentionally high work factor (many iterations and/or memory-hard operations). That means each guess takes on the order of hundreds of milliseconds (or more) of CPU+RAM work, drastically slowing down any brute-force attack. Below is an estimate of how long it would take to exhaustively try every all-digit password of length 4–10, given two attack rates:

* **Normal CPU**: \~10 password guesses per second (a typical 4-core desktop).
* **Fast server**: \~150 guesses per second (e.g. a 48-thread machine with \~150 H/s).

|Length|Combinations|Time @ 10 H/s|Time @ 150 H/s|
|:-|:-|:-|:-|
|4|10 000|16 minutes 40 seconds|1 minute 6 seconds|
|5|100 000|2 hours 46 minutes|11 minutes 6 seconds|
|6|1 000 000|1 day 3 hours|1 hour 51 minutes|
|7|10 000 000|11 days 13 hours|18 hours 31 minutes|
|8|100 000 000|115 days 17 hours|7 days 17 hours|
|9|1 000 000 000|3 years 62 days|77 days 3 hours|
|10|10 000 000 000|31 years 259 days|2 years 41 days|

If your password was up to 7 digits long, then it would take up to 2 weeks to crack it on your computer. In the worst-case scenario, that your password was 10 digits long, you would need to rent a fast server for about 2 years to get it solved. Easy!
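Those rows follow directly from combinations divided by guess rate; a couple of them re-derived under the same assumed rates:

```python
# Re-deriving a couple of rows of the table above from the same assumptions:
# all-digit passwords, ~10 guesses/s on a desktop, ~150 guesses/s on a fast server.
from datetime import timedelta

for length in (7, 10):
    combinations = 10 ** length
    for rate in (10, 150):
        print(f"{length} digits @ {rate} H/s: {timedelta(seconds=combinations / rate)}")
# 7 digits @ 10 H/s -> ~11 days 13 hours, matching the table's worst case.
```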
The answer is different for ETH, because its blockchain is many terabytes. A comparison ... https://blog.lopp.net/2021-altcoin-node-sync-tests/ > mass adoption Bitcoin constrains resource usage (mainly RAM requirements, ask if you want this explained) by limiting the size of a block, and the block interval. This incidentally flows to other resources. It's very light on CPU and network, and a single affordable HDD will store almost 200 years of blockchain This constraint triggers the scaling debate. But is there a scaling issue? No. Would there be a scaling issue with mass adoption? Who cares? There has been no sign of mass adoption. The expectation of mass adoption is unrealistic. It's probably a happy coincidence. By carefully constraining resource usage, the node network is cheap to run forever. And the level of actual adoption is well within the constraints for at least 50 more years As you can see in Lopp's other articles about initialization times for Bitcoin nodes, the main unbounded constraint is the linear time increase for initializing a new node. How long is too long? Already, Lopp's 10-hour time is too long for many, and is currently increasing by about 8% per year. For an operator with an HDD, the time is about 60 hours Nothing else has adoption anywhere close to Bitcoin. The "Bitcoin equivalent" blockchains (LTC, DOGE, Monero) have so little usage that they don't even function as spillover options for when Bitcoin is occasionally congested. ETH is bloated from its not-Bitcoin design, an indicator of the folly of arbitrarily discarding Bitcoin constraints. ETH has the remarkable twin features of almost no usage, and a bloated chain making it too expensive to operate a node
Base is now 2x faster than Solana. Solana also chokes under volume, as seen with the Trump meme release. [https://www.youtube.com/watch?v=12G2nD821YE](https://www.youtube.com/watch?v=12G2nD821YE) Did you think this through? Solana's chain is over 300 TB in size. It becomes a nightmare to manage, and they will probably archive out older transactions to trim it down. Solana validators require 10G connections and 256 GB RAM. That by design makes it centralized. How many home validators can match these specs? [https://www.helius.dev/blog/how-to-set-up-a-solana-validator](https://www.helius.dev/blog/how-to-set-up-a-solana-validator)

>Currently, the minimum recommendation *is* 12 cores/24 threads, 256GB RAM, 2x 1TB SSD disks (ideally w/ RAID0), and a *10GB* internet *connection*.
I was thinking of a SER8 Mini PC, AMD Ryzen 7 8745HS (4nm, 8C/16T) up to 4.9GHz, 24GB DDR5 RAM, 8TB M.2 NVMe, WiFi6/BT5.2/2.5Gbps - a normal Ryzen with upgraded storage. Do you think this would be good for 3 to 4 years? Thank you.
This has been discussed many times before, but I'll explain it again

>Why isn't Bitcoin's block size increased?

The limit was already increased in 2017 from 1 MB to 4 million units of weight, or up to a ~3.7 MB block limit. Anyone who tells you otherwise is lying or ignorant.

Some context for beginners to understand scaling capacity in Bitcoin: Satoshi Nakamoto originally set a 1MB blocksize limit on Bitcoin to prevent certain resource attacks and keep Bitcoin decentralized. He suggested some ways we can scale Bitcoin by introducing us to the idea of payment channels and ways to replace unconfirmed transactions (RBF) for a fee market before he disappeared. The first version of Bitcoin released had a version of replacing transactions as well https://github.com/trottier/original-bitcoin/blob/master/src/main.cpp#L434 https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2013-April/002417.html

Over the years there have been many opinions and disputes as to how to scale Bitcoin, from keeping the limit as is, to scaling mostly onchain with large blocks, to a multi-layered approach and every variation in between. In 2017 the Bitcoin community finally removed the 1MB limit after coming to consensus over a path forward. https://github.com/bitcoin/bitcoin/blob/master/src/consensus/consensus.h The 1MB limit was removed and replaced with a larger limit of 4 million units of weight - https://www.reddit.com/r/BitcoinBeginners/comments/ghqcqn/bitcoin_bubble_or_revolution/fqa72j1/

Bitcoin is taking the approach of scaling with many solutions at once. With larger blocks, hard drive capacity is the least of our concerns: archival full nodes contain the full blockchain and allow new nodes to bootstrap from them, while pruned nodes can get down to around ~5GB and have all the same security and privacy benefits of archival nodes, but need to initially download the whole blockchain for full validation before deleting it (it actually prunes as it validates).

The primary resource concerns, in order from largest to smallest, are:

1) UTXO bloat (increases CPU and RAM costs)
2) Block propagation latency (causing centralization of mining)
3) Bandwidth costs
4) IBD (Initial Block Download) - the cost of bootstrapping a new node
5) Blockchain storage (largely mitigated by pruning, but some full archival nodes still need to exist in a decentralized manner)

This means we need to scale conservatively and intelligently. We must scale with every means necessary: onchain, decentralized payment channels, offchain private channels, optimizations like MAST and Schnorr sig aggregation, and possibly sidechains/drivechains/statechains/fedimint/cashu must be used. Raising the blockweight limits in the future is not completely opposed - https://bitcoin.org/en/bitcoin-core/capacity-increases https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011865.html

>"Further out, there are several proposals related to flex caps or incentive-aligned dynamic block size controls based on allowing miners to produce larger blocks at some cost."

But raising the blocksize further than 4 million units of weight also might not be needed, depending on how all the other solutions come to fruition.
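For the "4 million units of weight" figure, a small sketch of the segwit weight rule it refers to (weight = 3 × stripped size + total size, capped at 4,000,000), which is why legacy-only blocks still top out near 1 MB:

```python
# Sketch of the weight rule referenced above: blocks are capped at 4,000,000
# weight units, where weight = 3 * stripped_size + total_size (BIP 141).
MAX_BLOCK_WEIGHT = 4_000_000

def block_weight(stripped_bytes, total_bytes):
    # stripped_bytes: serialization without witness data; total_bytes: with it
    return 3 * stripped_bytes + total_bytes

# A block with no witness data (legacy transactions only) still tops out near 1 MB:
print(MAX_BLOCK_WEIGHT // 4)   # 1,000,000 bytes
# Witness-heavy blocks can be larger on disk while staying under the cap,
# which is where the "up to ~3.7 MB in practice" figure comes from.
```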
I have a MinisForum GK41 with 8GB RAM and a Celeron J4125, and I gave it an extra 1TB SSD; it works great as a pruned node. If you don't want to spend too much money, that's a good place to start, but I would recommend something a little more beefy if you want to future proof, maybe the same model but with more RAM and SSD, as you don't need a powerful processor to run Bitcoin, Lightning or anything Bitcoin related, just memory.
If an old laptop already works, and you like tinkering yourself, and fiat money is no issue, you might even consider building the ultimate future-proof node. With at least 8TB dual SSDs, 128GB DDR5 RAM and a >2GHz base frequency processor. How fast would that finish the initial blockchain sync?
You didn't list your RAM or CPU, which are more pertinent than your connection capability
My node is running on an older desktop that has 24GB of RAM with an i7 processor. I think that should be sufficient.
How much RAM does your node have? The UTXO set has to fit in memory or the download will crawl. 32GB of RAM should work well; 16GB is not quite sufficient. It is not such a big problem once it has caught up with the blockchain. If you want to run the node on a smaller computer, then maybe consider using a bigger computer for the initial download and then syncing over the blockchain and chain state to the smaller computer. There is no trust issue in you synchronising the chain state between two computers you trust.
Increase database cache – in bitcoin.conf, set dbcache=4000 (or higher if you have RAM to spare). It's more than 500 GB now lol
There are lots of RWAs worthy of attention in my opinion, with RAM in particular being one of my major watches at the moment, as the decentralised and low-latency storage it provides gains even more on-chain recognition, warranting further demand and value, as well as indicating its long-term viability
RAM is one of my faves. Serves as a RWA on the EOS ecosystem.
That's the best approach if you ask me. I'd rather opt for high-potential holds for the long term tbh, with the likes of RAM occupying top spots on my watch list, with on-chain utilities that'll guarantee their stay in the long run. Do make your own findings