Reddit Posts
Bitcoin Filters Work By Default, and That's a Good Thing | To Filter Spam From Your Bitcoin Core Node, set “permitbaremultisig=0” & “datacarrier=0” in your Bitcoin.conf File | Use "blocksonly=1" to turn off your mempool entirely
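For readers who want to apply the settings named in that title, here is a minimal bitcoin.conf sketch. The three option names (`datacarrier`, `permitbaremultisig`, `blocksonly`) are real Bitcoin Core options; the comments are my own summary of their effect.

```ini
# bitcoin.conf -- filter common "spam" transaction types from the mempool
datacarrier=0          # reject OP_RETURN data-carrier transactions
permitbaremultisig=0   # reject bare (non-P2SH) multisig outputs

# Alternatively, as the title suggests, disable the mempool entirely:
# blocksonly=1         # relay/validate blocks only, no loose transactions
```

Note that `blocksonly=1` is the blunter tool: the node stops relaying unconfirmed transactions altogether, while the two filter options only affect which transactions it accepts and relays.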
Cartesi: A rollup (and CPU) for every dApp developer | Avail Whiteboard Series
White Paper: Communication Through Bitcoin App
What altcoins are suitable for mining on low-end PCs?
GROQ | Missed out on GROK? Here is your chance to buy GROQ! 0/0 Tax | LP locked | Ath Coming !!!
Dual EPYC 7742 CPU Mining RandomX Hashrates
Blocx - x11 - all in one computer manager - whitepaper & roadmap released - governance (dao) released - coinstore listing on 28th
The blockchain today vs. tomorrow
The Most ASIC-Resistant Coin Nobody Has Ever Told You About
How to keep your computer clean and minimize risk for malware.
The Beginner's Guide to PoW and PoS ! Learn about Proof-of-Work and Proof-of-Stake !
Which mobile phone is best for multiple crypto wallet ?
Downfall: Threat to crypto projects?
Satoshi was, is, and will be, an AI from the future.
BLOCX - POW/POS - X11 - All in one computer manager
Utopia Messenger provides 100% security on your communication + ChatGPT assistant.
How to MINE Crypto with your PC or Laptop: GPU and CPU mining
Can someone tell me what exactly WhiteBIT are smoking?
Decentralizing Online Video: Discover the Power of AIWORK
Medium and small Bitcoin miners are at risk: “It is not profitable anymore” — The constant increase in Bitcoin mining difficulty raises questions about the profitability of the business. I talked with some miners for insights regarding the activity.
Buying Bitcoin is easy today, because we have CEX and DEX. In the early days you would have to mine for it, visit scammy websites to buy it, find people for P2P on Bitcoin forums etc. The truth is - we need CEX.
I'm working on a Time of Death AI for my crypto holding (update for those who seen the template document)
I've followed all the instructions, but syncing my full node is taking for-f'ing ever...
Cardano: An in-depth look at its advantages and disadvantages
The software security argument why Ledger Recover is a security risk
The software security and scientific argument why Ledger Recover is a security risk
Nano: An in-depth look at its positives and negatives to see why it's dying
How you can use crypto for good causes.
Algorand: An in-depth look at its advantages and disadvantages
OctaSpace (OCTA) distributed computing project up 275% in just under a month
Why most crypto users would rather mine fiat than crypto?
Several million constraints for an individual, unconstrained scalability for mankind.
About privacy, and how Monero (XMR) helps
Why Monero is a Better Choice than Bitcoin for Privacy-Oriented Users
Do you know Satoshi created Bitcoin in reaction to the 2007 global financial crisis, to give people around the world a choice
Love Banano? Gridcoin does everything Banano does, but better.
How to: Mine Moons using hardware equipment! 💻 🌓
You need to be a multi-millionaire to simply have a chance of being a validator on the Binance Smart Chain network. And it gets worse from there. Seriously, how did we ever accept this?
Just scored 29 points on Stress My GPU's CPU benchmark
Why are these in my CPU files under other companies' names, and why can't I access my wallets of the BTC I've developed as a licensed MIT developer for bitcoin.org? Please help, I'm being robbed
The most ASIC-resistant crypto has been around since 2013 and you've probably never heard of it
ASIC Resistance - Why it matters and who is doing it right
M2 is not just an Apple CPU… it WAS the reason why the US dollar was going down
$1500 DeSci Coin Giveaway and AMA w/ Curecoin, Gridcoin, Etica
Bitcoin - $BC | CMC Listed| Big Marketing Campaign | Strong Community
Bitcoin - $BC | A Peer-to-Peer Electronic Cash System | Big Marketing Campaign | Strong Community
Why mining pools having huge hash shares is a bad thing. They can censor transactions. Individual pool miners have no control over what goes into the blockchain. Only the pool owner does.
What is Monero (XMR)? A beginner’s guide
"Master decryption key" for the whole Secret Network extracted via AepicLeak CPU bug
Ferrite Core v1.0.0 compiled and uploaded on Github today
Mentions
You can't mine Bitcoin on a phone at a speed that would generate any money whatsoever. You can mine a CPU coin on a phone, and I did this at one point to pay for my VPN, but the phone only made $5 in 30 days. If you try to install a product that "shares bandwidth", what you are actually running is something that lets other people use your connection for, at best, click fraud / AI scraping and, at worst, unlawful content. This can get you blocked by Cloudflare and a bunch of other CDNs when you start sending out a bunch of bullshit traffic. Do not install this. It can have legal consequences for you. Do not install this. So anyway, the overall thing is: if you have an old Pixel you can make two dollars a month mining a CPU coin and then swapping it to Bitcoin, which honestly will not even cover the five or so watts the phone uses. And even if it did, it's $2. Lmao.
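The economics in that comment can be checked with a couple of lines of arithmetic. This is a sketch: the $2/month revenue and 5 W draw come from the comment itself; the electricity price is an assumed $0.30/kWh.

```python
# Phone CPU-mining economics (figures from the comment above;
# electricity price is an assumption, not from the comment).
revenue_usd = 2.00      # claimed monthly payout for an old Pixel
power_w = 5             # claimed phone draw while mining
price_per_kwh = 0.30    # assumed residential electricity price

energy_kwh = power_w / 1000 * 24 * 30   # 3.6 kWh per month
cost_usd = energy_kwh * price_per_kwh   # ~$1.08 of electricity
profit_usd = revenue_usd - cost_usd     # ~$0.92 -- pocket change either way
```

At roughly $0.56/kWh or above the electricity cost exceeds the $2 payout, which matches the commenter's pessimism for expensive-power regions.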
Personally I think DCA (bit by bit) might be a safer play while Bitcoin is near its all-time high. Although, since the projected end game is the Bitcoin price far surpassing $100k, it doesn't matter too much. Strike is good; just make sure your ID is verified and you're able to withdraw. A cold wallet is the only way to go for long-term storage. I use a Jade in stateless mode, much like a SeedSigner hardware wallet. Mining is now primarily run on specialized ASIC hardware, not the CPU or GPU in your computer, and you'd need a lot of ASICs and near-free electricity to see any profitability, but something like a Bitaxe is a fun lottery miner.
I lost a hard drive with a wallet that had roughly 150BTC on it that I mined in the beginning with a CPU, back when that was still a thing. Have no clue where that drive ended up, but it’s likely with all the other stuff I lost in like 2013 :/
Choosing to do SETI @ HOME over Bitcoin back when you could run it off a CPU/GPU.
2008. I could have fucking mined with my CPU if I just read a few more sentences before clicking away.
What you’re seeing is front running bots. 1/2 the cause of all the rug pulls you guys suffer from trying to trade shit coins. When you’re ready to get in on the ground floor of a real project, PM me. It’s live, it’s CPU mineable, and it’s built on proven technology. No bullshit involved. Mine it right now on a dusty old laptop and hit blocks. Isn’t even exchange listed yet.
If Monero becomes illegal, I will be a criminal, heh. Monero is decentralized and open-source; nah, they can't destroy Monero, Monero is anti-fragile. How to buy Monero: use Retoswap.com, Cyphergoat, Trocador.app, and the Serai exchange in the future. How to mine Monero: use your CPU + P2Pool or solo mining (or create your own mining pool; all these options make Monero more decentralized). How to store Monero: use Monerujo or Feather Wallet + Sidekick, or Anonero. (And there are wallets that use a pen drive with TailsOS.) See more in r/Monero and Monero.eco
Buy Monero on DEXs like Retoswap.com, or mine it with your CPU via P2Pool. There are also Cyphergoat and Trocador.app.
Excuse me for being ignorant; I just made a suggestion about Stratum V2, and I didn't mean to be an ass either. So take this below with a GRAIN OF SALT. I am focusing on the hash rate part. All I know is that Stratum V2 is going to reshape the way choosing a pool works; as a noob I could be wrong, though... that's why I did not go further into details. I am not here to mislead someone... Tbh, I don't think an AI will help with what you're asking, as to "Why do people keep mining in those gigantic pools?" As a general rule: people don't like to be the first mover, and they do not want to take the risk of being separated from the mass, but once it has started, a domino effect will come into play... if you ask me... My personal research leads me to THINK: a pool holds many individual miners that are free to unplug, so how could they gather to rewrite the blockchain, as you suggest? So I personally will not worry; since energy is well distributed on the planet, there will always be miners of last resort (be it an individual, another pool of miners, or countries). The way I see it, mining is more a moving game theory, and even pools could fail if they do not manage their activity correctly. Why not imagine states mining in the future, competing with each other (rendering today's pools obsolete)? Even considering this: if Moore's law hits a limit, the deflationary aspect of the economy might make it profitable for the pleb to mine on their own (cheaper ASICs), as it was intended from the start (CPU), leading to more decentralization. Unpopular opinion: I am not an AI fan for doing research; I prefer spending time to "understand" topics and fill in the blanks if needed. It is very time intensive, though! TL;DR: I am not an AI fan and I don't know if I can help you further
I mined 5 BTC with my gaming PC in 2011, when BTC was $2-3 USD/BTC. I mined on the Core2 Quad CPU itself, and both GPUs (a Radeon HD 6870 and a Radeon HD 4670), and was part of a mining pool that took 2% off the top and paid out the rest a few times a day. I thought that BTC was going nowhere and sold what I had in my Mt Gox account for about $10 in November 2011, right before I went to prison for six years. When I got out of prison in 2017 I found that I still had 1.15 BTC left in my wallet.dat on the PC. I moved it to a more modern wallet and sold some of it (at about $10k USD/BTC) to help pay fines/debts. I sold some more about five years later at around $60k USD/BTC to help build a house. I still have 0.1 BTC left in that wallet that I don't think I'll ever sell. At the end of 2022 when BTC crashed down to $15k I bought a whole coin and change and have that in another wallet. We're doing pretty great overall, but I really regret selling that BTC at under $3 in 2011 😅
You're absolutely right: realistically, scanning such huge ranges on a phone isn't practical. Mobile apps like mine are more for learning, experimentation, and exploring how keyhunt-style scanning works. In real-world puzzle-solving, tools like KeyHunt (CPU-based) or BitCrack (CUDA/GPU-based) on desktops are statistically far more capable. **For example:** 📱 Xiaomi Mi 8 Lite → ~100 keys/second 📱 Poco X3 Pro → around 3-4× faster than that (~300-400 keys/second). On a modern PC with proper GPU support, you can easily reach millions of keys per second. So yeah, the mobile version isn't for serious cracking, but it helps visualize and test ideas while on the go. But hey, maybe you're just *really* lucky 🍀 Might be worth a try, right?
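To put those scan rates in perspective, here is a back-of-the-envelope sketch. The 2^65-key range is my assumption (roughly the size of one of the harder Bitcoin puzzle ranges); the rates are taken from the comment above.

```python
# Worst-case time to exhaustively sweep a keyspace at a fixed scan rate.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_scan(keyspace: int, keys_per_second: float) -> float:
    """Years needed to try every key in the range at the given rate."""
    return keyspace / keys_per_second / SECONDS_PER_YEAR

keyspace = 2 ** 65                      # assumed puzzle-sized range
phone = years_to_scan(keyspace, 100)    # ~1.2e10 years on the Mi 8 Lite
desktop = years_to_scan(keyspace, 5e6)  # still ~2.3e5 years at 5M keys/s
```

Even the millions-of-keys-per-second desktop figure leaves an exhaustive sweep hopeless, which is why the commenter falls back on "maybe you're just lucky".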
I would say that Monero (XMR) is necessary so that you can protect yourself from a possible authoritarian government and to save your finances for the future. Monero is untraceable and fungible; I recommend that you take a look at the website Getmonero.org to find out more about it. Some methods of acquiring it are: CPU mining (Monero is easy to CPU mine thanks to its algorithm; I recommend you use P2Pool), and buying through DEXs such as RetoSwap, Basicswap, and UnstoppableSwap (and in the future through the Serai exchange). You can also sell things for Monero, through XMRBazaar for example; plus there's TradeOgre (all these methods are non-KYC). Call me if you have any questions!
Some friends and I have a small CPU farm; right now we're on TIG - it's paying the bills, but not much more. You have to scan daily to keep up: try to find something that's about to launch, prepare, follow, and mine the hell out of it for a couple of days. You might get lucky and make bank. I made $20k off OCTA in two days of mining and about the same from IXI over a longer period. Be prepared to HODL. Shit is out there; you have to be patient and willing to hodl.
>GTA 7 In 2040? You clearly have no idea what you're talking about. On a serious note, I believe the size will eventually get bigger. Not sure when, though. Many people including myself have PTSD from [the Blocksize War](https://youtu.be/6YtS5ZNuuTw) and it'll take a lot of effort to do it right, but it eventually gets done. Now, it isn't only about the cost of a drive: bigger blocks will need more CPU, RAM and a faster internet connection. So the cost of running a node will get higher. But once we see the ability to run a node on cheap Android (or hopefully Linux) phones, the size increase will make much more sense. Note to the bcash shills reading this, thinking why not use their shitty fork instead: because the vast majority of the network will have to agree on the increase. Not a scammer Roger posting a tweet about the hardfork. That's not how decentralized systems work. And yes, your dying shitcoin will be obsolete.
There are smart ways to implement it. Currently there are around 170 million UTXOs. If you use a hash map with relatively short hashes (RIPEMD-160/SHA-256) you can achieve around 5-10 billion checks a second on a modern 32-core CPU (AMD EPYC). The hash map will occupy around 20 GB of memory.
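A minimal sketch of that idea, assuming truncated SHA-256 digests of the outpoint as hash-map keys. The 8-byte truncation and the helper names are illustrative choices of mine, not from the comment.

```python
import hashlib

def outpoint_key(txid: bytes, vout: int) -> bytes:
    # Truncated digest as the hash-map key; 8 bytes keeps the odds of any
    # birthday collision across ~170M entries well under 1%, while
    # shrinking the per-entry memory footprint.
    return hashlib.sha256(txid + vout.to_bytes(4, "little")).digest()[:8]

utxo_keys = set()  # stand-in for the ~170M-entry UTXO hash map

def add_utxo(txid: bytes, vout: int) -> None:
    utxo_keys.add(outpoint_key(txid, vout))

def is_unspent(txid: bytes, vout: int) -> bool:
    return outpoint_key(txid, vout) in utxo_keys
```

In a compiled language with an open-addressing table, 170M 8-byte keys are only a few GB of raw data, consistent with the ~20 GB figure once table overhead is counted; Python's `set` is just the cheapest way to show the lookup shape.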
That post is extremely misleading. Supercomputers are general-purpose CPU/GPU machines that can do many kinds of theoretical calculation, while the BTC network is composed only of rigs incapable of doing anything besides computing the SHA-256 hash of a string. Your chart compares two completely different things.
Full bullshit Bitcoin ASICs are ultra-specialized chips that ONLY do SHA-256 hashing. That’s it. They can’t do addition, multiplication, handle RAM, or literally anything else. It’s like comparing an electric can opener to a Swiss Army knife and saying “look, the can opener is 1000x faster at opening cans!” Yeah, but that’s all it does. A modern supercomputer can execute billions of different instructions per second, run complex scientific calculations, physics simulations, machine learning… The entire Bitcoin network wouldn’t even be able to properly emulate a single CPU core due to network latency. So yeah, technically the Bitcoin network processes more TeraFLOPS… but for ONE cryptographic operation only. It’s like saying a factory that only makes bottle caps is “more productive” than a car factory because it outputs more units per hour. This impressive-looking metric means absolutely nothing in practice. It’s just basic crypto marketing to impress people who don’t understand the tech. This comparison is completely bogus IMO.
That dude developed GPU mining of Bitcoin and did the 10,000 thing many times over... Around 70,000 Bitcoin... He had mined so much Bitcoin that Satoshi was bothered about it and as such Laszlo did some heavy spending... Redistribution so to speak... Pizza guy was on a very short list of top Bitcoin people back then. He knew the significance and documented it. He certainly didn't realize the future value significance! He is largely credited for having started some frame of reference of value for Bitcoin! Beyond expanding from CPU mining.
The one that is titled "A Peer-to-Peer Electronic Cash System"? It was not intended to be like gold. But I'd be curious where in the white paper you are confidently referring to. Abstract. A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution. Digital signatures provide part of the solution, but the main benefits are lost if a trusted third party is still required to prevent double-spending. We propose a solution to the double-spending problem using a peer-to-peer network. The network timestamps transactions by hashing them into an ongoing chain of hash-based proof-of-work, forming a record that cannot be changed without redoing the proof-of-work. The longest chain not only serves as proof of the sequence of events witnessed, but proof that it came from the largest pool of CPU power. As long as a majority of CPU power is controlled by nodes that are not cooperating to attack the network, they'll generate the longest chain and outpace attackers. The network itself requires minimal structure. Messages are broadcast on a best effort basis, and nodes can leave and rejoin the network at will, accepting the longest proof-of-work chain as proof of what happened while they were gone.
I can't find any tool that can check passwords with a GPU. I use btcrecover, but it uses the CPU and it's very slow (1,000 p/s). The GPU option in btcrecover doesn't work for me/my wallet. Any suggestion for a GPU tool besides hashcat?
100% CPU means 100% fans. I tried this back in 2010 and it was a fucking Boeing screaming in my chamber.
Imagine a desktop tower PC from 2008-2009 or so running 100% across both CPU cores 24/7. Fans screaming at full speed incessantly. That’s not tolerable for most people, as most people expect their machines to be whisper quiet the majority of the time.
How fake can that be? Mining was done on CPUs back then.
First heard about it on Slashdot in 2011. Didn't give it much attention because: 1. Eats up CPU cycles with unlikely benefits (at least protein folding seemed more useful) 2. P2P. Didn't like the idea of randos connecting to me and eating up my precious bandwidth 3. The blockchain was already eating up a fair bit of disk space. Didn't want to give up my hard drive for that 4. It seemed like a more complicated version of a service Paypal was already offering Kept tabs on it over the next few years. I started to recognize the value but Mt. Gox was sketchy AF. Silk Road held appeal to me and didn't seem likely to ever be of use. I was right on both fronts there.
Check out the original white paper: https://bitcoin.org/bitcoin.pdf Original website: https://web.archive.org/web/20090131115053/http://bitcoin.org/ "Abstract. A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without the burdens of going through financial institutions. Digital signatures provide part of the solution, but the main benefits are lost if a trusted party is still required to prevent double-spending. We propose a solution to the double-spending problem using a peer-to-peer network. The network timestamps transactions by hashing them into an ongoing chain of hash-based proof-of-work, forming a record that cannot be changed without redoing the proof-of-work. The longest chain not only serves as proof of the sequence of events witnessed, but proof that it came from the largest pool of CPU power. As long as honest nodes control the most CPU power on the network, they can generate the longest chain and outpace any attackers. The network itself requires minimal structure. Messages are broadcasted on a best effort basis, and nodes can leave and rejoin the network at will, accepting the longest proof-of-work chain as proof of what happened while they were gone." -bitcoin.org, 2009
You can find this post to help on how to get XMR https://np.reddit.com/r/Monero/s/nTOOin13JR Also it is easy to mine XMR with a CPU running in the background
I needed 1BTC to pay for membership to a newzbin search service. It was $10 ! Naaah I thought. Can’t be assed. Would have taken a month to mine on my CPU (!!) at the time.
Solo mining is not really helping the network, or yourself. The chance of ever hitting a block is far too small, on the thinnest edge of nonexistent. Better to join a pool that is ethically in line with your goals, so you have a chance of getting your voice heard and of helping diversify the mining and direction of the network. The network needs:
- transparency in transaction policies
- transparency in miner voting procedures
- miners taking part in the voting
- well-connected Bitcoin nodes with sufficient RAM and CPU for relaying transactions and blocks
- well-connected Bitcoin nodes with storage for the whole blockchain to seed new nodes joining the network
- public Electrum servers
- more users running their own Electrum servers, shielding their wallet privacy
It is quite sad that Bitcoin Core has killed the idea of SPV. While I understand the reasoning, SPV bloom filters have great potential compared to Electrum. I would much rather have seen the above list include the option to run an SPV-enabled node (bloom filters + txindex), but today that is not a viable option for BTC.
#SickandBullish Come and check out our cult and fun vibe in the Tg channel. Fully transparent and doxxed dev (Josh) almost never sleeps. Some say that when he sleeps, his hands are still writing code and that the CPU of his desktop is cooled by the tears of a Sick Buffalo.
The CPU compute can be used for anything, really. At one stage it was estimated that QUBIC had more compute than El Capitan, the fastest supercomputer in the world. With that kind of compute, it can be outsourced to companies to do whatever work they want.
That is the whole idea. You can't train an AI with ASICs. The purpose is to collect as much CPU compute as possible and use it to train QUBIC's own AI, rather than rely on Nvidia GPU supply bottlenecks. Everyone can mine with a CPU.
Is it cool to mine Monero on free CPU time? Think it's possible to take over the network?
Vitalik has already gone against Trump's views multiple times and wants nothing to do with him or his party. Now let's be honest: nobody here is in it for the tech, and we're all here to make money. Solana or any other chain will jump to kiss the ring and get the mega pump that ETH should receive. Vitalik is too autistic to play ball and needs to get better at public affairs. BTC has Saylor, and ETH needs someone to push the narrative forward. The 5k/10k narrative can be real, but ETH really needs someone who can push that narrative to the public. Right now "BTC is digital gold" is the easiest sell. For ETH to hit 5k/10k you need to be able to sell it without explaining blockchains and layer-2 solutions to the masses. People invest heavily in Nvidia but cannot explain what a GPU does or how it differs from a CPU.
A) due to its restrictions it doesn't have the kind of futures/derivatives markets of other bluechip alts which tend to lead to manipulation/trend reversion properties B) it is more quantum resistant than other cryptos (even if the threat is overstated). the devs also have a culture of innovation/regular hard forks which could adapt to potential crisis C) mining is much more democratized due to an ASIC-resistant mining algo--anybody can still mine XMR with an average CPU. this relieves some sell pressure as big players aren't selling to fund huge energy intensive operations D) many people see the writing on the wall when it comes to censorship in the West. other cryptos have reasonable privacy features (LTC with MWEB) but Monero is the gold standard. Many bright devs made their way over to XMR from BTC and it fulfills BTC's original ethos in many ways go ahead, buy some XMR 😎 cake wallet makes it super easy to swap from LTC or whatever, no exchange necessary
Exactly. The fact that they’re still far from maxing out CPU utilization shows how much untapped potential there is. Once full utilization kicks in — not even counting future CPU inflow or algo tweaks — 500–800 MH/s is absolutely within reach. Add Tari merge mining and potential GPU deployment, and it's clear this could easily go over 50% of Monero's hashrate. It’s not just about mining rewards anymore. At that scale, the influence extends to consensus and long-term network dynamics. Don’t sleep on this.
Is it far-fetched? Just make sure you give miners the highest revenue versus others. In Qubic this is possible because PoW is not needed directly for the network's own security. Qubic is interested in both CPU and GPU computing power. ASICs make no sense because the algorithm will change very often. Regular hardware will win; see it as BOINC but with profits. Anyone can now mine with their (idle) home computer and get some profit.
Looking at the Qubic Discord, it seems they chose to mine XMR because it was one of the most complex coins to try to mine via PoW, and CfB stated that if it was successful they would be able to mine any coin; also because it's heavily CPU mined. Apparently there's a planned algorithm change in a month which will significantly increase the hashrate in favour of CPU mining rather than GPU. I've seen multiple posts on X over the last week where the hashrate has increased from 40 MH/s to now 150 MH/s. It's impressive, to say the least, and it's looking more profitable than mining any other coin.
Using idle CPU time to mine Monero can be seen as efficient resource use, but it risks breaching user trust if not transparently disclosed. Even small deviations from a platform's original mission can hurt its credibility. Reaching 51% of the hash rate on Monero is highly unlikely without massive coordinated infrastructure.
The man that bought the pizza was Laszlo Hanyecz. Most people today, even Bitcoiners, don’t know that name, but he was the first miner to use a GPU (rather than a CPU).
This man is famous for those two pizzas, but he also wrote the first code to mine BTC with a GPU instead of a CPU. Happy Pizza Day to everyone
Why is it not decentralized? I can run a mining node from my computer and it doesn't require an ASIC or powerful GPU, just my CPU and I'm contributing to the network. It is cash because it is fungible and works like paper money. No one is going to know my transactions history or any other information.
It can only be mined with a CPU. RandomX is ASIC-resistant and quantum-proof in some aspects
I started mining BTC on my gaming PC (with a mining pool) in 2011. It was profitable to mine on the CPU and on both GPUs, and over a period of about 6 months I had mined about 5 BTC, worth about $2-3 each. Everyone I talked to about it at the time was interested in the blockchain tech itself, and about how it could disrupt the entire finance industry. Sure we were making some money with our gaming PCs, but it was small change and we had no idea how valuable it would become. I ended up spending 6 years in prison (late 2011 to late 2017) so I was forced to HODL for that time. In 2020-2021 I sold a good chunk of my BTC ($30k-60k range) and put it toward building a house. It's crazy that we're in the $110k range now, but it looks like we still have a long ways to go before we hit the top. I wish I'd mined or bought more BTC way back then, but I'm still grateful that I've done as well as I have with it. I'm still a whole coiner 😎
The timing is really good actually. Akash does CPU compute, some others & Nvidia ofc do GPU compute, so I could see Acurast picking up a monopoly on mobile compute. Big for local AI inference, key to privacy, also on trend.
It won't be much; the PC will be loud and hot if you do. For CPU mining it's Monero; you could use other coins, but that would be the best overall imo. GPU? Fark, not much there, and not for your situation. The PC will be slower to use if you mine and dual mine. Not worth it tbh.
I heard about the wear on hardware, but isn't it pretty much limited to the GPU? I thought the CPU would be much more resistant to wear like that.
Check out: https://bitvmx.org/ BitVMX is a protocol designed for trustless, bridgeless interactions on the Bitcoin network. It enables a virtual CPU to optimistically execute arbitrary programs, which can be used to facilitate various functionalities, including trustless bridges. Here's a breakdown of how BitVMX relates to trustless bridgeless transfers:
* Trustless Bridges: BitVMX is designed to build trust-minimized and permissionless bridges, as seen with the UNION bridge connecting Bitcoin and Rootstock. These bridges aim to eliminate the need for central authorities or intermediaries by using cryptographic proofs and dispute-resolution mechanisms inherent in the BitVMX protocol.
* Bridgeless Interactions: The core concept of BitVMX is to enable complex computations and verifications directly on the Bitcoin layer without requiring assets to be moved to another chain. This suggests the possibility of "bridgeless" interactions in the sense that you're not necessarily transferring assets to a separate blockchain to engage in different functionalities. Instead, the logic and verification happen within the Bitcoin ecosystem itself.
* How it Works: BitVMX uses a challenge-response system where a "prover" executes a program, and "verifiers" can challenge the result if they suspect it's incorrect. The protocol uses hash chains of program traces to simplify the verification process. In case of a dispute, a specific part of the computation can be checked on-chain.
* UNION Bridge Example: The UNION bridge between Bitcoin and Rootstock leverages BitVMX's dispute-resolution model. Users can lock BTC on the Bitcoin blockchain, and an equivalent amount of RBTC (on Rootstock) is released. This allows users to interact with DeFi applications on Rootstock using their Bitcoin, with the process secured by the trustless mechanisms of BitVMX.
In summary, BitVMX is a technology that enables the creation of trustless bridges and facilitates trustless interactions within the Bitcoin ecosystem, potentially leading to "bridgeless" functionalities where the need to move assets to other chains is minimized or eliminated for certain use cases. The development and adoption of BitVMX are ongoing, with initiatives like BitVMX FORCE aiming to standardize and enhance its capabilities for the Bitcoin network.
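The challenge-response idea described above can be sketched as a bisection over a hash chain of execution-trace states. This is a toy model with my own naming, not BitVMX's actual wire protocol; it only illustrates why a dispute over a long trace needs just O(log n) commitment comparisons.

```python
import hashlib

def commitments(states):
    """Hash chain over a program trace: each entry commits to all prior steps."""
    h, chain = b"\x00" * 32, []
    for state in states:
        h = hashlib.sha256(h + state).digest()
        chain.append(h)
    return chain

def first_disputed_step(honest, claimed):
    """Binary search for the first step where the prover's chain diverges.

    Once found, that single step is cheap enough to re-execute and
    verify on-chain, settling the dispute.
    """
    lo, hi = 0, len(honest) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if honest[mid] == claimed[mid]:
            lo = mid + 1   # agreement so far; the fault lies later
        else:
            hi = mid       # divergence at or before mid
    return lo
```

For example, if a faulty prover corrupts step 9 of a 16-step trace, every commitment from step 9 onward differs (the chain propagates the change), and the search pins the dispute to step 9 in four comparisons.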
The first disruptive thing I'd do with a sufficiently powerful quantum CPU is destroy the entire financial system by stealing every penny from everyone's bank account. I'd lose interest in crypto at that point.
>Its not about the bandwidth , more about the blockchain size. keeping decentralization gets difficult with extreme blocksizes. have to balance the trilemma. what say you monerofolk Actually bandwidth is the greatest limitation to scaling extreme blocksizes because unlike flash memory and CPU processing power it is not simple for consumers and small business to purchase additional bandwidth.
Set up Umbrel as a dedicated Bitcoin server with Bitcoin/Lightning-only stuff. Don't mix it up with other home-server things; use a different machine for those. Security-wise it's better to reduce the attack surface, that's why. Other than that, Umbrel works quite well and it's really stable. Go with a 2 TB SSD for the Bitcoin things, though. Make sure to have at least 8 gigs of RAM, 16 is still better. Also make sure your CPU is at least a 6th-generation CPU. I have been testing different Bitcoin node suites for some years already. If you have further questions, feel free to head back here.
Bitcoin's network will become insecure within 20 years because of its failed security budget. Transaction fees will not provide enough financial incentive to continue mining and securing the network. Monero, on the other hand, with its energy-efficient CPU mining and tail emission, will continue to provide the financial incentive required to secure its network indefinitely.
I can explain it very easily. It is all built on one innovation: controlling a computer platform by majority vote, just like your country is controlled by majority vote. "Cryptocurrency" is only hard to understand because people misrepresent it.

In your country, you elect a government that is the central authority for a "block" of 4 years (the alternative is to always have the same central authority, i.e., a king or dictator). In a "blockchain", a "central authority" is elected each "block" of N units of time (10 minutes in Bitcoin). The alternative there, too, is to *always have the same central authority*, and this is how all computer platforms so far have worked: they are equivalent to a "king" or "dictator" in terms of control.

The election of the central authority is a sort of "indirect majority vote". Based on how many "votes" you have (how much CPU power in Bitcoin, how many coins in Ethereum, and in the future perhaps how many people-votes), you have a probability of being elected for the next "block". In a nation-state it is always the one with the most votes; in a "blockchain" you instead have a chance of being "elected" even if you hold only a small share of all the "votes", it is just less likely.
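That probabilistic "election" can be sketched in a few lines. This is a toy model, not anything protocol-accurate; the participant names and weights are made up:

```python
import random

# Toy sketch: each participant's chance of being elected "leader" for a
# block is proportional to their share of "votes" (hashpower in PoW,
# stake in PoS).
def elect_leader(votes: dict, rng: random.Random) -> str:
    names = list(votes)
    weights = [votes[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

rng = random.Random(42)  # seeded so the simulation is repeatable
votes = {"alice": 60, "bob": 30, "carol": 10}
wins = {n: 0 for n in votes}
for _ in range(10_000):
    wins[elect_leader(votes, rng)] += 1
# Over many rounds, win counts track vote shares (~60%/30%/10%),
# but even the smallest holder wins some blocks.
```

Unlike a national election, where 51% of the votes wins 100% of the time, here a 10% holder still produces roughly 10% of the blocks.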
Monero is a great CPU-minable coin, but it isn't usually profitable unless you have free electricity. It is, however, the cheapest way to acquire Monero and the best way to support the network. Also, if you're looking to build a rig for it, yesteryear's EPYC CPUs can be had pretty cheap on eBay, and for a few hundred dollars you can get a pretty good hash rate.
That's a 4th-generation CPU. Don't go with that one. It's too slow and too power-hungry.
Monero is mined by CPU, so you don't need any special hardware; any PC will work!
Get a used Lenovo Tiny or a Dell OptiPlex Micro with at least a 6th-generation Intel CPU (the smallest would be an i3-6100T), 8 GB of RAM (16 is better but not necessary), and a 2 TB SSD. This will make you future-proof for at least 5 years and will cost you only around 200 bucks. Umbrel will run very well on this machine, no matter which Bitcoin services you put on top (Lightning, Electrum server, LNbits, your own mempool.space, BTCPayServer, or anything else).
> That’s the slowdown I mean—spam pushes out normal transactions.

OK, that's not the definition of "slowdown" that the developers think of. Instead, we consider "slowdown" to be actual impact on CPU and memory usage that exceeds the average usage of a typical transaction. What you're describing is that the "spam" causes a high-fee market, and that's definitely a concern. However, high-fee markets, as we have seen with ordinals and many times long past, resolve themselves eventually when those "spammers" run out of money.

Furthermore, if those people want to make their transactions, they can and do submit their transactions directly to miners, bypassing mempools. This is trivially easy to do with things like MARA's Slipstream. When they do this, it actually makes things worse for everyone, since direct submission removes information from fee-estimation algorithms. Instead of being able to look at the mempool and see that feerates are high, you end up making a transaction with a fee that you think is enough, and then finding out when a block is found that it actually wasn't.

Lastly, Bitcoin is going to need a fee market eventually. Eventually the block subsidy will be negligible, and miners will have to be sustained by fees. Any way you slice it, there has to be some point in time where the mempool is consistently full and fees generally increase, as otherwise mining will stop being profitable and the security of the network will go down.

> Without the OP_RETURN limit, a bad actor (say the CIA or someone with money to burn) could put illegal stuff, like child pornography, into the blockchain

Even with the OP_RETURN limit, a bad actor can already do this. They can already do it with the inscriptions script construction, and that allows putting up to 4 MB into the blockchain, whereas OP_RETURN would be limited to 1 MB.
Furthermore, even without inscriptions, it's still possible by creating outputs with fake hashes or fake pubkeys that actually encode the data.

> They’re ignoring people who disagree (Samson Mow said there’s “no consensus”)

Can you clarify what has actually been done that you think is "ignoring people who disagree"? Having read through the entire mailing-list discussion and all of the comments on both PRs, I really don't think that's the case. Certainly there is no consensus, but there's been lively debate and discussion, and no action has actually been taken. Furthermore, several comments stated that they think the options should be preserved, and another PR was opened that does exactly that. That does not seem like ignoring people who disagree.

> silencing critics

Some of the biggest critics are wizkid057 and luke-jr. Their comments are still visible and readable, and in fact many people have (tried to) engage them in discussion. And not just them; there are tons of comments in both PRs with debate going back and forth, each side trying to refute the other. I would not say that this is silencing critics. While several comments in the PRs were hidden, they were hidden because they did not add to the conversation. Many of these were simply "Concept ACK" or "Concept NACK", which ultimately is not all that useful to the discussion and mainly ends up making it harder to read. Several comments were also hidden because they were off-topic or abusive. However, I do not think any comments that raised novel criticisms of the PR were actually hidden.

> (Giacomo Zucco’s comments got deleted)

There have not been many deleted comments. I count 4 deleted comments, and none of them appear to be from him. While I don't have a record of what the comments were, 3 of them were from accounts that look like spammers, and 1 was actually from a regular contributor. None of the accounts look like Giacomo's.
In fact, I don't think he has participated in the PR discussions at all, unless his GitHub name is not giacomozucco.

> some devs have financial ties to projects with conflicts of interests.

Can you cite any evidence for that? AFAIK, this claim seems to stem from the fact that Jameson Lopp is an investor in Citrea, a company that would benefit from this change but is already planning on deploying its product without it. Jameson commented on the PR, but he is neither the person who opened it nor the person who started the discussion. He's not even a Core dev, as he doesn't regularly participate in code review or submit PRs. From the perspective of the project, he's just another rando coming by to give his opinion.

> They downplay risks (like past spam from Veriblock)

I don't recall anyone referring to Veriblock or anyone citing risks from past spam.

> and take away our choices (they removed a setting to force us to accept this)

You write this as if a decision has been made and the PR merged. But no PR related to OP_RETURN has been merged: neither Peter Todd's original PR that removes the option, nor Greg Sanders' alternate PR that leaves the option in. In fact, of these 2 PRs, I think the one that's more likely to be merged is the second, which leaves the option in, but of course no merge decision has been made yet.

> That's my POV. The main point here is *how* this change is being implemented when it's deeply controversial.

My POV is that this has been severely overblown and there's a ton of misinformation going around about what's actually happened. It seems like there's a bunch of influencers going around screaming as if the world is ending, who also are getting their information second- or third-hand rather than having actually participated in the discussions and reporting what they observed.
PoS still requires work - just not brute-force hashing. Ethereum validators use electricity and computing resources (CPU, RAM, network bandwidth) to perform cryptographic operations, validate blocks, and maintain uptime - so energy is consumed, and technically, some physical "work" is done. Whether you walk 10 miles or drive 10 miles, you still accomplish the same task: getting from point A to point B. Walking takes much more physical effort, while driving is far more efficient. Similarly, both Proof-of-Work and Proof-of-Stake aim to securely validate transactions and reach consensus - but PoW consumes far more energy to do so, while PoS achieves the same result with significantly less physical effort.
It was still actually mined using CPUs by Satoshi and some others, not technically premined like your shitcoin. It was open for anyone else to join in the mining as well, but because the whole thing was new and there was no premined marketing budget from some shitcoin foundation, most people did not know about it. It's not an emotional reaction out of nowhere; it's years of being annoyed by shitcoiners who think their going-to-zero-against-Bitcoin shitcoin is actually better than the real thing.
In 2010 the main wallet was Bitcoin Core. This was also a full node with built-in CPU mining. It created a file called wallet.dat containing each address's private key. Buy an IDE-to-USB adapter and make an image of the entire disk. If anything goes wrong, seek professional hard-disk recovery help; 15 years is a long time and mechanical disks are fragile. [https://www.amazon.co.uk/FIDECO-Adapter-External-Converter-inches-Black/dp/B0919SF9CP](https://www.amazon.co.uk/FIDECO-Adapter-External-Converter-inches-Black/dp/B0919SF9CP)

Once you've got the image you could mount it (read-only) on Linux and search for the wallet.dat file. Once you have that file, accessing the Bitcoin should be trivial. Add the wallet.dat to a live Bitcoin Core [https://bitcoin.org/en/bitcoin-core/](https://bitcoin.org/en/bitcoin-core/) so it can check the live balance and you can send without messing with private keys.

Once the original wallet is empty you could look at the forks, like Bitcoin Cash, Bitcoin Gold, etc. I don't know about those; I guess they have near-zero value.
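If the filesystem is damaged and mounting fails, you can also search the raw image directly: old wallet.dat files are Berkeley DB databases. A rough sketch of that carving idea follows; the magic number and its offset are the commonly documented Berkeley DB b-tree header values, so treat this as a starting point, not a recovery tool (real tools like photorec do this properly):

```python
# Hypothetical sketch: scan a raw disk image for Berkeley DB b-tree pages,
# the on-disk format old Bitcoin Core wallet.dat files used. The b-tree
# magic number 0x00053162 is commonly documented at byte offset 12 of the
# first page; endianness varies, so both encodings are checked.
BDB_MAGIC = 0x00053162
MAGICS = (BDB_MAGIC.to_bytes(4, "little"), BDB_MAGIC.to_bytes(4, "big"))

def find_bdb_candidates(image: bytes, page_size: int = 512) -> list:
    """Return byte offsets where a Berkeley DB header may start."""
    hits = []
    for off in range(0, len(image) - 16, page_size):
        if image[off + 12:off + 16] in MAGICS:
            hits.append(off)
    return hits

# Usage: candidates = find_bdb_candidates(open("disk.img", "rb").read())
```

Any hit is worth extracting a few megabytes from and trying to open as a wallet.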
I got you on a ChatGPT answer if you want. This was good for me too, lots to learn. Public Electrum servers are able to return your balance in milliseconds because they don’t rescan every block on-demand like a vanilla Bitcoin Core wallet does. Instead, they maintain a continuously-updated, pre-built index of every UTXO and transaction history, stored in a high-performance database on fast hardware with tuned caches. Here are the key factors: ⸻ 1. Specialized indexing software • ElectrumX, Esplora/Electrs, and Fulcrum each build and maintain a full “address → UTXO/tx history” index as new blocks arrive. Your wallet’s balance lookup then becomes a simple database query—no full-chain scan needed.  • By contrast, when you point a wallet at a bare Bitcoin Core node (even with txindex=1), the wallet’s RPC rescan must walk every block output and check each script against your keys, which is inherently O(chain-size) and slow. ⸻ 2. High-I/O, low-latency storage • Public servers run on SSDs or NVMe drives (often in RAID), delivering thousands of IOPS so their indexer can write new blocks and serve random reads at lightning speed. Even a modest ElectrumX instance “is I/O-bound … SSD’s are definitely recommended” —and enterprise hosts use NVMe for even higher throughput. • Example (AWS testbed): • Data disk: 1 TB gp2 (3,000 IOPS) for the Electrum index • Bitcoin data: 600 GB gp2 (1,800 IOPS) for bitcoind’s block files • Result: balance queries over a cold cache complete in under 50 ms  ⸻ 3. Sufficient RAM for caching • ElectrumX is typically run with 2 GB+ of cache (CACHE_MB = 2048), and LevelDB’s own block cache (DB_CACHE ≈ 1,200–1,800 MB), so most lookups hit memory rather than disk.   • Even a single-user Electrs instance recommends 16 GB RAM to keep its embedded database hot.  ⸻ 4. 
Tuned software settings • ElectrumX config tweaks commonly used on public servers: COST_SOFT_LIMIT = 0 COST_HARD_LIMIT = 0 CACHE_MB = 2048 This disables internal rate-limiting and maximizes in-memory caching.  • Fulcrum sets txhash_cache=2000 to keep recent transaction lookups in RAM.  • Esplora (Blockstream’s server) uses a “constant-time caching” schema so addresses above a threshold get fully cached.  ⸻ 5. CPU and concurrency • Indexing a new block is parallelized, and query handling is asynchronous. Single-core speeds matter less once the index is built, but public hosts often use multi-core Xeons or equivalent to absorb spikes in demand.  • Your gaming-PC CPU may be fast, but if it’s paired with a spinning disk or limited DB cache, your wallet’s RPC rescan still bottlenecks on I/O and single-threaded script-matching. ⸻ How to speed up your local setup 1. Use an Electrum-style indexer locally (e.g. run ElectrumX, Electrs or Fulcrum against your node) instead of pointing Electrum directly at bitcoind. 2. Move your data directory to an SSD/NVMe, and give your indexer its own fast volume. 3. Increase DB cache in your server config (CACHE_MB, DB_CACHE) to keep more of the index in RAM. 4. Ensure your Bitcoin Core is started with -txindex=1 (if using ElectrumX/Fulcrum) or -blockfilterindex=1 (with descriptor wallets) so the indexer can pull historic data without re-scanning blocks itself. By adopting the same hardware profile (SSD + ≥16 GB RAM + decent single-core CPU) and software tuning that public hosts use, your local Electrum server will likewise return balance and history queries in milliseconds instead of minutes.
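For context on point 1 above, the trick that makes lookups constant-time is the index key itself: Electrum servers key everything by "scripthash" rather than by address. A minimal sketch of computing that key (the P2PKH script below is a dummy placeholder):

```python
import hashlib

# An Electrum server's index is keyed by "scripthash":
# sha256(scriptPubKey), hex-encoded in reversed byte order.
# A balance query (blockchain.scripthash.get_balance) then becomes a
# single key lookup in the database rather than a chain scan.
def electrum_scripthash(script_pubkey: bytes) -> str:
    return hashlib.sha256(script_pubkey).digest()[::-1].hex()

# Dummy P2PKH scriptPubKey:
# OP_DUP OP_HASH160 <20 zero bytes> OP_EQUALVERIFY OP_CHECKSIG
spk = bytes.fromhex("76a914" + "00" * 20 + "88ac")
key = electrum_scripthash(spk)  # 64 hex chars: the server-side index key
```

The wallet computes this hash locally for each of its scripts and asks the server for the precomputed history under that key, which is why responses arrive in milliseconds.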
This is false. You are correct that most cryptos, and Bitcoin especially, would become useless, but energy-efficient CPU mining would continue off-grid. Phones and the wallet software could easily be charged by solar. There are certainly small-scale Monero miners and node operators that can continue to function in grid-down situations. I am one of them! In a grid-down situation, my node and my most efficient miner automatically switch over to battery backup power supplied by a solar array. My networking equipment is also on battery backup, and my internet continues to function. In case my local ISP has a failure, internet connectivity continues via Starlink! As for Bitcoin being functional in any substantial grid-down situation, good luck with that.
Does anybody know why the Coinbase Advanced spot trading page frequently (read: daily) fails to display properly? At least once a day it will show just the header and the sidebar, with no chart, no orders section, and no way to buy and sell. Everything is blank except the header and sidebar, and the CPU is pegged at 100%. Even closing the tab does not fix it; the CPU keeps running at 100% until the browser is completely closed. This has been going on for months. DevTools shows 429 Too Many Requests, but what does having the tab open and looking at the chart from time to time have to do with Too Many Requests? You're not allowed to do that? And why the hell is my CPU running at 100%?
The FCMP+ update will bring privacy to the next level. And it's one of the last PoW chains where anyone can mine with their CPU (Gupax, P2Pool, ...).
Recommended settings for running an Algorand node:

* CPU: 8 vCPUs
* RAM: 16 GB
* Storage: 100 GB NVMe SSD
* Network bandwidth: 1 Gbps
The profitability to mine with a CPU or GPU is just one single factor in the security of the network and for the overall health of the project generally.
I checked the profitability of coins to mine nowadays, and only XMR makes any sense. Of course it's not 2017 and it's not easy money anymore, but it's still funny that you aren't immediately in the red. This is why I wonder to what extent it's ASIC-resistant. I couldn't find any other coin that makes any sense to mine with a CPU or GPU; only XMR. That says something.
The Rusty-Spectre v0.3.17 release focuses on upstream merges from Kaspa Rust to stabilize the codebase and minor adjustments for Spectre, which is a decentralized, proof-of-work CPU-mined blockDAG network. This release supports the SpectreX CPU mining algorithm based on AstroBWTv3, designed for efficient mining on various architectures, and is part of Spectre's broader ecosystem that includes a fixed maximum supply of 1.161 billion coins with decreasing block rewards, promoting a deflationary environment. * [Releases · spectre-project/rusty-spectre - GitHub](https://github.com/spectre-project/rusty-spectre/releases) * [Index of /downloads/ - Spectre Network](https://spectre-network.org/downloads/) * [Releases · spectre-project/spectre-miner - GitHub](https://github.com/spectre-project/spectre-miner/releases) ^(This is a bot made by [Critique AI](https://critique-labs.ai). If you want vetted information like this on all content you browse, [download our extension](https://critiquebrowser.app).)
For mining BTC, you're going to want an ASIC. There is GPU mining, but it's mostly unprofitable right now unless you have cheap-to-free electricity. Some CPU mining is profitable, but not really worth the wear and tear on your machine. If you're willing to mine at a loss for a while and accumulate, it'd be a cool little side gig to stack.
You didn't even read it xD. BTC's current state is not what Satoshi wanted; each year it's more and more centralized. The combined hashrate of the three biggest mining pools is able to do this, if they could come to an agreement, or if someone were to "convince" the owners of the pools. It would be possible. One year ago you had to spend 6 billion dollars to buy enough ASICs to have enough hash rate of your own. I think for governments it's still possible to pull this off, but such a thing would be immediately visible to us; you just can't buy that amount of hardware and appear on the network in one second. I wonder if this will be possible to pull off with a quantum computer. No one has really tried to seriously destroy the network like this. Governments have tried simply banning crypto; from their perspective it was much more efficient.

On May 8, 2024, the Bitcoin network's total hashrate was 569.29 exahashes per second (EH/s). The top three mining pools by three-day hashrate were:

* FoundryUSA, at 175.76 EH/s; 30.9% of the total Bitcoin network hashrate
* AntPool, at 161.77 EH/s; 28.4% of the total Bitcoin network hashrate
* ViaBTC, at 73.11 EH/s; 12.8% of the total network hashrate

Combined, these three pools made up 72.1% of the network hashrate, a whopping 410.6 EH/s (410.6 million TH/s; the CPU in your computer might hash at about 15 kilohashes per second). If just Foundry and AntPool were to collude, they would already control over 51% of the hashrate (337.5 EH/s, about 59.3%).
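The pool-share arithmetic from that snapshot, checked:

```python
# Pool hashrates (EH/s) from the May 8, 2024 snapshot quoted above.
network = 569.29
pools = {"FoundryUSA": 175.76, "AntPool": 161.77, "ViaBTC": 73.11}

combined = sum(pools.values())   # 410.64 EH/s across the top three
share = combined / network       # ≈ 0.721, i.e. 72.1% of the network

# The two largest pools alone already exceed the 51% threshold:
top_two = pools["FoundryUSA"] + pools["AntPool"]   # 337.53 EH/s ≈ 59.3%
```

Note that 51% is the threshold for a majority attack, so the two biggest pools colluding would already be enough; ViaBTC isn't even needed.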
It makes no sense because your mind is too narrow. Early investors and the rich also get richer under a PoW system: they get the information first, have the capital to act on it, and get rewarded for building supply chains to control it. With a PoS validator, you only need capital and a minimal-spec machine. That's a much lower barrier to entry than Bitcoin's. CPU mining hasn't been a thing since 2013; GPU mining hasn't been a thing since 2016. The only ones "putting the hard work and labor in" are the Chinese companies that build 90% of the world's ASICs and the three mining pools that make up 80% of the network.
Stop plaguing the discussion with your calculations. OP is only missing two words. That's 2048² combinations, so about 4.2 million. Any CPU will brute-force that in seconds.
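For scale (the derivation rate below is an assumed illustrative figure, not a benchmark):

```python
# Two missing BIP39 seed words: 2048 possibilities each.
combos = 2048 ** 2        # 4,194,304 candidate pairs, "about 4.2 million"

# Even at an assumed 100k checks/second this exhausts in well under a minute:
seconds = combos / 100_000   # ≈ 42 s worst case
```

In practice each candidate also needs key derivation and an address check, so real-world rates vary, but the search space itself is tiny.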
I know how long GPU mining lasted; I used to do GPU mining as well. It would not have looked like that forever, even without PoS. ASICs did in fact appear, and given enough time they would have become so dominant that you would see the whole thing controlled by ASIC farms next to hydroelectric dams, etc. Even now, with Monero being so ASIC-resistant that you can only use a CPU, I still can't make any money mining because I don't live near cheap enough electricity. It didn't used to be like that, but now the hashrate is too high for small players tinkering with their computers in big chunks of the world. As PoW naturally progresses, more and more people get excluded over time.
I don't disagree with any of this, but it still doesn't really assuage my reticence. I'm not talking about soft forking; I'm talking about copying the BTC source code 1:1 (with or without the changes). Undoubtedly no one would just use it. But if the security, functionality, and scarcity are literally the same thing, yet the price is different, then what explains the difference in price? Well, it was first, has existed longer, has an established user base. Sure. But then this implies, in my mind, that the value of Bitcoin is tied to these intangible aspects (longevity, name recognition, etc.). There's roughly 1.2 million USD in circulation per BTC, so really I would think BTC's intrinsic value is higher than its current price. The only exception being... there can be an infinite number of BTCs (not the number of BTC on the original chain, but infinite chains). Which there basically already are, except they all distinguish themselves by working differently. The price should be different if the price reflects something in the code/functionality. But if the code is the exact same... why should the value of a currency be tied up in non-functional characteristics of the currency? There's no particular reason to believe BTC2 is coming, or that anyone would adopt it. At the very least, you could mine BTC2 with a CPU again (for a bit) and could go without fees, so there could be an incentive to mine even if adoption were painfully slow. I find all of it pretty fascinating, though, and the underlying concepts behind it all are genius.
That's newbie thinking. Hashrate doesn't matter at all, because it doesn't provide additional Sybil resistance. Just as a theoretical example for Bitcoin mining:

* 1M people using CPU mining: low hashrate, higher security
* 10k people using S9 mining: higher hashrate, medium security
* 100 people using S19 Max + S21 mining: highest current hashrate, low security
There are 10 000 000 000 possible passwords. In Atomic Wallet your password is run through a password‑hashing KDF (e.g. scrypt, PBKDF2 or similar) that uses a random salt and intentionally high work‑factor (many iterations and/or memory‑hard operations). That means each guess takes on the order of hundreds of milliseconds (or more) of CPU+RAM work, drastically slowing down any brute‑force attack. Below is an estimate of how long it would take to exhaustively try every all‑digit password of length 4–10, given two attack rates: * **Normal CPU**: \~10 password‑guesses per second (a typical 4‑core desktop). * **Fast server**: \~150 guesses per second (e.g. a 48‑thread machine with \~150 H/s). |Length|Combinations|Time @ 10 H/s|Time @ 150 H/s| |:-|:-|:-|:-| |4|10 000|16 minutes 40 seconds|1 minute 6 seconds| |5|100 000|2 hours 46 minutes|11 minutes 6 seconds| |6|1 000 000|1 day 3 hours|1 hour 51 minutes| |7|10 000 000|11 days 13 hours|18 hours 31 minutes| |8|100 000 000|115 days 17 hours|7 days 17 hours| |9|1 000 000 000|3 years 62 days|77 days 3 hours| |10|10 000 000 000|31 years 259 days|2 years 41 days| If your password was up to 7 digits long, then it would take up to 2 weeks to crack it on your computer. In worst case scenario that your password was 10 digits long you would need to rent a fast server for about 2 years to get it solved. Easy! 
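The table follows directly from combinations ÷ guess rate; a quick sketch to reproduce it:

```python
# All-digit passwords of length n have 10**n combinations; at a
# KDF-limited guess rate, the worst-case crack time is combinations / rate.
def crack_time_seconds(length: int, rate_hz: float) -> float:
    return 10 ** length / rate_hz

for n in range(4, 11):
    slow = crack_time_seconds(n, 10)    # ~10 guesses/s, typical desktop
    fast = crack_time_seconds(n, 150)   # ~150 guesses/s, 48-thread server
    print(f"len {n}: {slow:.0f} s @ 10 H/s, {fast:.0f} s @ 150 H/s")
```

The guess rates are the two assumed figures above; a different KDF work factor shifts every row proportionally.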
**How can Solana realistically compete with over 60 Layer 2 solutions?** Platforms like Base are reportedly already twice as fast as Solana, and now Ethereum's L2 ecosystem is evolving even further. Projects like the upcoming "Mega ETH" are targeting performance levels of 100,000 transactions per second with 1-millisecond block times. It’s hard to see how a single Layer 1 like Solana can go up against the entire Ethereum scaling infrastructure. Solana already struggles under high traffic conditions - as we saw during the Trump meme coin frenzy, where transactions either failed or took hours to complete. Imagine if a financial institution or bank tried to execute time-sensitive transactions during a surge like that. Layer 2s work more like multiple CPU cores - distributing load and scaling horizontally. Solana, on the other hand, is trying to route all global transaction volume through a single, monolithic pipeline. That model has limits. The blockchain is reportedly growing by several terabytes per month, and even the most powerful hardware will eventually buckle under that kind of data load. And with such steep validator hardware requirements, Solana risks further centralization. Only a handful of entities can afford to run the necessary infrastructure, which undermines the core principle of decentralization. From a Reddit thread: >*According to Solana, the network will generate 4 petabytes of data every year at full capacity. This wont be stored on each individual node but will be split across all nodes enabling a bit torrent-esque distribution of the data.* >*4 petabytes seems like a lot of data but Solana also plans to store this 100x (to avoid data loss and corruption). This means that on the network there will be 400 petabytes of data created every year (at full capacity). If this were spread over 1,000 validators it would entail 0.4 petabytes of data storage per validator per year. This seems pretty excessive... unless I am missing something.*
That's funny. The dApp Q1 2025 revenue numbers indicate the opposite. Ethereum is like a multi-core CPU and Solana is like a single-core CPU. There is no way banks and institutions will want to share block space on a chain with MEME degens. Solana choked when Trump brought out his MEME. It's so obvious. Why anyone thinks all activity should be on one L1 is actually a mystery. The world is like a multi-threaded application; it needs a multi-core CPU. [Ethereum Dominates Dapp Revenue in Q1 2025 Raking in Over $1 Billion – Crypto News Bitcoin News](https://news.bitcoin.com/ethereum-dominates-dapp-revenue-in-q1-2025-raking-in-over-1-billion/) [Ethereum](https://buy.bitcoin.com/eth/) continues to solidify its position as the leading platform for decentralized applications (dApps), with [dApps](https://news.bitcoin.com/solanas-dapps-revenue-hits-record-365-million/) on the network generating a staggering $1.014 billion in fees during the first quarter of 2025, according to [Token Terminal](https://tokenterminal.com/explorer/studio/dashboards/9da383a8-8ff5-452e-9542-5e44fbb731af). Trailing far behind, Base, Coinbase's Layer 2 chain, secured second place with $193 million in dApp fees, reflecting its growing traction, but still a significant gap from Ethereum's dominance. [BNB](https://buy.bitcoin.com/bnb/?utm_source=News) Chain [dApps](https://www.bitcoin.com/dapps/) followed closely, collecting $170 million, while Arbitrum's ecosystem brought in $73.8 million. Avalanche's C-Chain rounded out the top five, with its dApps generating $27.68 million in fees.
Think of Ethereum as a multi-core CPU. And Solana as a single-core CPU. The world is more like a multithreaded application with many different use cases. Why would banks, institutions and companies like Sony want to share the Solana blockchain with degen MEME traders? When Trump released his MEME, Solana choked. There are threads here on Reddit where people complained about their failed transactions. Many banks have now initiated engagement with Ethereum. The oldest bank in America, BNY, just announced they would broadcast some data to Ethereum. European banks have also indicated an interest in Ethereum.
Actually, no... I was in tech and was deeply intrigued by the concept. I was the network manager for one of my company's labs at the time, so I set up a shared wallet on the server. I ran CPU-based mining (that was still a thing) on about 40 machines that were otherwise idle overnight, since we only worked a day shift in that lab but never turned anything off. I did the "buy a pizza" type thing with friends, so I don't have exactly a multiple of 25... but I did mine plenty (compared to today's rewards).
The answer is different for ETH, because its blockchain is many terabytes. A comparison: https://blog.lopp.net/2021-altcoin-node-sync-tests/

> mass adoption

Bitcoin constrains resource usage (mainly RAM requirements; ask if you want this explained) by limiting the size of a block and the block interval. This incidentally flows to other resources: it's very light on CPU and network, and a single affordable HDD will store almost 200 years of blockchain.

This constraint triggers the scaling debate. But is there a scaling issue? No. Would there be a scaling issue with mass adoption? Who cares? There has been no sign of mass adoption, and the expectation of mass adoption is unrealistic. It's probably a happy coincidence: by carefully constraining resource usage, the node network is cheap to run forever, and the level of actual adoption is well within the constraints for at least 50 more years.

As you can see in Lopp's other articles about initialization times for Bitcoin nodes, the main unbounded constraint is the linear time increase for initializing a new node. How long is too long? Already, Lopp's 10-hour time is too long for many, and it is currently increasing by about 8% per year. For an operator with an HDD, the time is about 60 hours.

Nothing else has adoption anywhere close to Bitcoin's. The "Bitcoin-equivalent" blockchains (LTC, DOGE, Monero) have so little usage that they don't even function as spillover options for when Bitcoin is occasionally congested. ETH is bloated from its not-Bitcoin design, an indicator of the folly of arbitrarily discarding Bitcoin's constraints. ETH has the remarkable twin features of almost no usage and a bloated chain that makes it too expensive to operate a node.
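A quick projection of that 8%-per-year growth in initial-sync time, starting from the 10-hour SSD figure quoted above:

```python
# Compound growth of initial block download (IBD) time:
# base_hours * (1 + growth)**years.
def ibd_hours(years_out: int, base_hours: float = 10.0,
              growth: float = 0.08) -> float:
    return base_hours * (1 + growth) ** years_out

# At 8%/year the 10-hour sync roughly doubles every ~9 years:
decade = ibd_hours(10)   # ≈ 21.6 hours after ten years
```

So even the "unbounded" cost grows slowly enough that SSD syncs stay tolerable for decades, which is consistent with the argument above; HDD operators hit the pain threshold much sooner.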
Appreciate your well-thought-out response. It's so strange how you have these people who think they're schooling me on stuff when they sound like they've just arrived in crypto. "Not your keys... blah blah blah." Really? I was around for Mt. Gox. I set up some CPU mining back in 2013, but lost it. I only started investing in 2020. Anyway, I feel similarly to you. Having Bitcoin through an ETF in tax-protected accounts has been really helpful for long-term applicable wealth. But I always hoped to see my BTC wallet go from .99 to 1. Maybe it will someday, when I can pull money out of my Roth and other IRAs.
And it will remain Satoshi's shield for eternity; no QC thing will hack anything. If you actually believe QC will outperform classical computers by billions of times, you are GULLIBLE AF! [The largest number reliably factored by Shor's algorithm is 21](https://en.wikipedia.org/wiki/Integer_factorization_records#Records_for_efforts_by_quantum_computers). Note the keyword RELIABLY, as in repeatable, reproducible consistently without ever failing. They go on to quote several theories and one-off factorizations that could not be repeated 'RELIABLY'. That is what I call hot air. And what about that absolute-zero-temperature quantum CPU? You know, one of the things about absolute zero is that NOTHING MOVES. All matter utterly and completely stops at 0 kelvin... not even electrons move, so no electricity. But, apparently, that is the temperature at which these things will be computing at billions of times the speed of a classical digital computer. Wow! QC is just noise designed to distract and produce FUD about cryptography, the greatest enemy of the state.
This has been discussed many times before, but I'll explain it again.

> Why isn't Bitcoin's block size increased?

The limit was already increased in 2017, from 1 MB to 4 million units of weight, allowing blocks of up to about 3.7 MB. Anyone who tells you otherwise is lying or ignorant.

Some context for beginners to understand scaling capacity in Bitcoin: Satoshi Nakamoto originally set a 1 MB block-size limit on Bitcoin to prevent certain resource attacks and keep Bitcoin decentralized. He suggested some ways we can scale Bitcoin by introducing us to the idea of payment channels and ways to replace unconfirmed transactions (RBF) for a fee market before he disappeared. The first version of Bitcoin released had a version of transaction replacement as well:

https://github.com/trottier/original-bitcoin/blob/master/src/main.cpp#L434
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2013-April/002417.html

Over the years there have been many opinions and disputes as to how to scale Bitcoin, from keeping the limit as is, to scaling mostly on-chain with large blocks, to a multi-layered approach and every variation in between. In 2017 the Bitcoin community finally removed the 1 MB limit after coming to consensus on a path forward:

https://github.com/bitcoin/bitcoin/blob/master/src/consensus/consensus.h

The 1 MB limit was removed and replaced with a larger limit of 4 million units of weight:

https://www.reddit.com/r/BitcoinBeginners/comments/ghqcqn/bitcoin_bubble_or_revolution/fqa72j1/

Bitcoin is taking the approach of scaling with many solutions at once. Even with larger blocks, hard-drive capacity is the least of our concerns, because archival full nodes contain the full blockchain and allow new nodes to bootstrap from them.
Pruned nodes can get down to around ~5GB and have all the same security and privacy benefits of archival nodes, but they need to initially download the whole blockchain for full validation before deleting it (it actually prunes as it validates).

The primary resource concerns, in order from largest to smallest, are:

1) UTXO bloat (increases CPU and RAM costs)
2) Block propagation latency (causing centralization of mining)
3) Bandwidth costs
4) IBD (Initial Block Download) costs for bootstrapping new nodes
5) Blockchain storage (largely mitigated by pruning, but some full archival nodes still need to exist in a decentralized manner)

This means we need to scale conservatively and intelligently, with every means available: onchain capacity, decentralized payment channels, offchain private channels, optimizations like MAST and Schnorr signature aggregation, and possibly sidechains/drivechains/statechains/Fedimint/Cashu. Raising the blockweight limits in the future is not completely opposed: https://bitcoin.org/en/bitcoin-core/capacity-increases https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011865.html

>"Further out, there are several proposals related to flex caps or incentive-aligned dynamic block size controls based on allowing miners to produce larger blocks at some cost."

But raising the blocksize beyond 4 million weight units may not be needed either, depending on how all the other solutions come to fruition.
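To see how the 4-million-weight-unit limit relates to block sizes, here is a minimal Python sketch assuming the standard BIP 141 definition (weight = 3 × base size + total size); the function names are illustrative, not Bitcoin Core's:

```python
# Sketch of BIP 141 block weight, assuming the standard definition:
# weight = 3 * base_size + total_size, capped at 4,000,000 weight units.
MAX_BLOCK_WEIGHT = 4_000_000

def block_weight(base_size: int, total_size: int) -> int:
    """base_size: serialized size without witness data, in bytes;
    total_size: serialized size including witness data, in bytes."""
    return 3 * base_size + total_size

def is_within_weight_limit(base_size: int, total_size: int) -> bool:
    return block_weight(base_size, total_size) <= MAX_BLOCK_WEIGHT

# A legacy (no-witness) block has base_size == total_size, so its
# weight is 4x its size. That is why a purely legacy block still
# tops out near 1 MB even after the 2017 change:
assert block_weight(1_000_000, 1_000_000) == MAX_BLOCK_WEIGHT
```

A block heavy in witness data shifts bytes into the cheaper `total_size` term, which is how total block size can approach the ~3.7 MB figure mentioned above.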
Yeah, smart stuff like QC! [The largest number reliably factored by Shor's algorithm is 21](https://en.wikipedia.org/wiki/Integer_factorization_records#Records_for_efforts_by_quantum_computers). Note the keyword RELIABLY, as in repeatable, reproducible consistently without ever failing. They go on to quote several theories and one-off factorizations that could not be repeated 'RELIABLY'. That is what I call hot air. And what about that absolute-zero-temperature quantum CPU? You know one of the things about absolute zero is NOTHING MOVES. All matter utterly and completely stops at 0 kelvin ... not even electrons move - so, like, no electricity. But, apparently, that is the temperature at which these things will be computing at a billion times the speed of a classic digital computer. Wow! Yeah, sorry, ain't movin' my coin to an utterly and completely useless blockchain-bloating QC-resistant address.
Henceforth be thy lesson: So let's talk about nodes and consensus as defined in the whitepaper and implemented in nodes. Let's start with [the bitcoin white paper](https://bitcoin.org/bitcoin.pdf):

> Satoshi, from the Bitcoin white paper, chapter 12, 'Conclusion': The network is robust in its unstructured simplicity. **Nodes** work all at once with little coordination. They do not need to be identified, since messages are not routed to any particular place and only need to be delivered on a best effort basis. **Nodes** can leave and rejoin the network at will, accepting the proof-of-work chain as proof of what happened while they were gone. They vote with their CPU power, expressing their acceptance of valid blocks by working on extending them and rejecting invalid blocks by refusing to work on them. Any needed rules and incentives can be enforced with this **consensus mechanism**.

First, you have to understand what 'consensus' actually means:

> https://en.wikipedia.org/wiki/Consensus_%28computer_science%29
> A fundamental problem in distributed computing and multi-agent systems is to achieve overall system reliability in the presence of a number of faulty processes. This often requires processes to agree on some data value that is needed during computation. Examples of applications of consensus include whether to commit a transaction to a database *(or, for example, committing blocks to a blockchain)*, agreeing on the identity of a leader, state machine replication, and atomic broadcasts. The real-world applications include clock synchronization, PageRank, opinion formation, smart power grids, state estimation, control of UAVs, load balancing and others.

What does this mean if you are but an intrepid traveler amongst the erstwhile numpty-folk?
Nodes are agents in a multi-agent system with [an agreed set of consensus rules](https://www.cryptocompare.com/coins/guides/how-does-a-bitcoin-node-verify-a-transaction/), which they and they alone enforce, that ensure that the system functions. Transactions are propagated through the multi-agent network based upon the agreed consensus rules by nodes, which are agents in a multi-agent system. Miners retrieve valid transactions from any of these nodes, which are agents in a multi-agent system. They then order the transactions and perform a hashing function on them until the hashing function returns a value that is suitable to the nodes, which are agents in a multi-agent system. They then pass the new block that they've created to the nodes, which are agents in a multi-agent system. The nodes, which are agents in a multi-agent system, then validate the block to ensure that each of the transactions within the block agrees with the consensus rules. Then the node, which is an agent in a multi-agent system, extends the block-chain by attaching the new block to it. They then pass the new block, if it is valid, to other nodes, which are agents in a multi-agent system. Then each of these other nodes, which are agents in a multi-agent system, does the same validation on every block.

Nodes accept incoming transactions and validate them. Miners don't. Nodes replicate transactions to other nodes. Miners don't. Miners take transactions from nodes, order them in a block, and perform a hashing function on them, and that is the only thing they do. Miners pass the new block to the node. The node validates the transactions in the block. Miners don't. The node validates the block. Miners don't. The node extends the blockchain. Miners don't. The node replicates the block to other nodes. Miners don't. It is the validation of the nodes, and their CPUs, that defines and polices consensus in bitcoin. There is only one function that miners perform.
They take transactions, put them in a block, and hash them. As soon as a miner produces a block that nodes don't want, it is rejected. Miners work. Nodes validate. So nodes are the proof in proof-of-work. Nodes accept the transactions, validate the transactions (using their CPU), replicate the transactions, maintain the mempools, validate the blocks (using their CPU), extend the blockchain (using their CPU), replicate the blocks, serve the blockchain, and store the blockchain. Nodes even define the PoW algorithm that miners have to employ. If you can't convince these node owners, who are using their nodes on a day-to-day basis, to uninstall their node software and install your new node client, especially when that node client decreases their node security and decreases the network security, any change you have is going to go exactly nowhere. So nodes maintain the protocol, not miners. It is thus. It has always been thus. If you can't convince all of those node owners running their node clients to uninstall one client and re-install another, any change you have to consensus is DOA. See for yourself. [Download it.](https://bitcoin.org/en/download) It's currently at 0.20.1. https://bitcoin.org/en/full-node

> A full node is a program that fully validates transactions and blocks. Almost all full nodes also help the network by accepting transactions and blocks from other full nodes, validating those transactions and blocks, and then relaying them to further full nodes.

Thus endeth thy lesson.
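The node-side acceptance rule the lesson describes (check the proof-of-work, check the transactions, reject otherwise) can be sketched roughly like this. This is a toy illustration, not Bitcoin Core's actual code; `node_accepts` and its parameters are hypothetical names:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    # Bitcoin hashes block headers with SHA-256 applied twice.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def meets_target(header: bytes, target: int) -> bool:
    # The header hash is interpreted as a 256-bit integer and must be
    # at or below the current network target for the PoW to be valid.
    h = int.from_bytes(double_sha256(header), "little")
    return h <= target

def node_accepts(header: bytes, target: int, txs_valid: bool) -> bool:
    # A node rejects a block unless BOTH the PoW and every transaction
    # pass its own consensus rules; the miner's opinion never enters it.
    return meets_target(header, target) and txs_valid
```

The point of the sketch is the asymmetry the comment hammers on: producing a header that satisfies `meets_target` takes enormous work, while every node re-checks it in microseconds before extending its chain.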
You didn't list your RAM or CPU, which are more pertinent than your connection capability.
You can't mine bitcoin with a CPU. If you aren't mining bitcoin, this is the wrong subreddit.
Was an interesting podcast actually. Just listened to it this morning. https://open.spotify.com/episode/3jIL02ivekv4hvKmcz7CPU Paolo Ardoino, the CEO of Tether, rejoins the show for a timely discussion. In this episode: * How is Paolo feeling about political trends in the US today? * Tether is the 7th-largest buyer of USTs at the sovereign level * Paolo’s decision to spend more time in the US * What does Paolo make of the stablecoin bills in Congress? * Does Tether want to eventually come onshore to the US? * Tether is pursuing a Big 4 Audit * The effect of Operation Choke Point 2.0 on Tether * Tether is moving their HQ to El Salvador * Paolo’s thoughts on stablecoins in MiCA * How does Tether get to the figure of 400m global users of the stablecoin? * Tether’s efforts to better understand their userbase and usage modes * Tether’s methodology around balance sheet investing * Is Paolo concerned about crypto-dollarization? * Tether’s relationship with Cantor now that Lutnick is Commerce Secretary * Is Paolo concerned about the emergence of interest-bearing stablecoins?
CPU mining, then GPU mining, then after some time FPGAs, then ASICs. What a ride that was.
Me when my CPU miner kept crashing in 2010.
I was an altcoin miner at that time. My mining rig had four 290X's that screamed like little jet engines 24/7. (I burned out 3 of the 4 of them, and 1 of the two 1000W power supplies, in less than a year.) No fires, but I remember a hilarious thread on one mining forum where a whole bunch of us - myself included - admitted that we had each burned out not 1, and not 3, but exactly 2 Kill-a-Watt meters. We commonly used them to monitor how much power our rigs were drawing. The meters would die, we'd complain to Amazon and get a replacement, the replacement would burn out a few weeks later, and at that point we'd read the instructions and discover that you aren't supposed to leave them plugged in long term with a high wattage draw. And then there was the "Often bought with" screen on Amazon for certain motherboards and graphics cards that listed (literally) a milk crate and a specific resistor along with CPU, GPUs and other hardware....
tldr; The article narrates how the author turned a monotonous corporate office day into a Monero cryptocurrency mining experiment. Using their office PC, they set up a Monero mining rig, optimized CPU performance, and managed overheating issues. While the mining effort was not financially lucrative, it served as a productive and engaging way to escape the drudgery of corporate life. The author emphasizes the importance of optimizing settings, monitoring temperatures, and balancing work with personal interests to stay sane in a corporate environment. *This summary is auto-generated by a bot and not meant to replace reading the original article. As always, DYOR.*
While it is interesting to track your node's connectivity, the design, though very simple in principle, is quite resilient at finding reasonably good paths for the data. The connectivity your node has will change a lot over time. It takes time to establish connections to the more stable nodes, and even then the connectivity you have varies a lot as connections come and go. Node connectivity is guided primarily by two rules: a) Each node only initiates a handful of connections to other peers, selected at random. Most of your connections are accepted incoming connections, initiated by other nodes and clients looking for peers. b) Each node is limited in the number of connections it accepts, normally about 1k, though it can be configured higher. A long-running node is often full and does not accept any new connections. Once connected, there is no significant difference in how the connections are used. Blocks are downloaded from any of the peers that have newer blocks than your tip. Transactions and new blocks are relayed across all connections via an announce-request pattern. The result is that you will start out with mostly connections to new leaf nodes that are actively looking for peers to connect to, like yourself, and over time you will likely gain some connections to more stable, reliable nodes. The majority of your connections as a public node will always be leaf nodes, and that is what makes the network resilient: the loss of any of the long-term stable nodes does not reduce total network connectivity by a noticeable amount, because the network's connectivity is kept in a highly decentralized structure. Speed during an initial block download is mainly limited by your CPU's capability to validate the blocks. You download blocks from several peers in parallel, and the network connectivity to each node is usually not a limiting factor.
It does not matter much if some nodes are on a high-latency, low-bandwidth connection; those will not be used much during the block download, as they get outpaced by nodes with better connectivity. Handling of slow-responding nodes was also recently improved significantly. Once you are up to speed with the blockchain, the nodes with the lowest latency are also the nodes you will mostly use. These are the ones you will first receive announces from as new data becomes available in the network, prompting you to request the data from them. You will then in turn announce the data to your peers, and they will in turn download the data from the first node that announces it to them. The result is that you mostly publish data to the nodes on slow links and rarely request anything from them.
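The announce-request pattern described above can be simulated in miniature. Everything here (`Node`, `on_announce`, the chain topology) is a hypothetical toy with no networking, not Bitcoin Core's P2P code:

```python
class Node:
    def __init__(self, name: str):
        self.name = name
        self.known = set()   # block/tx ids this node already has
        self.peers = []      # outbound connections

    def announce(self, item_id: str):
        # 'inv'-style announcement: tell peers the id only, not the data.
        for peer in self.peers:
            peer.on_announce(self, item_id)

    def on_announce(self, sender: "Node", item_id: str):
        # 'getdata'-style behavior: request only items we don't have yet,
        # from the first peer that announces them; ignore later announces.
        if item_id not in self.known:
            self.known.add(item_id)   # pretend we requested and received it
            self.announce(item_id)    # relay onward to our own peers

a, b, c = Node("a"), Node("b"), Node("c")
a.peers, b.peers, c.peers = [b], [c], []   # simple chain: a -> b -> c
a.known.add("block123")
a.announce("block123")
assert "block123" in c.known   # relayed hop by hop through the network
```

Because a node only fetches from the first announcer, low-latency peers naturally win the race to serve data, which is exactly the "you mostly publish to slow peers, rarely request from them" behavior described above.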
You could get a feel for it by mining Monero; it can only be mined on a CPU/GPU, i.e. a PC.
>An aggressive military campaign against Bitcoin mining operations could do just that. Especially if it takes large amounts of hash power off line right after a major difficulty increase. Repeat that multiple times and you can delay that halving. By weeks or months even

You know not what you speak, as the difficulty is based on the number of miners. If enough miners shut down, bitcoin could go back to GPU or even CPU mining. The difficulty is increasing mostly due to price, and miners wanting to win the reward for each block.

>The difficulty is adjusted every 2016 blocks based on the time it took to find the previous 2016 blocks. At the desired rate of one block each 10 minutes, 2016 blocks would take exactly two weeks to find. If the previous 2016 blocks took more than two weeks to find, the difficulty is reduced. If they took less than two weeks, the difficulty is increased. The change in difficulty is in proportion to the amount of time over or under two weeks the previous 2016 blocks took to find.

[https://en.bitcoin.it/wiki/Difficulty#Can_the_network_difficulty_go_down?](https://en.bitcoin.it/wiki/Difficulty#Can_the_network_difficulty_go_down?)
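The retargeting rule quoted above can be sketched in Python. One detail not in the quote, but documented on the same wiki page, is that the adjustment is clamped to a factor of 4 in either direction per retarget:

```python
TARGET_TIMESPAN = 14 * 24 * 60 * 60   # two weeks, in seconds
RETARGET_INTERVAL = 2016              # blocks between adjustments

def next_difficulty(current_difficulty: float, actual_timespan: int) -> float:
    # Clamp the measured timespan so difficulty never moves more than 4x
    # up or down in a single retarget.
    actual_timespan = max(TARGET_TIMESPAN // 4,
                          min(actual_timespan, TARGET_TIMESPAN * 4))
    # Blocks found faster than one per 10 minutes -> difficulty rises,
    # slower -> difficulty falls, in proportion to the time difference.
    return current_difficulty * TARGET_TIMESPAN / actual_timespan

# If 2016 blocks took only one week, difficulty doubles:
assert next_difficulty(100.0, TARGET_TIMESPAN // 2) == 200.0
# If they took four weeks, difficulty halves:
assert next_difficulty(100.0, TARGET_TIMESPAN * 2) == 50.0
```

This is why a sudden loss of hash power only slows blocks until the next retarget: the schedule self-corrects, so the halving can be delayed only briefly, not indefinitely.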
I mined bitcoins on a CPU before Mt. Gox was even created. I'm old. 😂