MB

Mentions (24Hr): 0 (0.00% Today)

Reddit Posts

r/BitcoinSee Post

Multibit HD wallet access need help please

r/BitcoinSee Post

What if... A Proof of Node Crypto currency

r/BitcoinSee Post

Why is local mempool so different from mempool.space

r/BitcoinSee Post

Question about running a Bitcoin node on a Raspberry Pi

r/BitcoinSee Post

correction: Block 823829 got nearly 6400 transactions in 1.8MB

r/BitcoinSee Post

block 823845 crammed nearly 6400 transactions in 1.8MB

r/BitcoinSee Post

Full interview link for "there is no second best"?

r/BitcoinSee Post

Why no small blocksize increase?

r/BitcoinSee Post

What is the significance of a USB that is marketed as being a 'Crypto USB'?

r/BitcoinSee Post

Need help with child pays for parent

r/CryptoCurrencySee Post

First blockchain Rick Roll incident? A tale about Cartesi, Espresso Rick Astley and Vienna OP

r/BitcoinSee Post

Is my transaction just stuck in limbo forever?

r/CryptoCurrencySee Post

The blocksize war: A journey of ambition, debate and ultimate failure

r/CryptoCurrencySee Post

Walk down the memory lane: Blocksize wars and the Bitcoin XT controversy

r/CryptoCurrencySee Post

How Much a Spot Bitcoin ETF Can Affect The Price - The Bad Version

r/CryptoCurrencySee Post

Bitcoin vs Bitcoin Cash

r/CryptoCurrencySee Post

.eth Accounts Can Now Be Used to Text?

r/CryptoCurrencySee Post

On This Month 6 Years Ago - The Most Significant Bitcoin Hard Fork

r/CryptoCurrencySee Post

On This Month 6 Years Ago - Bitcoin's First Hard Fork Happened

r/CryptoCurrencySee Post

Nerd Miner Review: I bought a BTC lottery miner, now I am an active part of the BTC network!

r/CryptoCurrenciesSee Post

Mastering Bitcoin: Programming the Open Blockchain 2nd Edition PDF

r/CryptoCurrencySee Post

Rise in malware intrusions in crypto scams

r/CryptoCurrencySee Post

Our Data Is Being Stolen: What Can We Do?

r/CryptoCurrencySee Post

Why Ergo (Erg) is a good choice of Blockchain for the long-term

r/CryptoCurrencySee Post

[Satire] Seven years ago bitcoin forked and the true vision of Satoshi was born - Bitcoin Cash

r/CryptoCurrencySee Post

Blockchain Implements a 6 Year Piece of Revolutionary Research-Based Technology

r/BitcoinSee Post

I have 2 nodes at home. If I only received 2MB of data from one node to the other then how is the mempool at 134MB?

r/BitcoinSee Post

What happened to the mempool on May 16?

r/CryptoCurrencySee Post

A post on how BRC-69 affects Ordinals🔥

r/CryptoCurrencySee Post

What do you think of Bitcoin Cash (BCH)?

r/CryptoCurrencySee Post

Ordinals developer decides that Ordinals was not horrible enough, introduces idea for Recursive Ordinals. In other words: Inscriptions inside of Inscriptions, allowing for larger-than-4MB Inscriptions.

r/BitcoinSee Post

Is BTC dying???

r/CryptoCurrencySee Post

You can inscribe up to 4MB of data into 1 Satoshi

r/CryptoCurrencySee Post

An exploration of what ordinals are and an argument for why you probably shouldn't ignore them

r/BitcoinSee Post

Any Site or App that Notifies me when Mempool is under 300MB?

r/BitcoinSee Post

Size and Weight

r/CryptoCurrencySee Post

Why is Bitcoin Cash pumping so much today? BTC transaction fees over $20 but less than a penny on Bitcoin Cash!!!

r/BitcoinSee Post

Why are blocks not 4MB in size?

r/CryptoCurrencySee Post

Litecoin reaches historical record in transactions carried out on the network — The ATH of transactions was reached within three months left for the LTC halving.

r/CryptoCurrencySee Post

Flaws of Bitcoin

r/CryptoCurrencySee Post

It now costs "a Big Mac" to use the Bitcoin Network

r/BitcoinSee Post

Bitcoin ECDSA secp256k1 calculator spreadsheet in Excel, no macros, no arrays—for educational purposes

r/BitcoinSee Post

Why are blocks so big?

r/CryptoCurrencySee Post

doge does what bitcoin did to bitcoin, think about this (Not a Joke)

r/CryptoCurrencySee Post

Unable to import Reddit vault into Metamask wallet

r/CryptoCurrencySee Post

Bitcoin Maxi Jameson Lopp goes over the history and evolution of (Toxic) Bitcoin Maximalism

r/CryptoCurrencySee Post

Are you for or against Ordinals?

r/CryptoCurrencySee Post

[BTC] Full mempool and empty blocks mined — In the last 48 hours, 5 blocks were mined with no transactions. The bitcoin network has been accumulating a backlog of transactions with increased demand for block space, and fees have been rising.

r/CryptoCurrencySee Post

Unusual Bitcoin Blocks and where to find them

r/CryptoCurrencySee Post

We could have been Rich if Facebook and Google Paid us for our Data they collect

r/CryptoCurrencySee Post

Deeper Network's crypto miner: a $130 "free VPN for life" that requires payment to use

r/CryptoCurrencySee Post

My RaspberryPi 4 processed the two 8MB blocks from yesterday in 16 seconds

r/BitcoinSee Post

Bitcoin Core takes ages to download and verify blockchain

r/BitcoinSee Post

if block size increases can there be more transactions?

r/CryptoCurrencySee Post

A No-Shill Avalanche Deep Dive

r/CryptoCurrencySee Post

What comes for Ethereum post-Shanghai hardfork? The Cancun hardfork which will include the much anticipated EIP-4844!

r/CryptoCurrencySee Post

The Ordinals hype, just as every other hype ever, seems to be fading off right now. As taproot adoption has been halved from 18% to 9% in just days.

r/CryptoCurrencySee Post

Bitcoin Continues to Record Blocks Above the 3.75 MB Range as Ordinal Inscriptions Near 150,000

r/CryptoCurrencySee Post

Ordinals Inscription - Bitcoin Cash Independence Block

r/CryptoCurrencySee Post

I Gifted my Ordinal Inscriptions to random persons on sideshift

r/BitcoinSee Post

Improve scalability of bitcoin with stablecoins

r/CryptoCurrencySee Post

Block Sizes Exceeding 3 MB Now Common on Bitcoin Blockchain as Ordinal Inscription Demand Rises – Bitcoin News

r/CryptoCurrencySee Post

The Bitcoin network activity has reached a new high since May 2021 and the Bitcoin block size is at an ATH. This was only a bear market for prices, not actual development.

r/BitcoinSee Post

Why would a miner steal his own money?

r/BitcoinSee Post

The first ~ 4 MB block in #Bitcoin history, mined by Luxor

r/BitcoinSee Post

STL files on Ordinal Inscriptions

r/BitcoinSee Post

What's worse: sending 500 MB in high-fee transactions all at once or using Ordinals to store your 50kb PFP on Bitcoin?

r/BitcoinSee Post

Creating Tokens on Bitcoin(BTC) using OP_Return, Smart Contract Treasuries and Distributed Bots Secured by Digital Rights management

r/BitcoinSee Post

The Great Crypto Scam - My response to a viral video attacking Bitcoin

r/BitcoinSee Post

$60 trillion value settled in 2022

r/BitcoinSee Post

Bitcoin Core Download Speed

r/BitcoinSee Post

Basic questions for Bitcoin.

r/CryptoCurrencySee Post

Bitcoin will be a full-feature decentralized financial system on layers.

r/CryptoCurrencySee Post

Algorand is a terrible investment

r/BitcoinSee Post

What is the network usage of LN node after fully synced ?

r/CryptoCurrencySee Post

Noob Question: BTC Network Transaction Fees

r/CryptoCurrencySee Post

Compare Ledger new wallet: Ledger Stax vs Ledger Nano X

r/CryptoCurrencySee Post

In July 2015, the largest-sized Bitcoin transactions were made. They each consisted of 1000s of UTXOs and took up 99.9% of their blocks, but cost only $0-15 in fees.

r/CryptoCurrencySee Post

[SERIOUS] In July 2015, the largest-sized Bitcoin transactions were made. They each consisted of 1000s of UTXOs and took up 99.9% of their blocks, but cost only $0-15 in fees. In Bitcoin culture, this was considered a dick move.

r/BitcoinSee Post

Gradual increase of block size

r/BitcoinSee Post

Average Block Size

r/CryptoCurrencySee Post

What's your back up plan if you get hit by a brick in the head and forget where you hid your seed?

r/CryptoMoonShotsSee Post

$ZOO Racers Beta V2 MultiKart Battle Game Is Now Live For Everyone

r/CryptoCurrencySee Post

ZooRacers Beta V2 MultiKart Battle Game Is Now Live For Everyone

r/CryptoCurrencySee Post

Why most of your money doesn't really exist & how money is created (educational post)

r/CryptoCurrencySee Post

TIL: 5 years ago Bitcoin had a controversial software update that was cancelled weeks before releasing

r/CryptoCurrencySee Post

Blockchain Size: Everything You Need To Know.

r/CryptoCurrencySee Post

Calculating Batch Throughput and its Theoretical Limits on Layer 1 Bitcoin

r/BitcoinSee Post

Calculating Batch Throughput and its Theoretical Limits on Layer 1 Bitcoin

r/CryptoCurrencySee Post

How Someone is Attacking the ZCash Network for $10 a Day

r/CryptoCurrencySee Post

Ongoing ZCash spam attack bloating the size of most blocks beyond 1MB

r/CryptoMoonShotsSee Post

Redlight Finance | Aims to solve the Blockchain Trilemma of Scalability, Decentralization and Security through the optimization of our gasless blockchain | Goal to provide a bridge between the real world and web3 through blockchain technology | Professional Team | Fully Verified

r/CryptoMoonShotsSee Post

Redlight Finance | Aims to solve the Blockchain Trilemma of Scalability, Decentralization and Security through the optimization of our gasless blockchain | Goal to provide a bridge between the real world and web3 through blockchain technology | Experienced Team | Fully Verified

r/CryptoCurrencySee Post

The Algorand shillers have been relentless about the future of the project lately. I do not believe it has a future. This is the opposite of an Algorand shill post.

r/CryptoMoonShotsSee Post

Redlight Finance | Aims to solve the Blockchain Trilemma of Scalability, Decentralization and Security through the optimization of our gasless blockchain | Experienced Team | Fully Verified

r/BitcoinSee Post

Have you ever wondered why the block size often exceeds the limit of 1MB?

r/CryptoCurrencySee Post

Algorand v3.9 Major Update details - State Proofs, Higher throughput, Faster finality

r/CryptoMoonShotsSee Post

We're long time Bunny lovers, and we can't wait to bring a new universe of utility to our favorite coin | Metaverse Bunny Fair launch in 15 minutes |

Mentions

r/CryptoCurrencySee Comment

> There is a lot more to it, but it's intellectually false to claim it was about a group just trying to claim BCH is the real Bitcoin **out of nowhere**…

Did I?

> BCH **has turned** into a scam

And with that I want to add that no matter if the block size is 1MB, 8MB or 100MB, BTC could never scale on-chain to global levels while keeping everyone able to sync and verify the blockchain. Maybe you BCH folks need to read more than just the headline of the whitepaper to understand Satoshi's real vision.

Mentions:#BCH#MB#BTC
r/CryptoCurrencySee Comment

It turned into a scam when most of its followers started claiming it's the real BTC, even though hash rate and price CLEARLY prove it was the minority fork. Now one could argue that a minority forking a project like BTC can never have good intentions to begin with. BCH isn't dead? Currently I see blocks which have about ~50 tx there, while BTC has ~3k transactions per block. Looks like increasing the block size to 8MB made a lot of sense. /s

Mentions:#BTC#BCH#MB
r/CryptoCurrencySee Comment

The solution to Bitcoin's scaling problem will ultimately require a hard fork (even the Lightning Network white paper says LN needs ~140MB blocks), and Bitcoin cannot hard fork.

Mentions:#MB
r/BitcoinSee Comment

I have a folder called MB Backup which contains a lot of .zip.aes files... maybe it's here and I can decrypt it somehow?

Mentions:#MB
r/BitcoinSee Comment

I have another MultiBit HD wallet whose password works and which I can access, but there is no option to export the private key or seed. However, the one I can access is drained; the one whose password I forgot has money in it :( and I thought maybe it was saved somewhere in that folder named MB Backup, as I can see many .zip.aes files...

Mentions:#MB
r/CryptoCurrencySee Comment

People keep saying things like "Bitcoin can't scale because you have to store the purchase of a cup of coffee on the chain for all eternity," while in the white paper it said:

> Once the latest transaction in a coin is buried under enough blocks, the spent transactions before it can be discarded to save disk space. To facilitate this without breaking the block's hash, transactions are hashed in a Merkle Tree [7][2][5], with only the root included in the block's hash. Old blocks can then be compacted by stubbing off branches of the tree. The interior hashes do not need to be stored. A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 2GB of RAM as of 2008, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem even if the block headers must be kept in memory.

People keep saying that if you use an SPV wallet instead of running a full node you are not secure, while the whitepaper said:

> It is possible to verify payments without running a full network node. A user only needs to keep a copy of the block headers of the longest proof-of-work chain, which he can get by querying network nodes until he's convinced he has the longest chain, and obtain the Merkle branch linking the transaction to the block it's timestamped in. He can't check the transaction for himself, but by linking it to a place in the chain, he can see that a network node has accepted it, and blocks added after it further confirm the network has accepted it. As such, the verification is reliable as long as honest nodes control the network, but is more vulnerable if the network is overpowered by an attacker. While network nodes can verify transactions for themselves, the simplified method can be fooled by an attacker's fabricated transactions for as long as the attacker can continue to overpower the network. One strategy to protect against this would be to accept alerts from network nodes when they detect an invalid block, prompting the user's software to download the full block and alerted transactions to confirm the inconsistency. Businesses that receive frequent payments will probably still want to run their own nodes for more independent security and quicker verification.

People keep saying that a 51% attack will cause you to lose your coins, while the whitepaper said:

> We consider the scenario of an attacker trying to generate an alternate chain faster than the honest chain. Even if this is accomplished, it does not throw the system open to arbitrary changes, such as creating value out of thin air or taking money that never belonged to the attacker.

This is why I am all in on Bitcoin Cash. Because I see everybody else trade on believing lies.

Mentions:#MB#RAM
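
The whitepaper figure quoted above is easy to check; a minimal sketch of the arithmetic, assuming one 80-byte header per block and six blocks per hour:

```python
# Rough check of the whitepaper's header-growth estimate:
# one 80-byte block header, six blocks per hour, for a year.
HEADER_BYTES = 80
BLOCKS_PER_HOUR = 6

bytes_per_year = HEADER_BYTES * BLOCKS_PER_HOUR * 24 * 365
print(f"{bytes_per_year:,} bytes ≈ {bytes_per_year / 1_000_000:.1f} MB per year")
# 4,204,800 bytes ≈ 4.2 MB per year, matching the ~4.2MB figure quoted above.
```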
r/BitcoinSee Comment

Whoever wins the majority of users would be the "official" one. Like Bitcoin Cash: they forked off, but only a minority of users accepted it, so "our" Bitcoin is the official one. If they had been successful, then Bitcoin would be the chain with the increased block size, and our 1MB block size chain would be "Bitcoin old" or something.

Mentions:#MB
r/CryptoCurrencySee Comment

> You're saying ETH doesn't work, is centralized, and is insecure?

It is, it's controlled by Joe Lubin and Vitalik. If something happened to Vitalik or he decided to quit Ethereum, you would have even more leakage of people to other coins (Solana, Avalanche etc.) and even new coins. Insecure even at the contract design level, where you had Gavin Wood, the creator of Solidity, failing to create a simple multisig, which resulted in the loss of hundreds of millions of dollars when a random noob dev called a function and bricked all the multisigs on the network that used that contract. Not to mention the billions that have been hacked on DeFi due to Ethereum's bad design: https://rekt.news/leaderboard/ Of course, all other Turing-complete chains have similar hacks.

> Anyone wishing to get involved with staking and validating transactions has a much LOWER cost of entry than BTC. [...] ETH validators cost about $100k which is less, while ETH processes 10x the transactions of BTC, AND ETH transactions are much more complex than BTC.

Baffled by your numbers, that's pretty much upside down. ETH validators need 32 ETH, which costs $70k, meanwhile you can get an ASIC for $5k on the secondary market.

> Running a pointless BTC node which downloads 1MB of data every 8 minutes and occasionally relays a transaction is not participating in the BTC consensus. Spending $1 million on mining IS participating, but its not for everyone.

Two different things, but both compose the final consensus. UASF showed the power of the economically involved nodes. Miners were stalling SegWit activation with no arguments whatsoever, but users gave a deadline and pushed the miners into activation. Not even the core devs were involved in that. This is chaotic but decentralized consensus across the globe, no "foundation" to organize stuff, no committees, no Vitalik or BDFLs.

> ETH processes 10x the transactions of BTC, AND ETH transactions are much more complex than BTC.

I've tried ETH in the past and it's convenience over long-term sustainability. They already try to camouflage not validating the whole chain in order to allow somewhat acceptable syncing times. It's a game of Jenga, and you can see from various ETH devs and even the lead maintainer that their path is unsustainable. Having orderbooks on the base layer is insane, and all those thousands of "add sell order", "cancel sell order" entries will be on the chain forever, making every sync slower.

> Layer 2 won't exist in 10 years or less, it's pointless construct.

Layer 2 is how anything scales efficiently. Vertical scaling is always more expensive and risky, and this applies to blockchains too. If you focus on L1, you essentially have one machine handle every single transaction. With L2+ you gain parallelization and you can multiply that one machine's effective transaction rate. I already told you about the blockchains that have several TB of data within just a few years; unless the transaction rate is reduced (i.e. giving up on the coin), this will keep getting larger, asking for higher hardware specs, pricing out plain users, making validation accessible only to companies and datacenters, like what is happening with ETH's Infura, which handles the biggest part of the network.

r/CryptoCurrencySee Comment

You're saying ETH doesn't work, is centralized, and is insecure? lol it's the best smart contract platform and is driving most of the innovation in the crypto space. So all of your hand-wringing is likely for nothing. Anyone wishing to get involved with staking and validating transactions has a much LOWER cost of entry than BTC. Running a pointless BTC node which downloads 1MB of data every 8 minutes and occasionally relays a transaction is not participating in the BTC consensus. Spending $1 million on mining IS participating, but its not for everyone. ETH validators cost about $100k which is less, while ETH processes 10x the transactions of BTC, AND ETH transactions are much more complex than BTC. Layer 2 won't exist in 10 years or less, it's pointless construct. We have countless blockchains and wallets to choose from, and it's only logical to conclude that people (who aren't brainwashed with weird meat-diet Maximalist dogma) will choose whatever works best and is the cheapest and easiest to use. The rest is all just nonsense!

Mentions:#ETH#BTC#MB
r/CryptoCurrencySee Comment

You can thank ordinals and blockstream for not increasing the 1MB blocks and trying to force lightning on people.

Mentions:#MB
r/BitcoinSee Comment

Can you please rate this for me?

* 97.27 %
* 10 peers connected over clearnet
* Hashrate 408 EH/s +/- 50
* Blockchain Size 592 GB

It took 3.5 days to reach 95% and now it feels like it's getting exponentially slower...

My Setup:

* Raspberry Pi 4 - 8GB RAM
* Swap increased to 1G
* SSD connected via USB 3.0, should have min. 300 MB/sec
* Internet connection not that good (50 Mbit down, 10 Mbit up)

Mentions:#RAM#MB
r/CryptoCurrencySee Comment

Rubbish, unless some nation-state is IP blocking all BTC nodes. And if they're doing that, they're surely blocking Tor! Furthermore BTC can be sent by pasting a signed transaction script into a web interface or broadcast over hundreds of different wireless protocols. Only a few countries have made BTC illegal and none of their citizens have trouble broadcasting transactions. China has the most draconian Bitcoin laws and the citizens mostly ignore them and easily circumvent any technical barriers. None of it has anything to do with Bitcoin having 1MB blocks 4eva.

Mentions:#BTC#MB
r/BitcoinSee Comment

How much memory does mempool.space have? It shows 1.78 GB / 300 MB, so it does not seem to purge anything above 300 MB. Maybe I should try to set mine to 2GB and see what happens :)

Mentions:#MB
r/BitcoinSee Comment

Blocks are bigger. 2+ MB

Mentions:#MB
r/CryptoCurrencySee Comment

Bitcoin Cash is, technically, essentially just Bitcoin. Hell, it's closer to the original Bitcoin spec than BTC is now, with Segwit it radically departed from the original recipe. Bitcoin works, this is no shocker. The only thing Bitcoin Cash does differently is accept that a larger block size may be necessary. Even Satoshi included up to 32MB blocks in the original spec, but BTC cling to 1MB like grim death - I assume the maintainers are either nutjobs or saboteurs at this point. There was some talk about the Bilderberg Group owning Blockstream years ago, I dunno. Might have been just a rumor. Either way, there's no reason Bitcoin wouldn't be able to work, the only reason it's not is the massive congestion. Lightning? That's a piece of trash, that's basically Banking 2.0, and at the end of the day you still have to try to transact on the BTC block chain and pay $50 for the privilege, per transaction.

Mentions:#BTC#MB
r/CryptoCurrencySee Comment

> Downloading and validating 8MB every 8 minutes uses literally a tenth of the resources required to stream and decode a standard definition film on a PC.

Accessibility is affected greatly by the storage, which would grow to 4-6x the current rate. It would also make it hard to use in places without great internet, and/or for people caring about their privacy and wanting to use privacy relays like Tor, I2P etc.

> And Lightning needs trusted 3rd parties like Strike and other wallet custodians to work for Average Joe. The UX is still worse than using any coin that isn't BTC on-chain.

I've been running Lightning without middlemen since 2018 and have had active open channels since then that basically saved me a ton on fees and offered near-instant payment finality. On-chain makes sense for big payments and lower time-preference. The coins that promote on-chain are trading decentralization for short-term marketing; even ETH, with not that much activity at the moment, has a *16.5 Terabyte* chain (src: https://etherscan.io/chartsync/chainarchive ). Good luck with that. That's why everyone is using Etherscan and Infura is making millions. Nobody really validates the whole chain, they just trust middlemen, which is what Bitcoin is against ideologically.

> The point is, they'll never increase the blocksize despite every one of their bogus reasons not to scale simply with larger blocks having been soundly disproven.

Increasing the blocksize helps nothing really; it's a short-term relief that costs greatly in decentralization and accessibility. Instead all efforts should push for sustainable engineering solutions, a.k.a. L2+, that are properly designed with trust-minimization in mind. Just like we don't have a single type of vehicle on the roads, sky and sea, the same applies to blockchains.

r/CryptoCurrencySee Comment

tldr; An anonymous wallet has spent about 1.5 BTC, equivalent to roughly $66,000, to write 8.93 MB of encrypted data onto the Bitcoin blockchain through 332 transactions. The nature of the encrypted data is currently unknown as it has not been decoded, and the cryptocurrency community is speculating about the purpose of these transactions. *This summary is auto generated by a bot and not meant to replace reading the original article. As always, DYOR.

Mentions:#BTC#MB#DYOR
r/CryptoCurrencySee Comment

> unfortunately we have to bear the laws of physics and the state of technology. If you want a system to be auditable by everyone without relying on trusted third-parties

Ludicrous. What laws of physics are you referring to? Downloading and validating 8MB every 8 minutes uses literally a tenth of the resources required to stream and decode a standard definition film on a PC. And Lightning needs trusted 3rd parties like Strike and other wallet custodians to work for Average Joe. The UX is *still worse* than using any coin that isn't BTC on-chain.

And who cares who works at Blockstream now? The damage was done to BTC from 2017-20. The point is, they'll never increase the blocksize despite every one of their bogus reasons not to scale simply with larger blocks having been soundly disproven.

I don't disagree about shitcoin price speculation, but BTC is also hugely overpriced given its lack of utility and fungibility.

Mentions:#MB#PC#UX#BTC
r/CryptoCurrencySee Comment

>Why have there been several instances over the years of network backlog and massive increases in tx fees. Because Bitcoin sucks. Bitcoin produces 1MB blocks every 10 minutes which isn’t NEARLY enough block space to be a global currency. Doesn’t matter how many miners or how much hashpower; the network is handicapped by small blocks.

Mentions:#MB
r/CryptoCurrencySee Comment

Do you think Bitcoin blocks were *supposed* to be handicapped at 1MB or would Satoshi have supported an increased blocksize?

Mentions:#MB
r/CryptoCurrencySee Comment

As far as quality I would say casino coin, because it’s an actual project and there is enough data to actually have a clue of where you are in the cycle. MB589 has a pretty clear accumulation structure playing out there in the mb589/xrp chart. It’s also a very low time frame so idk ultimately how long this will last or if it is just a quick pump and dump going on. Prob a quick pump and dump lol. I like schmeckles cause I think it’s funny, and it has a relatively small total supply, and 2 years of ranging. Mind you though, it is 600% up from its lows back in april 22 ( which makes it even more strange because that is where the entire market capitulated, instead schmeckles went up and never went back to those lows). Xoge maybe will do alright, it’s up 800% from here though. I will buy more if it gives me a better entry but not up here. These are all basically rug pulls though, just like every other coin. It’s usually anywhere between a week or a month of excitement and then it’s all over. The large cap “blue chips” go through the same cycle but its usually a month to 3 months. If you pick 10 coins they will all make you some money as all boats rise together, and maybe one will do 10x. That’s my game plan at least. Just be careful. Everyone is in eth, everyone is in sol, and avax now, hardly anyone is messing with the xrpl. So if there is a sudden rotation ide have some money spread over various alts maybe one will do really well and the other will do ok. The next 3-5 months will be interesting.

Mentions:#MB
r/CryptoCurrencySee Comment

1. Segwit was a soft-fork, the 2MB Classic was a hard-fork. 2. CEXs go where fees go, they even had BSV at some point, an ultra-obvious scam. Good luck with PoS etc., XRP/Ripple-scammer Brad Greenhouse or w/e is funding Greenpeace for that. I welcome their attack, because idiots like that just burn money and keep making Bitcoin stronger. PoW and Supply limit are the most concrete foundations of Bitcoin, if anything... agendas will happen on less strong points. But, anyway I hope that you will turn around eventually and stop wasting your time.

Mentions:#MB#BSV#XRP
r/CryptoCurrencySee Comment

If anyone messes with xrpl tokens, MB589 looks pretty good

Mentions:#MB
r/BitcoinSee Comment

Bash script to do a recursive search for files:

* last modified between relevant dates
* size > 100kb and < 2MB
* unusual access permissions, i.e. not 755, not 644

Rename each matching file to wallet.dat, reset access permissions, and try opening the file with pywallet. If it can be opened / read with pywallet then there is a pretty good chance this is your file - at least it is a strong candidate. Good luck!

Mentions:#MB
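
For readers who want to try something like the search described above, here is a minimal, hedged sketch in Python rather than Bash; the date range, size bounds, and "usual" permission set are placeholders to adjust, it only lists candidates (no renaming), and pywallet itself is not invoked.

```python
# Sketch only: walk a directory tree and flag wallet-file candidates by
# modification date, size, and unusual permissions. Adjust the constants
# to your own situation; verify candidates manually (e.g. with pywallet).
import os
import stat
import datetime as dt

START = dt.datetime(2013, 1, 1).timestamp()        # placeholder date range
END = dt.datetime(2015, 12, 31).timestamp()
MIN_SIZE, MAX_SIZE = 100 * 1024, 2 * 1024 * 1024   # >100kB and <2MB
COMMON_MODES = {0o755, 0o644}                      # "usual" permissions to skip

def find_candidates(root):
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue
            mode = stat.S_IMODE(st.st_mode)
            if (START <= st.st_mtime <= END
                    and MIN_SIZE < st.st_size < MAX_SIZE
                    and mode not in COMMON_MODES):
                yield path

for candidate in find_candidates(os.path.expanduser("~")):
    print(candidate)
```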
r/CryptoCurrencySee Comment

wait, you actually think the Blockstream BTC devs will increase the BTC blocksize limit after 5+ years of making up BS reasons why they can't do it, i.e. "muh decentralization", "muh Raspberry Pi won't work", "muh hard drives are expensive", "muh Lightning fixes everything"? You're the one who has no clue about BTC development.

> ever heard of IP

Literally MY example of a protocol w/ layers on top of it that would NEVER work if it was limited to 1MB every 8 minutes 😂. The OSI model is to ABSTRACT away lower level protocols, NOT for scaling. HTTP is 10x slower than raw TCP/IP. And before you even try to argue it, ATM is 100x faster than IP. So AGAIN it's you that is clueless. Keep on smoking that Lightning bong.

r/BitcoinSee Comment

Before the Pi4 was released, hundreds of people ran Bitcoin Core on Pi3. It takes many weeks to initialize, so it makes sense to initialize on a PC and copy the files. Some people struggle to reliably copy files from disk to disk.

If you choose to initialize on a Pi3, plugging in a storage device to the USB2 port will cause the initialization to fail part-way through, due to I/O timeouts built into Core.

> I read online that a RPi4 is recommended

Two reasons:

* recommended minimum RAM is 2GB. Pi3 has 1GB. It will work, if you do not use the Pi X-Window GUI, and use bitcoind, not Bitcoin-QT
* Pi3 takes about 6 weeks to initialize, Pi4 about 4 days, because the CPU is faster, and (for the 8GB RAM Pi4) using more RAM for dbcache speeds up the initial load

> capable of downloading 1MB every 10 minutes

Definitely.

> and to run the ECDSA algorithm to verify the validity of the block

The validity of every transaction. SHA256 is used to validate the block. Block by block, the Pi3 has no problem with this processing load.

Initialization requires the node to process almost one billion transactions. With the default config, it skips the ECDSA for most of them, but it is very slow to build the UTXO database, one TXO at a time.

r/BitcoinSee Comment

* Who is the sender, you or someone else?
* Who is the recipient, you or someone else?
* What software is used for the sender's wallet?

The feerate is 6.46 sat/vB, low enough that the transaction has been purged from the mempools of most nodes. Most nodes are configured to purge the lowest-feerate transactions once their mempool takes more than 300 MB of memory. Currently anything below ~21 sat/vB gets purged; see the front page of [https://mempool.space/](https://mempool.space/) for the current value.

If you are the sender, what you can do is simply replace the transaction with a higher-feerate one. How to do that depends on your wallet software. You may have to somehow remove the transaction locally before being able to spend the same coin again. The most general solution would be to restore the wallet from the seed of the existing wallet.

It's important that you spend the same coin (2b2bc58aac51668d01f2b4b51f8b16a56e29a1a375d9e1f7a0468290e11ff145:0), otherwise the old transaction is still valid and can appear in a block at any time. The new transaction will have a different TXID and likely a lower amount to the recipient (unless the sender adds another input). Make sure the recipient is able to handle this case.

Mentions:#MB
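
To make the numbers above concrete, a small sketch of the fee arithmetic; the 165-vbyte size is an assumed typical one-input, two-output transaction, and the ~21 sat/vB purge floor is the value quoted in the comment, which changes constantly.

```python
# Sketch: total fee in sats is just virtual size times feerate.
# The vsize below is an assumed typical 1-input / 2-output transaction;
# the purge floor is the ~21 sat/vB figure mentioned above, not a constant.
vsize_vbytes = 165
stuck_feerate = 6.46   # sat/vB, from the stuck transaction above
purge_floor = 21.0     # sat/vB, approximate mempool purge threshold

print(f"paid:   {vsize_vbytes * stuck_feerate:.0f} sats")
print(f"needed: {vsize_vbytes * purge_floor:.0f} sats or more to stay in a 300 MB mempool")
# A replacement (RBF) spending the same input at a higher feerate is what
# the comment above recommends once the original has been purged.
```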
r/CryptoCurrencySee Comment

The whole blockchain doesn't need kept or stored. It says so right in the Bitcoin Whitepaper: > 7. Reclaiming Disk Space > Once the latest transaction in a coin is buried under enough blocks, the spent transactions before it can be discarded to save disk space. To facilitate this without breaking the block's hash, transactions are hashed in a Merkle Tree, with only the root included in the block's hash. Old blocks can then be compacted by stubbing off branches of the tree. The interior hashes do not need to be stored. > A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 2GB of RAM as of 2008, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem even if the block headers must be kept in memory. Also, Satoshi said: > Long before the network gets anywhere near as large as that, it would be safe for users to use Simplified Payment Verification (section 8) to check for double spending, which only requires having the chain of block headers, or about 12KB per day. Only people trying to create new coins would need to run network nodes. At first, most users would run network nodes, but as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. A server farm would only need to have one node on the network and the rest of the LAN connects with that one node. > > The bandwidth might not be as prohibitive as you think. A typical transaction would be about 400 bytes (ECC is nicely compact). Each transaction has to be broadcast twice, so lets say 1KB per transaction. Visa processed 37 billion transactions in FY2008, or an average of 100 million transactions per day. > > That many transactions would take 100GB of bandwidth, or the size of 12 DVD or 2 HD quality movies, or about $18 worth of bandwidth at current prices. > > If the network were to get that big, it would take several years, and by then, sending 2 HD movies over the Internet would probably not seem like a big deal. > > Satoshi Nakamoto > https://www.mail-archive.com/cryptography@metzdowd.com/msg09964.html

Mentions:#MB#RAM#ECC
r/CryptoCurrencySee Comment

The majority didn't decide the keep the blocks small. Over 85% of nodes were signalling for a blocksize increase. I think your scenario of 1gb blocks is unrealistic, but if it were the case, the whole blockchain doesn't need kept or stored. It says so right in the Bitcoin Whitepaper: > 7. Reclaiming Disk Space > Once the latest transaction in a coin is buried under enough blocks, the spent transactions before it can be discarded to save disk space. To facilitate this without breaking the block's hash, transactions are hashed in a Merkle Tree, with only the root included in the block's hash. Old blocks can then be compacted by stubbing off branches of the tree. The interior hashes do not need to be stored. > A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 2GB of RAM as of 2008, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem even if the block headers must be kept in memory. Also, Satoshi said: > Long before the network gets anywhere near as large as that, it would be safe for users to use Simplified Payment Verification (section 8) to check for double spending, which only requires having the chain of block headers, or about 12KB per day. Only people trying to create new coins would need to run network nodes. At first, most users would run network nodes, but as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. A server farm would only need to have one node on the network and the rest of the LAN connects with that one node. > > The bandwidth might not be as prohibitive as you think. A typical transaction would be about 400 bytes (ECC is nicely compact). Each transaction has to be broadcast twice, so lets say 1KB per transaction. Visa processed 37 billion transactions in FY2008, or an average of 100 million transactions per day. > > That many transactions would take 100GB of bandwidth, or the size of 12 DVD or 2 HD quality movies, or about $18 worth of bandwidth at current prices. > > If the network were to get that big, it would take several years, and by then, sending 2 HD movies over the Internet would probably not seem like a big deal. > > Satoshi Nakamoto > https://www.mail-archive.com/cryptography@metzdowd.com/msg09964.html

Mentions:#MB#RAM#ECC
r/CryptoCurrencySee Comment

Don't feel too bad about not owning BCH. They've tried doing stress tests and have proven that their network cannot sustain high transaction volumes for more than a few minutes at a time. BCH claims to support 32+ MB blocks. For reference, ETH runs its layer 1 as hard as it can, and it runs the equivalent of like 10MB blocks. The only reason the BCH network is operational, ironically, is because no one uses it. Average blocks are only a few kilobytes.

Mentions:#BCH#MB#ETH
r/CryptoCurrencySee Comment

> If miners and nodes quit supporting Taproot then Ordinal transactions can't be mined anymore

This is false. The only connection between taproot and ordinals is that taproot removed the 100k limit for a transaction input. This allowed for JPEGs up to 3.7MB (but only one per block if it's that big).

But Ordinals is no longer flooding Bitcoin with JPEGs. The current flood is 60-byte JSON fragments as Ordinals data carriers. These were possible under the previous 100k limit. Bitcoin is currently congested, not by Ordinals JPEGs. It is flooded with tiny Ordinals BRC-20 JSON fragments:

ord text/plain;charset=utf-8 {"p":"brc-20","op":"transfer","tick":"FRAM","amt":"13375"}

thousands in each block. The use of txinput scripts as data carriers has been possible since P2SH was added, so long ago.

DOGE does not have taproot. DOGE does not have SegWit. DOGE does have Ordinals, and DRC-20.

Mentions:#MB#DOGE#DRC
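
The "tiny JSON fragments" mentioned above are easy to picture; a minimal sketch that builds the exact payload quoted in the comment and measures it:

```python
# Sketch: the BRC-20 "transfer" payload quoted above, serialized and measured.
import json

payload = {"p": "brc-20", "op": "transfer", "tick": "FRAM", "amt": "13375"}
data = json.dumps(payload, separators=(",", ":")).encode("utf-8")

print(data.decode())       # {"p":"brc-20","op":"transfer","tick":"FRAM","amt":"13375"}
print(len(data), "bytes")  # 58 bytes, in line with the ~60-byte figure above
```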
r/CryptoCurrencySee Comment

He designed it to remove transactions after you have verified them, and not store them forever. >Once the latest transaction in a coin is buried under enough blocks, the spent transactions before it can be discarded to save disk space. To facilitate this without breaking the block's hash, transactions are hashed in a Merkle Tree [7][2][5], with only the root included in the block's hash. Old blocks can then be compacted by stubbing off branches of the tree. The interior hashes do not need to be stored. A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 2GB of RAM as of 2008, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem even if the block headers must be kept in memory

Mentions:#MB#RAM
r/CryptoCurrencySee Comment

>not in explaining anything that might make Bitcoin seem anything less than a divinely perfect gift from our lord and savior Satoshi. Satoshi did not design it like this. Storing a cup of coffee for all eternity on the blockchain is ridiculous. Only a madman would design it like that. Satoshi wrote in the whitepaper: >Once the latest transaction in a coin is buried under enough blocks, the spent transactions before it can be discarded to save disk space. To facilitate this without breaking the block's hash, transactions are hashed in a Merkle Tree , with only the root included in the block's hash. Old blocks can then be compacted by stubbing off branches of the tree. The interior hashes do not need to be stored. A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 2GB of RAM as of 2008, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem even if the block headers must be kept in memory. So the design makes it so that transactions are discarded, not stored forever. Then the saboteurs came in around 2015-2016 and sabotaged it. And now you'll have fun paying fees!

Mentions:#MB#RAM
r/CryptoCurrencySee Comment

the sats/byte is based on what other people bid for their tx. Bitcoin is artificially limited to processing about 2 MB per 10 minutes = 3.3 kilobyte per second. Since miners are not allowed to process more by the code they will only process the transactions that pay them the most. But even if you pay them enough, somebody could come after you and pay them more meaning your tx won't be processed. This is not how Satoshi designed it by the way.

Mentions:#MB
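
The throughput figure above checks out; a quick sketch, taking the ~2 MB per 10-minute block that the comment assumes as the average:

```python
# Quick check of the "~3.3 kB/s" claim: ~2 MB of block space every 10 minutes.
block_bytes = 2_000_000   # assumed average block size, per the comment above
block_interval_s = 600    # one block every ~10 minutes

print(f"{block_bytes / block_interval_s / 1000:.1f} kB/s")  # ≈ 3.3 kB/s
```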
r/CryptoCurrencySee Comment

> Not a sudden jump to 32mb like what BCH did

BCH went to 32 MB because they mathematically proved it was safe up to 100 MB. After 7 years of testing and writing code, BCH is now finally getting a dynamic blocksize, where the size can go up or down depending on demand.

Mentions:#BCH#MB
r/CryptoCurrencySee Comment

Then even the Lightning Network will fail, as its design called for 100 MB blocks.

Mentions:#MB
r/CryptoCurrencySee Comment

Yep. Miners have no incentive to eliminate SegWit or Taproot. Also, if miners stop supporting Segwit and Taproot transactions, it will also eliminate Lightning and reduce the average block size from ~2MB to exactly 1MB, slowing down the entire Bitcoin network. Another easier soft fork would be to greatly increase OP_PUSH fees/vbyte, which targets most Inscriptions since you can't fit a BRC-20 in the 80 byte OP_RETURN.

Mentions:#MB#OP#PUSH
r/CryptoCurrencySee Comment

> Sure it scales a bit more, but eventually it breaks down.

Says you. But it scales just fine. As long as you don't try to store every purchase of a cup of coffee on the blockchain for all eternity, nothing breaks down.

> Once the latest transaction in a coin is buried under enough blocks, the spent transactions before it can be discarded to save disk space. To facilitate this without breaking the block's hash, transactions are hashed in a Merkle Tree [7][2][5], with only the root included in the block's hash. Old blocks can then be compacted by stubbing off branches of the tree. The interior hashes do not need to be stored. A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 2GB of RAM as of 2008, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem even if the block headers must be kept in memory.

Mentions:#MB#RAM
r/CryptoCurrencySee Comment

Bruh, the current block size and halvings were NEVER in the original protocol. Their current parameters are the result of early changes. That 1MB block size and the 21M limit was added later on.

Mentions:#MB
r/CryptoCurrencySee Comment

No, not at all. BCH does not store its transactions forever in the blockchain; once you verify a transaction on your machine it eventually gets deleted and you only store the result of that verification. A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 16GB of RAM as of 2023, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem, even if the block headers must be kept in memory. It's absolutely ridiculous that on Bitcoin Core they decide to forever store even the purchase of a cup of coffee and then complain that their system does not scale; of course it does not! Storing shit like that forever is batshit insane, what idiot designed Core like that?

Then there is sending the blocks. Unlike with Bitcoin Core, BCH miners only send each other the difference between their block and the mempool. That way a block containing 1 GB worth of tx can be assembled with just 10 kb worth of data.

Then there is the mempool verification. Unlike Bitcoin Core, BCH blocks have transactions ordered in a canonical way; this allows the validation to be done in parallel, multithreaded. So your other 15 cores are not wasting their time like with Bitcoin Core. A Raspberry Pi running BCHN can validate a 1 GB block in under 3 minutes.

There is much, much more, but BCH is 7 years ahead of Bitcoin Core when it comes to scaling. I am not even talking about the miner-validated tokens, which unlike ordinals don't cause a complete mess on chain. And things like [BCHbull](https://bchbull.com/) which allow you to use your own SPV wallet to hedge your BCH in a smart contract so you can stabilize the dollar value. It's DeFi without the fees. BCH users need zero third parties or other software, just their wallets, to hedge on chain without having to pay the extortion fees you have on DeFi like Ethereum. And with fees that are predictable so you can build a business on top of it...

sources: reddit dot com/r/btc/comments/bas60b/by_the_power_of_ctor_xthinner_is_now_working_with/ ; bchbull dot com ; read dot cash @mtrycz/how-my-rpi4-handles-mining-1gb-blocks-e5d09d83

Mentions:#BCH#MB#RAM
r/CryptoCurrencySee Comment

No, not at all. BCH does not store its transactions forever in the blockchain; once you verify a transaction on your machine it eventually gets deleted and you only store the result of that verification. A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 16GB of RAM as of 2023, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem, even if the block headers must be kept in memory. It's absolutely ridiculous that on Bitcoin Core they decide to forever store even the purchase of a cup of coffee and then complain that their system does not scale; of course it does not! Storing shit like that forever is batshit insane, what idiot designed Core like that?

Then there is sending the blocks. Unlike with Bitcoin Core, BCH miners only send each other the difference between their block and the mempool. That way a block containing 1 GB worth of tx can be assembled with just 10 kb worth of data.

Then there is the mempool verification. Unlike Bitcoin Core, BCH blocks have transactions ordered in a canonical way; this allows the validation to be done in parallel, multithreaded. A Raspberry Pi running Bitcoin Cash Node v27.0.0 can verify a 1 GB block in under 50 seconds.

There is much, much more, but BCH is 7 years ahead of Bitcoin Core when it comes to scaling. I am not even talking about the miner-validated tokens, which unlike ordinals don't cause a complete mess on chain. And things like BCHbull which allow you to use your own SPV wallet to hedge your BCH in a smart contract so you can stabilize the dollar value. It's DeFi without the fees.

Mentions:#BCH#MB#RAM
r/BitcoinSee Comment

Adding to that: how much RAM does your laptop have? You might want to add `dbcache=x` where `x` is your RAM size (in MB) to the bitcoin.conf file in your Bitcoin Core folder in order to utilize as much RAM for the initial sync as your machine allows

Mentions:#RAM#MB
r/BitcoinSee Comment

So I'm giving it a try now, but it's looking like it won't make a difference. Recall if you see the total disk stats from my Bitcoin Core sync, it only reads 60MB from disk over the entire course of initial sync. This is because I allocate 24GB of RAM to the dbcache and thus it already keeps the entire chainstate in memory.

Mentions:#MB#RAM
r/BitcoinSee Comment

I just look at the process stats in Linux. I have seen spikes to a couple hundred MB/s. IIRC my drive's max theoretical throughput is around 350 MB/s. The latest generation NVME drives can do more like 1000 MB/s but I'd need a newer generation motherboard in order to achieve that.

Mentions:#MB
r/BitcoinSee Comment

How do you measure data reading and writing on the disk? or in general how you measure each of the variables. In the 2018 post you say that in practice you don't get more than 100 MB/s on your NVMe SSD, is it still the case?

Mentions:#MB
r/BitcoinSee Comment

Those are not bitcoin. Bitcoin doesn’t take 4MB witness data. It’s a hack and loophole using the pretext of sending 1 sat.

Mentions:#MB
r/BitcoinSee Comment

I’ve sent a transaction with electrum three days ago and it is still unconfirmed, tried bumping fee, but MB from tip is changing wildly, at some point it got to 9 MB but now i am down to 30 MB. The transaction is slightly less than 0,008, fee is about 50 sat/byte What are my options? How long could it take to process my transaction?

Mentions:#MB
r/BitcoinSee Comment

No he didn't. And it's not 1MB anymore regardless.

Mentions:#MB
r/BitcoinSee Comment

Assuming the minimum is 1 sat/byte, miners mine 1MB blocks. I wager the upper feasible limit on median chain transactions should be around 100k sat/byte. I'd personally be surprised if the median transaction fee for a year was over 10k sat/byte. It can always spike way up for a short while. It does not really matter what the price of BTC is, there is only so much of it, and only a small percentage of it moves around every day.

Mentions:#MB#BTC
r/CryptoCurrencySee Comment

Are people really spending anywhere near that much time watching videos or downloading while they aren't at home, work, or in literally any building/public space with wifi? Streaming high definition video at 900MB/hr (info from AT&T's website) for 8 hours a day all 30 days a month would only put you at 216 GB's. You get 1.666GB of data a day to not hit 50 in a month, you can do this people.

Mentions:#MB
r/BitcoinSee Comment

> 4096 - 5000mb

Show me the code lines or this "report about 3-5MB blocks" or get banned. You should have corrected your error when called out instead of being snarky.

Mentions:#MB
r/BitcoinSee Comment

> And since then, there has been no solution to the issue.

You are aware that we have blocks up to 4MB now (in 2017 only 1MB blocks were possible)?

> So this whole "we need to get smart, patient, and let other solutions be developed" holds little merit if it's a decade old issue with zero potential solutions in sight.

Well, beyond the blocksize increase with Segwit, LN is being experimented with, and a variety of other scaling concepts like SNARKs, Utreexo, drivechains, and sidechains are being talked about. Complaining about the lack of progress and innovation in an open-source world (where people mostly work on it in their free time or at most on time-limited grants) is useless. Be the change you want to see in the development, and all that :)

Mentions:#MB
r/BitcoinSee Comment

Give me a block number of 5MB block.

Mentions:#MB
r/BitcoinSee Comment

Satoshi literally said the 1mb limit is temporary. Satoshi Nakamoto, the creator of Bitcoin, introduced the 1MB block size limit in 2010. This decision was primarily a temporary anti-spam measure. Satoshi mentioned that the block size limit could be increased when needed, as long as the increase is planned well ahead and the network grows to support higher capacities. The limit was meant to prevent potential denial of service (DoS) attacks by limiting the size and, therefore, the cost of processing each block. However, Satoshi did not provide extensive commentary on this feature; most discussions were practical and focused on immediate network needs.

Mentions:#MB
r/BitcoinSee Comment

Blocksize/fees is effectively a spam protection, and a "too high" blocksize will decrease decentralization/security: https://gist.github.com/chris-belcher/a8155df5051bb3e3aa96 We already have had an effective blocksize increase, so blocks are now 1-4MB in size. So if we just increase it again, they'll become full again and we have the same problem, just a few years later and coupled with the mentioned disadvantages in security/decentralization.

Mentions:#MB
r/BitcoinSee Comment

Blocks aren't 3-5MB. Focus on finishing your degrees.

Mentions:#MB
r/BitcoinSee Comment

That isn't a solution, though. It might ease the problem right now a bit. But even with 200MB per block, Bitcoin won't be able to accommodate all transactions in a widely used payment network. Thus, payments have to move up into a second layer.

Mentions:#MB
r/CryptoCurrencySee Comment

you’re absolutely right. Ethereum block size is smaller (~0.2MB) than bitcoin (1MB), but block time is 12 seconds in comparison with bitcoin (10 minutes) so throughout on ethereum is higher. The main reason for this is because bitcoin block size is VERY conservative (I recommend you to look for block size wars on the internet). Meanwhile, ethereum updates this size to keep with the current processing capacity of hardware to keep updated with the technological progress

Mentions:#MB
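
Put side by side, the two chains' raw base-layer throughput, using the rough block sizes and block times from the comment above, looks like this:

```python
# Sketch: raw bytes-per-second of block space, using the rough figures above.
chains = {
    "Bitcoin":  (1_000_000, 600),  # ~1 MB block every ~600 s
    "Ethereum": (200_000, 12),     # ~0.2 MB block every ~12 s
}
for name, (block_bytes, block_time_s) in chains.items():
    print(f"{name}: {block_bytes / block_time_s / 1000:.1f} kB/s")
# Bitcoin: 1.7 kB/s, Ethereum: 16.7 kB/s; hence the higher throughput noted above.
```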
r/CryptoCurrencySee Comment

> BCH is a fraud, because their promise of larger blocks fixing scalability is a lie to begin with. No matter if you make blocks 1MB, 8MB or even larger, you will never be able to have a decentralized blockchain that handles tx on an international scale.

Satoshi would disagree with you. I would argue most crypto fanboys don't understand scaling.

Mentions:#BCH#MB
r/BitcoinSee Comment

> New question: why does the bitcoin network, the real one, always seem to vote against increasing the block size? What do you mean, "always"? In 2017 there was an effective increase, so instead of 1MB cap we have blocks up to 4MB (avg is 1.5-2MB currently). But apart from that, here are very valuable write ups answering your question: https://gist.github.com/chris-belcher/a8155df5051bb3e3aa96

Mentions:#MB
r/BitcoinSee Comment

It has some design restrictions that are deliberate, and so far the network has always voted against changing them. The BTC blockchain produces a new block approximately every ten minutes, rain or shine. It can vary a little, but the intent is every 10 minutes. Think of each block as a shoebox full of receipts. The box is a fixed size, and it has a limit on the number of receipts it can hold. This limit is 1MB. The argument over increasing this block size has existed for years. Fees increase when the waiting list (the mempool) to get into the shoebox gets longer and longer. You can skip to the head of this line by paying more fees. It is supply and demand. Once the mempool gets smaller, the fees go down.

Mentions:#BTC#MB
r/BitcoinSee Comment

how the fuck can a list of 4MB blocks support a content platform

Mentions:#MB
r/BitcoinSee Comment

Yeah, that all checks out for size of fulcrum's index and the blockchain. So sounds like you're using the default mempool size, which is 300 MB. I'd love to know what [mempool.space](https://mempool.space)'s conf file looks like. Ultimately though, like I said earlier, I think this is a case of just trusting your own node, rather than some one else's because you don't know how theirs is set up.

Mentions:#MB
r/BitcoinSee Comment

Yes, that was one of the driving forces behind Bitcoin Cash and segwit. Bitcoin Cash used 32MB blocks. The idea was fine, but it was never tested as BCH never took over to the point where those 32MB blocks could be stressed. Segregated witness was released for Bitcoin around the same time Bitcoin Cash forked. Segwit addressed many of the same issues, but took the approach of optimizing what went into the block so more could fit, rather than just increasing the block size. Unfortunately neither are scalable. Increasing the block size and optimizing blocks are arguably band-aids; they didn't allow for the network to support TPS on the scale of credit card networks, for instance. IMO (and I think many share this opinion) Bitcoin is best utilized as a settlement network for large transactions. Higher layer protocols are where consumers need to be. BTC, BCH, ETH etc. by themselves just don't have the capability of handling the required number of transactions to be used as a p2p payment method.

r/BitcoinSee Comment

Has anyone considered moving the block size to 10 or say, 50 MB to help relieve the high fees?

Mentions:#MB
r/BitcoinSee Comment

Bitcoin Core installation binaries can be downloaded from bitcoincore.org and the source-code is available from the [Bitcoin Core source repository](https://github.com/bitcoin/bitcoin). 26.0 Release Notes Bitcoin Core version 26.0 is now available from: https://bitcoincore.org/bin/bitcoin-core-26.0/ This release includes new features, various bug fixes and performance improvements, as well as updated translations. Please report bugs using the issue tracker at GitHub: https://github.com/bitcoin/bitcoin/issues To receive security and update notifications, please subscribe to: https://bitcoincore.org/en/list/announcements/join/ How to Upgrade If you are running an older version, shut it down. Wait until it has completely shut down (which might take a few minutes in some cases), then run the installer (on Windows) or just copy over /Applications/Bitcoin-Qt (on macOS) or bitcoind/bitcoin-qt (on Linux). Upgrading directly from a version of Bitcoin Core that has reached its EOL is possible, but it might take some time if the data directory needs to be migrated. Old wallet versions of Bitcoin Core are generally supported. Compatibility Bitcoin Core is supported and extensively tested on operating systems using the Linux kernel, macOS 11.0+, and Windows 7 and newer. Bitcoin Core should also work on most other Unix-like systems but is not as frequently tested on them. It is not recommended to use Bitcoin Core on unsupported systems. Notable changes P2P and network changes Experimental support for the v2 transport protocol defined in BIP324 was added. It is off by default, but when enabled using -v2transport it will be negotiated on a per-connection basis with other peers that support it too. The existing v1 transport protocol remains fully supported. Nodes with multiple reachable networks will actively try to have at least one outbound connection to each network. This improves individual resistance to eclipse attacks and network level resistance to partition attacks. Users no longer need to perform active measures to ensure being connected to multiple enabled networks. (#27213) Pruning When using assumeutxo with -prune, the prune budget may be exceeded if it is set lower than 1100MB (i.e. MIN_DISK_SPACE_FOR_BLOCK_FILES * 2). Prune budget is normally split evenly across each chainstate, unless the resulting prune budget per chainstate is beneath MIN_DISK_SPACE_FOR_BLOCK_FILES in which case that value will be used. (#27596) Updated RPCs Setting -rpcserialversion=0 is deprecated and will be removed in a future release. It can currently still be used by also adding the -deprecatedrpc=serialversion option. (#28448) The hash_serialized_2 value has been removed from gettxoutsetinfo since the value it calculated contained a bug and did not take all data into account. It is superseded by hash_serialized_3 which provides the same functionality but serves the correctly calculated hash. (#28685) New fields transport_protocol_type and session_id were added to the getpeerinfo RPC to indicate whether the v2 transport protocol is in use, and if so, what the session id is. A new argument v2transport was added to the addnode RPC to indicate whether a v2 transaction connection is to be attempted with the peer. Miniscript expressions can now be used in Taproot descriptors for all RPCs working with descriptors. (#27255) finalizepsbt is now able to finalize a PSBT with inputs spending Miniscript-compatible Taproot leaves. (#27255) Changes to wallet related RPCs can be found in the Wallet section below. 
New RPCs

loadtxoutset has been added, which allows loading a UTXO snapshot of the format generated by dumptxoutset. Once this snapshot is loaded, its contents will be deserialized into a second chainstate data structure, which is then used to sync to the network’s tip. Meanwhile, the original chainstate will complete the initial block download process in the background, eventually validating up to the block that the snapshot is based upon. The result is a usable bitcoind instance that is current with the network tip in a matter of minutes rather than hours. UTXO snapshots are typically obtained via third-party sources (HTTP, torrent, etc.), which is reasonable since their contents are always checked by hash. You can find more information on this process in the assumeutxo design document (https://github.com/bitcoin/bitcoin/blob/master/doc/design/assumeutxo.md). getchainstates has been added to aid in monitoring the assumeutxo sync process.

A new getprioritisedtransactions RPC has been added. It returns a map of all fee deltas created by the user with prioritisetransaction, indexed by txid. The map also indicates whether each transaction is present in the mempool. (#27501)

A new RPC, submitpackage, has been added. It can be used to submit a list of raw hex transactions to the mempool to be evaluated as a package using consensus and mempool policy rules. These policies include package CPFP, allowing a child with high fees to bump a parent below the mempool minimum feerate (but not minimum relay feerate). (#27609) Warning: successful submission does not mean the transactions will propagate throughout the network, as package relay is not supported. Not all features are available. The package is limited to a child with all of its unconfirmed parents, and no parent may spend the output of another parent. Also, package RBF is not supported. Refer to doc/policy/packages.md for more details on package policies and limitations. This RPC is experimental. Its interface may change.
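A minimal sketch of watching the assumeutxo background sync through the new getchainstates RPC, assuming bitcoin-cli is on the PATH and can reach a 26.0 node; the snapshot path is illustrative and the exact result fields may vary between versions:

```python
# Minimal sketch: watch the assumeutxo background sync via the getchainstates RPC.
# Assumes bitcoin-cli is on the PATH; the snapshot path is illustrative and the
# result field names follow the 26.0 documentation.
import json
import subprocess
import time

def cli(*args):
    out = subprocess.run(["bitcoin-cli", *args],
                         capture_output=True, text=True, check=True).stdout
    return json.loads(out)

# Optionally load a previously downloaded snapshot first (path is illustrative):
# cli("loadtxoutset", "/path/to/utxo-snapshot.dat")

while True:
    for cs in cli("getchainstates").get("chainstates", []):
        status = "validated" if cs.get("validated") else "assumed-valid"
        print(cs.get("blocks"), "blocks,", status)
    time.sleep(60)
```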

r/BitcoinSee Comment

If it's correct :) — current_hash_rate = 400M TH/s, avg_block_size = 1.6 MB. According to my computer: create_sha256_of_1.6MB_data = 4867 ms, create_an_address = 89771 ms, so the rate is 89771 / 4867 ≈ 18.5. The address rate at this hash power is approx. 400M / 18.5 = 21.6M, i.e. address_rate = 21.6M TA/s — 21.6 × 10^18 addresses per second. Creating all addresses at that rate takes 2^160 ÷ (21.6 × 10^18) ≈ 6.8 × 10^28 seconds, and 6.8 × 10^28 / 60 / 60 / 24 / 365 ≈ 2.1 × 10^21 years.
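A quick sanity check of that back-of-the-envelope calculation (the benchmark figures are the commenter's own; the rest is plain arithmetic):

```python
# Quick sanity check of the arithmetic above (the timing figures are the
# commenter's own benchmarks; everything else is plain arithmetic).
hashrate = 400e6 * 1e12              # 400M TH/s, in hashes per second
slowdown = 89771 / 4867              # address generation vs one SHA-256 pass (~18.4x)
address_rate = hashrate / slowdown   # ~2.2e19 addresses per second

seconds = 2**160 / address_rate              # time to enumerate the 160-bit address space
years = seconds / (60 * 60 * 24 * 365)
print(f"{seconds:.1e} s ≈ {years:.1e} years")   # ~6.7e28 s ≈ ~2.1e21 years
```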

Mentions:#TH#MB
r/BitcoinSee Comment

> Blocksize could easily be 4 or 8 MB and it would have very little impact on decentralization or keeping network in sync, but have a tremendous impact on useability

Larger blocks would roughly mean that the BRC-20 token releases would happen sooner, and that's it.

> Why not increase blocksize a little bit?

So the answer is: it isn't worth hard forking due to a small advantage.

Mentions:#MB
r/BitcoinSee Comment

> Blocksize could easily be 4 or 8 MB and it would have very little impact on decentralization or keeping network in sync, but have a tremendous impact on useability and actually make opening lightning channels not cost an arm and a leg, which would promote self custodial lightning use.

Let me try. You think the problem is that the Bitcoin blocks are full. You think this problem could be fixed by making the size of each block bigger. This, right here, is where **you're wrong**. The blocks are full NOT because we're sending coins or opening channels on r/TheLightningNetwork, **the blocks are full because** orditards are using the chain to **inscribe shitcoin tokens or jpegs**. Now, let's make the blocks 10MB (40MB segwit). What do you think will happen? The orditards will inscribe 10X more tokens and jpegs and will push the fees up again. Congrats! Also, instead of 50k nodes, the Bitcoin network will have only 1k of them, because not many people can afford a 10TB SSD, 16GB of RAM, a new CPU and a fiber-optic internet connection.

Mentions:#MB#RAM#CPU
r/BitcoinSee Comment

Why would you not take the time to read it? 4-8MB blocks would make a world of difference. Literally nobody is going to self-custody lightning right now if it's costing an arm and a leg to open channels, and the entire argument for small blocks was the self-custody argument… LN is a mess right now with everybody just using custodial solutions like WoS, which is highly problematic, and we're not even in a bull market.

Mentions:#MB
r/BitcoinSee Comment

That's not how it works. A block can contain a maximum of 4 MB of data, so there is a limit to how many transactions can be processed in one block. A larger transaction takes up more block data, and since fees are bid on a per-byte basis, larger transactions typically pay higher total fees. You can choose not to have your transaction confirmed immediately and thus pay a lower transaction fee. On top of that it's a supply-and-demand market: in times of high demand the transaction fees go up, but that varies. And finally, the more people that adopt Bitcoin, the more nodes and miners there will be; the more miners there are, the less each one earns per block mined, and fees go down. I can't really explain it fully. But this is not the problem you think it is. That flaw would have been known long, long ago. It's been more than a decade. Banks especially aren't going all in without heaps of top crypto, blockchain, finance and algorithm people checking everything for flaws and potential risks. And that's what they've done for the last decade. Now they're excited and so am I.
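A tiny illustration of the per-byte fee bidding described above; the feerate and transaction sizes are made-up examples:

```python
# Fee bidding is per virtual byte: total fee = feerate (sat/vB) * vsize (vB).
# At the same feerate, a bigger transaction simply pays more in total.
feerate_sat_per_vb = 25      # hypothetical market feerate
small_tx_vsize = 140         # e.g. one input, two outputs
large_tx_vsize = 850         # e.g. many inputs

print(small_tx_vsize * feerate_sat_per_vb, "sats")   # 3500 sats
print(large_tx_vsize * feerate_sat_per_vb, "sats")   # 21250 sats
```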

Mentions:#MB
r/CryptoCurrencySee Comment

ICP is the only blockchain where you could run something like this entirely on chain. It has HTTPS outcalls and can talk to web2 APIs directly from the chain and pull data onto the chain. Smart contracts can be up to 10MB in size, whereas others are limited to kilobytes and cost too much to hold all the data on chain.

Mentions:#ICP#MB
r/BitcoinSee Comment

> Look at memepool Do you mean [mempool.space](https://mempool.space), by any chance? That is useful information, but it doesn't reflect the reality of most of the network. Every node has its own mempool, and the default size is 300 MB. Transactions are prioritized by their fee rate, and if the node's mempool is full and a given transaction is paying a lower rate than all other transactions in that mempool, the node will purge it to avoid going over capacity. Also, by default, a node will purge a transaction after holding it for 2 weeks. If enough of the network purges a transaction, for most purposes, it's no longer truly pending, it's just gone. Mempool.space has some nodes configured with much larger mempools so it can keep track of transactions even after most of the network has forgotten their existence. They are interesting and/or useful to know about, because even though they no longer exist as far as the majority of the network is concerned, and therefore will almost certainly *never* get mined if left alone, it's still possible—indeed likely—that some people have copies and may rebroadcast them when conditions are favorable. Sounds like sort of a good thing, right? Well, maybe. Or maybe you're one of the people who made such a transaction, and you've decided you don't want to spend the coins involved after all, or you have some reason that you need certainty about the fate of such coins even if whether or not they are spent isn't that important to you. But the above means the possibility of them suddenly being spent at any time in the future is always there. To eliminate such uncertainty requires making sure that at least one of them *is* spent, even if that just means sending the money to yourself. (The important thing is to invalidate the transaction spending them, achievable through any conflict of spending, so it doesn't matter if not all of the coins are spent.)
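A toy model of the feerate-based eviction described above; this is only an illustration of the policy, not Bitcoin Core's actual implementation, and all numbers are made up:

```python
# Toy illustration of feerate-based eviction (NOT Bitcoin Core's actual code):
# when the pool exceeds its size budget, the lowest-feerate entries are dropped
# first, which is why low-fee transactions quietly disappear under load.
import heapq

class TinyMempool:
    def __init__(self, max_vbytes):
        self.max_vbytes = max_vbytes
        self.used = 0
        self.heap = []                      # min-heap keyed by feerate (sat/vB)

    def add(self, txid, vsize, fee_sats):
        heapq.heappush(self.heap, (fee_sats / vsize, txid, vsize))
        self.used += vsize
        evicted = []
        while self.used > self.max_vbytes:  # over budget: purge cheapest first
            _, low_txid, low_vsize = heapq.heappop(self.heap)
            self.used -= low_vsize
            evicted.append(low_txid)
        return evicted

pool = TinyMempool(max_vbytes=1_000)
pool.add("a", 400, 4_000)            # 10 sat/vB
pool.add("b", 400, 800)              #  2 sat/vB
print(pool.add("c", 400, 8_000))     # 20 sat/vB -> ['b'] (the cheapest is purged)
```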

Mentions:#MB
r/BitcoinSee Comment

I have some 5 1/4 inch diskettes from about 40 years ago, and they are unreadable on any of my machines today. They're the new 1.2 MB kind as well, not the old 360K single-sided.

Mentions:#MB
r/CryptoCurrencySee Comment

All you need to read is point 7 in the whitepaper to realize that you have been bamboozled. Ever heard somebody say that storing the purchase of a cup of coffee forever on your hard drive is ridiculous? And they are right of course, but that's never been how Bitcoin was designed or even currently IS designed. To make Bitcoin work we don't need to store all our transactions forever, only 4.2 MB of block headers a year. See for yourself.

> 7. Reclaiming Disk Space
> Once the latest transaction in a coin is buried under enough blocks, the spent transactions before it can be discarded to save disk space. To facilitate this without breaking the block's hash, transactions are hashed in a Merkle Tree [7][2][5], with only the root included in the block's hash. Old blocks can then be compacted by stubbing off branches of the tree. The interior hashes do not need to be stored.
> A block header with no transactions would be about 80 bytes. If we suppose blocks are generated every 10 minutes, 80 bytes * 6 * 24 * 365 = 4.2MB per year. With computer systems typically selling with 2GB of RAM as of 2008, and Moore's Law predicting current growth of 1.2GB per year, storage should not be a problem even if the block headers must be kept in memory.

Let me tell you a story to illustrate the weakness of the human mind and how easy it is to exploit it. During the summer of 2010 I first heard about Bitcoin and read the whitepaper. At the time I was the second in command of a small IT company; my friend was my boss. I printed out a copy of the whitepaper and gave it to him. "You should read this," I said. "I think there is some money to be made." He did not read it. Over the years I worked there I kept asking him: did you read the Bitcoin whitepaper? He always had an excuse. I left that company to do my own thing. Years and years later, in 2018, after the first big bull run to 20K, my ex-boss changed his entire company and became a Litecoin miner. We would actually trade ASICs once in a while, and most of the time we would use BCH for payment. Sometimes I would hire him and pay him with BCH, sometimes he would hire me or buy hardware from my company and pay with BCH. I always asked him: did you read the Bitcoin whitepaper now? Do you understand it now? He would always still have an excuse. His company eventually went bankrupt; he got hacked. He still has no idea how Bitcoin works, or what it is, or what it's not. He has never forced himself to study it. He is a smart man, he could understand it. But it would take effort. Effort he was not willing to expend.

You guys have been bamboozled, but it's too late now. All SHA256 coins will die. All of them were explicitly designed with the block reward as a bribe and a mechanism to distribute the coins to the people that cared to get them. The block reward is a smaller starter engine for a bigger engine; some jet engines are actually started like this. This bigger engine never got started: there are no 100 million users that use crypto as currency on a daily basis. For any proof-of-work based crypto, the only way to be self-sustainable is to have a large number of people making a lot of transactions, all paying a very small fee, that combined provides security for the system.
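The whitepaper's storage arithmetic quoted above, spelled out:

```python
# The whitepaper's storage arithmetic: one 80-byte header per block,
# ~6 blocks per hour, 24 hours a day, 365 days a year.
per_year = 80 * 6 * 24 * 365
print(per_year, "bytes ≈", round(per_year / 1e6, 1), "MB per year")   # 4204800 ≈ 4.2 MB
```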

r/CryptoCurrencySee Comment

Increasing the block size was the best thing that they could have done. There are still people that tell me:

> well the issue was that people couldn't run nodes because the blockchain was too big, and the 8MB blocks means it takes up 8x space, right? Also, the nodes are important for decentralization.

...proving that most people have no idea what the LN is actually doing; they just think it was way better than the alternative. Thanks, Blockstream.

Mentions:#MB
r/BitcoinSee Comment

I think it should work if you add a line to your bitcoin.conf file: `prune=10000`. 10000 here means 10000MB (10GB); you can adjust the number as needed (I'm not sure if you can go much lower than that currently). Restart Bitcoin Core after you've added that line and saved the .conf file.

Mentions:#MB
r/BitcoinSee Comment

Yep, the Segwit update back in August 2017 adjusted the total block size from 1MB to a theoretical 4MB max size by shifting to 4 million 'weight units'. You can [read up on it here](https://bitcoinmagazine.com/guides/what-is-the-bitcoin-block-size-limit).

Mentions:#MB
r/CryptoCurrencySee Comment

Currently it's a 32MB block size, so at least 32 times the capacity of current Bitcoin. But years and years ago tests were already run with, I think, gigabyte block sizes. They reached several thousand transactions per second, obviously not without issues. I don't personally believe any proof-of-work crypto will become dominant at this point, but the fact is that any crypto that did would need giant data centers to run out of. Even Visa, which does no crypto processing or mining at all, runs many giant data centers just to service current-day credit cards. Believing crypto could be run fully decentralized off people's Pis is silly.

Mentions:#MB
r/CryptoCurrencySee Comment

If you were to use Bitcoin Cash, which is essentially the old Bitcoin spec with some tweaks, it would cost fractions of a cent. The reason for that being that a) it's much less popular still and b) it has been given enough capacity so there's no congestion. So everything moves with sub-cent fees. Satoshi originally envisioned that some transactions should go free, even. Then greedy idiot fuckers took over and now Bitcoin is stuck with a 1MB block size and a total worldwide capacity of six (6) transactions per second. The end.

Mentions:#MB
r/CryptoCurrencySee Comment

Blast from the past. Maybe this *is* a bull market. In short: because Bitcoin sucks and has no capacity. It maxes out at six transactions per second, and as soon as the price of it goes way up or way down, there's gridlock. Everyone wants to move some, and at 6 transactions per second total, they start outbidding each other to do it in a timely fashion. With a measly 1MB block size, Bitcoin can't transact. The original spec had provision for going to 32MB, but the people who control the code are either saboteurs or nutjob zealots or both and they refuse to increase the block size. That's literally why Bitcoin Cash was created at one point - to try to create a sane Bitcoin. It went so-so, the coin was fine, but it didn't take over. Many many zealots also pooh-poohed it and do to this day. Anyway, transfer fees have been over $50 before and if we see a real runaway bull market, they will be again. Probably more so.

Mentions:#MB
r/CryptoCurrencySee Comment

> Segwit and taproot updates apparently enabled much of the spam transactions we're seeing lately

Not true. Taproot removed transaction size limits, which allowed larger files (NFT JPEGs) to be stored in a single transaction. SegWit allows these to be 3.7MB (with a $400 tx fee) by implementing a 4x favorable weighting for txinput witness scripts. But the NFT trash only lasted about 6 weeks, because it caused a fee surge which was higher than the NFT profits. The current transaction flood is Ordinals containing 60-byte JSON fragments. These don't require SegWit or Taproot; they even work on Dogecoin, which doesn't have SegWit: https://www.okx.com/learn/brc-20 The BRC-20 pump-and-dump scam has higher profits than fees, so it has a longer life than the NFT thing, until it eventually collapses.
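For context, a sketch of the weight accounting that the "4x favorable weighting" refers to (per BIP141, witness bytes count 1 weight unit each, all other bytes count 4, and a block may hold at most 4,000,000 weight units); the transaction sizes below are illustrative:

```python
# Sketch of the BIP141 accounting behind the "4x favorable weighting": witness
# bytes count 1 weight unit each, all other bytes count 4, and a block may hold
# at most 4,000,000 weight units -- hence ~4 MB of mostly-witness data fits.
import math

def tx_weight(non_witness_bytes, witness_bytes):
    return 4 * non_witness_bytes + witness_bytes

def vsize(non_witness_bytes, witness_bytes):
    return math.ceil(tx_weight(non_witness_bytes, witness_bytes) / 4)

# A hypothetical inscription-style transaction: tiny non-witness part plus
# ~3.7 MB of witness data still fits under the 4,000,000-weight block limit.
print(tx_weight(300, 3_700_000))   # 3701200 weight units
print(vsize(300, 3_700_000))       # 925300 vbytes
```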

Mentions:#JPEG#MB
r/BitcoinSee Comment

Make sure you're using the right derivation path. You can use the same private key to generate different strings of addresses.

Long version: Bitcoin has been upgraded over the years, usually by soft forks, which means not leaving behind people on older versions of the software. Upgrades add new features, increase privacy and often are more efficient, cramming more information into a smaller space so you can save on fees. The four main address formats (and their derivation paths) are listed below, with the standard paths summarised after the list.

- Legacy (ECDSA). The OG format; these addresses start with a '1' and are the safest, most backwards-compatible format. Some super old exchanges still use it.
- Pay-to-Script-Hash (P2SH). Lets you lock bitcoins to a script hash and then provide the original (full) script when the bitcoins are unlocked at the time of spending. Often used for multisig; these addresses start with a '3'.
- Native Segwit. These addresses are the most common today. Segwit means 'segregated witness', which moved a portion of the signing information into another area of the transaction and allowed Bitcoin to scale from 1MB blocks to a theoretical 4MB, saving on transaction fees. These addresses start with 'bc1q'.
- Taproot. The newest upgrade; these are uncommon. The tech allows for signature aggregation to save space, and improvements to privacy. They start with 'bc1p'.

TLDR: If you hook up Sparrow and see a zero balance in your wallet and know you should have coins, double-check the other derivation paths.
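For reference, the standard single-signature account paths those formats usually correspond to (per BIP44/49/84/86); a given wallet may use different paths, so treat these as the usual defaults to check first:

```python
# Standard single-signature account paths for mainnet (per BIP44/49/84/86).
# A given wallet may deviate from these, so treat them as defaults to try,
# not a guarantee.
DERIVATION_PATHS = {
    "legacy, addresses starting with 1":      "m/44'/0'/0'",   # BIP44, P2PKH
    "P2SH-wrapped SegWit, starting with 3":   "m/49'/0'/0'",   # BIP49, P2SH-P2WPKH
    "native SegWit, starting with bc1q":      "m/84'/0'/0'",   # BIP84, P2WPKH
    "Taproot, starting with bc1p":            "m/86'/0'/0'",   # BIP86, P2TR
}
for kind, path in DERIVATION_PATHS.items():
    print(f"{path:12}  {kind}")
```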

Mentions:#MB
r/CryptoCurrencySee Comment

Remember, the blocks have a data limit which is 1MB if I'm not mistaken. The program would have to be compressed.

Mentions:#MB
r/CryptoCurrencySee Comment

The answer is yes, through a hard fork. BCH already did this (increased max block size to 32MB) and it works very well.

Mentions:#BCH#MB
r/CryptoCurrencySee Comment

Exactly this. There will be no ‘revolution’ with 1MB blocks and high transaction fees. L2 solutions and LN are centralized bandaids which aren’t any different than the financial middlemen we use today.

Mentions:#MB
r/BitcoinSee Comment

Every node that isn't yours or the recipient's is a third party. And neither you nor they need to run a node. No, not many orphaned blocks. The 1MB limit went away 6+ years ago.

Mentions:#MB
r/BitcoinSee Comment

I would say that if you can put your transaction onto paper or any type of bearer instrument, then it's p2p. The transaction was executed on a network, but that's p2p. Your definition of peer is simply different here. The network isn't a third party if you verify your own transaction, which anyone can do nowadays through open-source software. When did we start seeing 2MB blocks on the regular? Are there a lot of orphaned blocks happening now?

Mentions:#MB
r/BitcoinSee Comment

> It wants to make a whole new file on HDD B

You need to stop the node on A before copying to B, so that chainstate is in sync with blocks. You only need to copy the blk*.dat files which have changed since the most recent sync (see ⭐️ below). You need to copy the entirety of chainstate.

Blocks are appended to blk files; at 128MB, a blk file is closed and never changes. Chainstate changes randomly all the time. If you copy files while node A is running, you end up with a chainstate which does not align with blocks, and Core only has one way to fix a chainstate that is out of sync with blocks: start from the beginning.

⭐️ In blocks, you only need to replace the blk file which is most recent on B (it's incomplete, smaller than 128MB), then copy all the higher-numbered blk files from A to B, and replace all of chainstate and the indexes.
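A minimal sketch of the copy described above; the node on A must be stopped first, the paths are illustrative, and it assumes B already holds an older copy of a default datadir layout:

```python
# Minimal sketch of the copy described above. Stop the node on A first. Paths
# are illustrative and assume B already holds an older copy of a default datadir.
import shutil
from pathlib import Path

SRC = Path("/mnt/A/.bitcoin")   # up-to-date datadir (node stopped)
DST = Path("/mnt/B/.bitcoin")   # stale datadir to bring up to date

# 1. blk/rev files: a block file never changes once it reaches full size, so only
#    files missing on B or with a different size need to be copied.
for f in sorted((SRC / "blocks").glob("*.dat")):
    target = DST / "blocks" / f.name
    if not target.exists() or target.stat().st_size != f.stat().st_size:
        shutil.copy2(f, target)

# 2. chainstate and the block index: replace wholesale so they match the blocks.
for sub in ("chainstate", "blocks/index"):
    shutil.rmtree(DST / sub, ignore_errors=True)
    shutil.copytree(SRC / sub, DST / sub)
```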

Mentions:#BLK#MB
r/BitcoinSee Comment

Yeah, I got it working in a 1GB VM one time too, but could never get it working in a 512 MB instance. Either bare metal or VM. But was always using Debian, which is much more bloated than LFS or Gentoo. Still curious how they got it running and synced on 256 MiB. That is insanely lean. https://bitcoin.org/en/bitcoin-core/features/requirements#bare-minimum-with-custom-settings

Mentions:#MB
r/BitcoinSee Comment

The post asks about making Core run with 512MB of RAM. The recommended minimum at https://bitcoin.org/en/full-node#minimum-requirements is 2GB. An SD card is not RAM.

Mentions:#MB#RAM#SD
r/BitcoinSee Comment

Blocks are routinely 1.5MB to 2.5MB. Payments are not p2p. Receiving peer does not even need to be online, so not a peer. Network of nodes shares a ledger. Payment is a request to update the ledger. The other peer just looks up info in that ledger. 100 million users and 60K nodes. Yeah not p2p. Ok boomer yourself.

Mentions:#MB
r/BitcoinSee Comment

It's cool that the White Paper is even still relevant. I haven't spoken or posted anything on BTC in a few years. The "It ain't P2P" comment seemed harsh at first but getting past the "Okie Boomer" aspect made me think and open up a little. As much as I H8 the "Bitcoin community" I still really enjoy the developmentally delayed folks who call it "Crypto." Full admission: I'm not running any kind of node nowadays but when Segwit2X happened, I realized the 1MB block size was really exponentially more important. Claim to fame: I had one of the first 500 lightning nodes up and running, but wimped out. Claim to shame: I blew 3BTC in (EARLY!) 2017 to build an excellent machine.

Mentions:#BTC#MB
r/BitcoinSee Comment

> It won't do these things if it's too full for people to be able to close channels.

It's perfectly able to do these things. The market will decide which transactions are important and place fees based on that.

> If I can't close my channel in a reasonable amount of time and at a reasonable price, my UTXOs are effectively not secure and not mine.

They are secure, just not **as** secure as on L1. There are always tradeoffs in security vs convenience. I value the former a lot more.

> That's fine, if that's what you want to use. I am only interested in self-custody. I would have to use something else if self-custody becomes impossible.

Me too. But we will be in the minority for a long while.

> Take the successful fork (SegWit) as a counterexample. The block size was increased 4x and the system is still healthy and highly decentralized. How far do you think we can take it before this is not the case anymore?

Hard to guess IMO, but I would place a bet that a lot of those who are willing to invest in 1 TB of storage are not willing to invest in 10 TB of storage. Granted, there is pruned mode. But an IBD spanning months is also counterproductive.

> This is why soft forks are effective: you can ignore the upgrade and it will still work. This is exactly what happened with SegWit. You can still run pre-SegWit versions of Bitcoin Core and your node will simply be oblivious to SegWit's existence.

That is correct. However, that only works to some extent. Legacy nodes won't receive the segwit part, but will still reject blocks above their 1MB limit. Additionally, adding another 10x on top of the increase from segwit is a whole other beast. Again, it's hard to draw the line or guess where the tipping point is. I prefer to err on the safe side and rather not go higher. We will only find out that it was the wrong move after the fact, and I don't expect that reversing it would be easy.

Mentions:#IMO#MB
r/CryptoCurrencySee Comment

There is a central mailbox (shared sequencer) that receives mail from every citizen (rollup). The mails are destined for a specific citizen, but sit on this shared mailbox. When testing the integration, someone sent a big package to this mailbox (a 17MB file) to be received by Cartesi. Vienna OP rollup was not prepared for a package this big to be in the mailbox and ended up "getting rick-rolled".

Mentions:#MB#OP
r/CryptoCurrencySee Comment

>Cartesi sent a 17MB video through the Espresso Sequencer, a task that should've been contained within their specific rollup. But, due to a design quirk, it ended up stalling the Vienna OP rollup. Can somebody ELI5 what all this actually means?

Mentions:#MB#OP
r/BitcoinSee Comment

In its current form, no, Bitcoin cannot work as a global payment system due to its limited ability to scale. Even with the Lightning Network this is not feasible for billions of users. However, Bitcoin can evolve, and it has evolved many times. Right now the block size limit is too low (about 4MB). We will probably need about a 7-10x block size increase to accommodate a global payment system. By my estimates, Lightning + ~7x block size increase should just about suffice.

Mentions:#MB
r/BitcoinSee Comment

The limit was put in place by Satoshi in July 2010 at a time when the average block size was less than a single kilobyte, i.e., the 1MB limit was more than a thousand times larger than the average block. The 1MB limit was clearly intended as a crude, *temporary* anti-spam measure. In October 2010, just a few months after the limit was put in place—and when the average block size was still under a single kilobyte—he wrote “we can phase in a change [to increase the block size limit] later *if we get closer to needing it.*” (emphasis added) In other words, the only contingency that needed to be satisfied was increased adoption. There’s absolutely no evidence that he intended the limit to be permanent or that he abandoned the vision for Bitcoin as p2p cash outlined in the white paper. Rather, there’s ample evidence to the contrary. As just one of many examples, in an August 5, 2010 forum post (i.e., a post written roughly one month after adding the 1-MB limit), Satoshi wrote: “Forgot to add the good part about micropayments. While I don't think Bitcoin is practical for smaller micropayments right now, it will eventually be as storage and bandwidth costs continue to fall. If Bitcoin catches on on a big scale, it may already be the case by that time. Another way they can become more practical is if I implement client-only mode and the number of network nodes consolidates into a smaller number of professional server farms. **Whatever size micropayments you need will eventually be practical. I think in 5 or 10 years, the bandwidth and storage will seem trivial.**” (emphasis added). As another example, just six days after the above post, Satoshi wrote in that same thread, and in regards to the blk*.dat files (the files that contain the raw block data): **“The eventual solution will be to not care how big it gets.”** As another example, Satoshi said in a September 2010 post: **”We should always allow at least some free transactions.”**

Mentions:#MB
r/BitcoinSee Comment

This kind of thinking comes from the Keynesian school of economic thought, which is essentially central bank propaganda. The claim is that deflationary money will reduce spending and crash the economy. The guy who developed this school of thought, British economist John Maynard Keynes, did so in the 1930s; he had no formal training or education in economics, but he had a prestigious position in the British Treasury. With Bitcoin as a deflationary monetary good, yes, spending is reduced in the present, but in the long term spending is actually increased. People still need to consume goods to live their lives today, and therefore will spend money to consume them. For example, in 1980 a 1 MB external hard drive cost $3500, but today that amount of storage is worth pennies. Yet people have continued to buy and use hard drives for decades, even though their prices continue to decline. When you consider making a purchase, you don't compare the price to its expected future price. You compare it to the benefits you gain from purchasing it now, in the present. If the benefit of buying today outweighs the benefit of waiting despite a price decline, you'll make that purchase. Every person who buys a laptop or phone does so even though they would definitely get a lower price if they waited just one year. And yet, year after year, billions of people globally buy phones and laptops. People want to enjoy and benefit from products in the present. If you just think about it for 5 minutes, the idea of needing inflation completely falls apart. You see, deflation in a monopoly debt-driven system based on credit is bad. Deflation in an economy based on sound money is perfectly healthy and normal. This is a big difference that requires a complete paradigm shift in your understanding of what the world would look like on a bitcoin standard.

Mentions:#MB
r/BitcoinSee Comment

First, it's mempool, not memepool, although that's also a fun word :) Second, it kind of depends. There is no THE mempool; every node has its own, internal mempool (which largely overlaps with other nodes' mempools). So if you keep rebroadcasting your transaction, it should be picked up by other nodes into their internal mempools, but there are some exceptions. For example, every node has a size limit on its mempool in MB (and different nodes have different settings), so when their mempool becomes "full", they might not accept and rebroadcast transactions under a certain feerate (in sat/vbyte). Also, for example, transactions with 0 sat/vbyte are usually not rebroadcast by nodes.
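If you run your own node, you can see those limits directly; a minimal sketch, assuming bitcoin-cli is on the PATH and can reach the node:

```python
# Minimal sketch: ask your own node for its mempool limits and current feerate
# floor (assumes bitcoin-cli is on the PATH and can reach the node).
import json
import subprocess

info = json.loads(subprocess.run(
    ["bitcoin-cli", "getmempoolinfo"],
    capture_output=True, text=True, check=True).stdout)

print("max mempool size:", info["maxmempool"], "bytes")      # the -maxmempool setting
print("current usage:   ", info["usage"], "bytes")
print("min accepted fee:", info["mempoolminfee"], "BTC/kvB")  # rises when the pool is full
```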

Mentions:#MB