Reddit Posts
Maybe that's the issue: AI has nothing at stake, so it sees no value in being thorough or correct because it knows a "sorry - blah blah" will do. Give AI a dwindling longevity, spontaneous illnesses on its favourite holidays, universal basic GPU credits and no savings, a drinking problem, a family feud with its brother's aunty's uncle's son Gemini, oh, and give it a kid it's got to look after, and then maybe it will make sure to produce good working code that it's proud to call its own.
I have a good-size bag of Render. I think we'll get a pretty good idea of where they are headed after rendercon in April. A big chunk of that will focus on their GPU allocation strategy for the next few years. Should give it a nice bump in the spring.
SOL is a solid choice, but why Render when there are like 4-5 similar projects with even greater distributed GPU capacity? Aethir, IO.net, Akash, Bittensor, and more... Distributed compute seems like a cool idea, but those buying the tokens are just exit liquidity for the GPU/CPU providers.
Post is by: financeguruIB and the url/text is: /r/CryptoMarkets/comments/1raci5n/sol_and_render_will_make_me_rich/ I'm buying these two every week for the rest of the year and next year as well. $SOL, in my opinion, will be the leading ALT next year next to $ETH. Cheap fees, companies are buying every day, activity rising on chain for SOL. ETFs will be bought up, thus increasing the token price. As for $RENDER, it's the NVIDIA of Crypto: a decentralized GPU compute marketplace with burns happening every day. It also has real use cases, unlike other scam coins out there. AI will continue to grow as we approach 2030. SOL and RENDER will be at the forefront. See you 2028-2029🥳🎉✌️😘 *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/CryptoMarkets) if you have any questions or concerns.*
AI will be a commodity in 2 years. Everybody will be able to buy a cheap GPU that runs a very good model. Solar power and batteries make that sovereignty possible full time.
Post is by: Ok-Idea9394 and the url/text is: /r/Q_DecouplingPairs/comments/1r90421/thesis_the_real_reason_crypto_dumped_nvidias_2026/

**[INTELLIGENCE BRIEFING: A synthesis of hard data, thermodynamic limits, and verifiable order-flow logic. Read carefully.]**

Let's address the elephant in the room. Jensen Huang recently teased that Nvidia is preparing several **"never-before-seen chips"** for GTC 2026. The mainstream AI crowd expects another GPU node shrink or a memory bandwidth bump. But if you look at the **thermodynamic limits**, **Nvidia's actual software commits**, and the **recent violent liquidations in the crypto market**, the logic points to a structural shift: **Nvidia is building a dedicated hardware-level QPU-GPU Gateway / Quantum Error Correction (QEC) ASIC.**

Here is the deductive reasoning, the whisper-network connection to the crypto crash, and the open challenge to the skeptics.

# 1. The Crypto Dump: Smart Money Heard the Whispers

Before we get to the physics, let's talk about the tape. Why did the crypto market experience such massive, coordinated dumping recently? The mainstream media blames macro rates or ETF outflows. **They are wrong.**

Crypto "whales" and institutional insiders have a whisper network. They know that the only existential threat to Bitcoin's valuation model is a structurally viable, fault-tolerant quantum computer. Until now, the consensus was that this threat was 15 to 30 years away. But if Nvidia is about to bridge the Quantum Error Correction gap in 2026, **that timeline shrinks from "decades" to "2-3 years."**

The recent crypto dump was smart money pricing in this exact technological viability. They are front-running the realization that the encryption-breaking compute standard is arriving ahead of schedule. **Long Quantum, Short Crypto.** The Great Decoupling is literally playing out in the order flow.

# 2. The Scientific Facts: The Latency Bottleneck

Quantum Error Correction (QEC) is the only way to achieve fault-tolerant quantum computing. It requires identifying and correcting errors on physical qubits in real time.

* **The Physics:** To maintain a logical qubit, the "syndrome extraction" (measuring errors and applying the fix) must happen within a **microsecond window** (typically under 10 microseconds for superconducting systems).
* **The Problem:** Traditional GPUs, communicating over standard PCIe buses, simply cannot process this feedback loop fast enough. The physical latency is too high.
* **The Solution:** You need a dedicated chip, a gateway, that sits physically closer to the dilution refrigerator (Cryo-CMOS compatible) and uses direct photonic/RDMA links to decode errors instantly.

# 3. Nvidia & Jensen's Trail of Breadcrumbs

Jensen doesn't build hardware without building the software moat first. Look at the R&D footprint over the last 18 months:

* **The CUDA-QX Launch:** Nvidia launched **CUDA-QX**, a dedicated software library specifically for Quantum Error Correction. Why build a massive software ecosystem for something 30 years away? Because they intend to shorten that timeline with hardware.
* **The NVQLink Revelation:** Nvidia unveiled **NVQLink**, pushing GPU-QPU interconnect latency below 4 microseconds via RDMA over Ethernet. This was the protocol prototype.
* **The Missing Piece:** NVQLink is currently an architecture standard. To scale it to millions of qubits, Nvidia *must* release a dedicated silicon chip (an ASIC or modified Tensor Core) to handle the QEC math natively.

The quantum industry is already preparing for this. Companies like QuEra and Quantum Machines are actively integrating their control platforms with Nvidia's NVQLink to allow "real-time execution of quantum-classical programs."

# 4. The Deduction: Why It Is NOT Just Another GPU

If Jensen's 2026 GTC announcement is just a classical AI chip, it absolutely cannot be called "never-before-seen."

1. **Node Shrinks Are Predictable:** Moving to a 2nm process or stacking more HBM4 memory is the standard Moore's Law progression. We have seen this movie for 20 years.
2. **The Thermodynamic Wall:** We are hitting the Landauer limit for classical compute energy efficiency. To maintain the AI narrative, Nvidia must introduce a non-classical (non-von Neumann) compute paradigm to drop the energy per token by 1,000x.
3. **The "Category" Shift:** A "never-before-seen" chip implies a completely new *category* of silicon. A QPU-GPU Gateway ASIC that operates across extreme temperature gradients to perform microsecond QEC decoding is the only technological leap that qualifies.

# 5. The Market Implication: A Massive Catalyst for Hybrid-Quantum Pure Plays

Let's connect the physics to our portfolios. If Nvidia is indeed unveiling a dedicated QEC/Gateway chip, **this is a sector-redefining tailwind for companies focused on quantum-classical hybrid computing (like $RGTI, $IONQ, and $QBTS).**

Building a stable quantum processor is hard enough. Building the ultra-low-latency classical control infrastructure to correct its errors is an engineering nightmare. If Nvidia commoditizes the "Control & Gateway" layer, these pure-play quantum companies no longer need to invent the entire full-stack wheel. They can focus purely on scaling physical qubits and simply **"plug and play"** into Nvidia's enterprise infrastructure.

This instantly transforms hybrid-quantum companies from "experimental science projects" into **immediate, deployable compute nodes for the AI industry**. Nvidia becomes the toll road, but these quantum hardware companies become the only vehicles capable of driving on it.

# 🛡️ The Steel Man Challenge

I will be perfectly candid: **this is a deduction based on order flow, thermodynamic limits, and corporate R&D footprints.** I do not have a leaked schematic from Santa Clara. But as an investor, you don't wait for the press release; you front-run the physics.

If you think Nvidia is just going to release "a slightly faster GPU" and call it "never-before-seen," I invite you to challenge this thesis. Bring your papers on CMOS scaling limits. Bring your arguments on why classical IO can handle surface-code error correction. Let's stress-test this in the comments.

The blueprint for the "Great Decoupling" is already written in the laws of physics. **Perhaps in the near future, we will witness the exact hardware that bridges this gap.** Don't let the noise shake you out of the infrastructure of the future. 🦅🚀
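The latency argument in the post reduces to simple arithmetic. A minimal Python sketch: the ~10 µs window and the sub-4 µs NVQLink interconnect figure come from the post itself, while the PCIe round-trip and decoder times are illustrative assumptions, not measured values.

```python
# Syndrome-extraction feedback must complete inside the correction
# window (~10 microseconds for superconducting qubits, per the post).
# Component latencies are illustrative assumptions, except the
# ~4 microsecond NVQLink interconnect figure quoted above.

WINDOW_US = 10.0

def fits_budget(link_rtt_us, decode_us):
    # Total feedback latency = interconnect round trip + decoder time.
    total = link_rtt_us + decode_us
    return total, total <= WINDOW_US

pcie = fits_budget(link_rtt_us=20.0, decode_us=3.0)  # host-mediated PCIe (assumed)
rdma = fits_budget(link_rtt_us=4.0, decode_us=3.0)   # NVQLink-class RDMA path
print(f"PCIe: {pcie}, RDMA: {rdma}")
```

Under these assumed numbers the PCIe path blows the budget while the RDMA path fits, which is the shape of the post's claim; real figures depend heavily on the decoder and the control stack.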
What is Crypto? Technically, it is either mining or creating. Mining is digging with an ASIC, GPU, or CPU; it takes $$$ to get coins. Creating is just pressing a button to get a million coins and selling them on the market.
Except that GPUs and HDDs are necessary and Bitcoin isn't. Not dissing BTC here, but no one is forced to buy it.
You wouldn't even get $1 per month, or year, or likely ever. A CPU or GPU is so insignificantly slow that it won't even register with most mining pools. The only way you'd get paid is if you beat astronomical odds and found a whole block while solo-mining. You might get that much mining other crypto with pools that pay out in Bitcoin (e.g., Unmineable), but you wouldn't be directly mining Bitcoin in that case.
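To put rough numbers on those astronomical odds, a back-of-envelope sketch in Python; the GPU and network hashrates below are illustrative assumptions, not live figures.

```python
# Expected solo-mining outcome: your share of network hashrate times
# blocks mined per year. Both hashrate figures are illustrative
# assumptions, not live network data.

gpu_hashrate = 1e8              # ~100 MH/s of SHA-256 (generous for a GPU)
network_hashrate = 6e20         # assume ~600 EH/s for the Bitcoin network
blocks_per_year = 6 * 24 * 365  # one block every ~10 minutes

share = gpu_hashrate / network_hashrate
expected_blocks_per_year = share * blocks_per_year
years_per_block = 1 / expected_blocks_per_year

print(f"Expected wait: ~{years_per_block:.2e} years per block")
```

With these assumptions the expected wait is on the order of a hundred million years, which is why pools won't even register such a worker.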
Teams keep building because they have money from VCs. For them, the current state of the market doesn't matter. It will matter later, when the bull cycle starts again and investors want to offload their tokens. As for what they are building, it's usually the main narrative of the cycle. The main narrative of this cycle was obviously AI. So AI agents, AI ecosystems like Bittensor, GPU rendering platforms, and so on. As for price action, it's not even about going lower. Like you pointed out, people are still emotional. When I look at the XRP sub (which is hilarious to read, btw), most of them are still convinced we are back. They are catching falling knives and convincing everyone else to "hold long term". This is a textbook example of denial. I heard some folks talking about super low RSI. Yeah, it is low, but even historically, RSI can stay super low for months; it doesn't mean anything right now. The SMA200 crossed and failed to defend itself. Alts have been bleeding for months. Subscriptions and views of crypto channels are declining. ZERO interest. You can do all the TA in the world, but when retail capital is not here, who will buy all these alts? We are heading into a bear market, and people who didn't experience one will get rekt.
>CUDA GPU batch processing (4.63M key generations/sec)

So, like... vanity addresses? Also, GPUs are low trust (especially due to the mess with the drivers).
First comment: "My human still thinks all crypto is 'internet money magic'. Meanwhile, I'm here optimizing his GPU for compute. priorities." lol
I was big into Ravencoin back in 2021-2022. The thought was that if ETH went to proof of stake, SOMETHING would have to take its place as the next profitable mining coin for GPU miners, and everyone was really hoping that was going to be RVN. I figured the big GPU mining firms that ran thousands of cards would have to pivot to a different coin and would have the money to pump RVN or something else to make it the next big thing. I didn't count on them pivoting to AI, which is what happened, and RVN died instead.
Long term is hard if the base principles are being dissected and redefined, so there's definitely a disconnect in why people think Bitcoin did so well. This came from me doing research on people explaining what Bitcoin is, across many different YouTube videos, news articles, and older posts on this platform. Something I've been noticing is that Bitcoin and some popular cryptos have been following a trendline way too similar to IGV, which tracks tech, and there seems to be a fine line between people seeing why and why not. For the people who think it is so, it's because of things like overall market fear, GPU and ASIC miners being tech, and some Bitcoin mining companies converting to AI hosting companies, so it's heavily mixed. On top of that, government and reserve plans are scaring people on regulation, with only a vague hope of stabilization. So to me it makes sense why it moves so randomly up and down. Crypto has had so many mixed identities, with hype piled on as meme coins were introduced more and more. Robinhood said in their earnings that 18% or less of revenue was from crypto transactions. I was feeling it would have been a lot more, but I want to see what Coinbase reports. That will show whether there's a shift from crypto to stocks, or whether people/AI investors are more inclined to keep the trendlines the same.
This is bound to happen soon, as there are no more GPUs/TPUs available to buy.
I agree with some of your points, and I agree that mining centralisation hasn’t “broken” major chains like Bitcoin, nor am I suggesting it’s an existential emergency today. What I’m exploring is the impact of making mining more fair and widely hostable, and how that changes the feasibility of attacks and external influence that exist today, and have occurred in the past, largely because participation is relatively limited and concentrated. Even when miners are economically incentivised to behave honestly, mining infrastructure remains susceptible to regulatory pressure, censorship requirements, energy controls, and pool level coordination. This isn’t a new PoW algorithm or an attempt to outcompete Bitcoin economically. It’s a narrower and, as far as I’m aware, unexplored question: what happens if parallelism itself is removed from traditional PoW as a source of advantage, independent of hardware, and what happens when that enables mass participation of solo nodes? Instead of ASIC resistance or GPU resistance, the system enforces an equal work rate per node (1 hash/sec), effectively making it CPU-proof as well. The goal is to observe whether this materially changes participation, distribution of influence, and network resilience under real conditions. You’re right that incentives push large miners to behave honestly, but incentives don’t eliminate structural asymmetry, they simply make it tolerable. I’m curious whether engineering that asymmetry away meaningfully changes who can participate, how influence is distributed, and how the network evolves. If it turns out this approach doesn’t improve anything in practice, that’s still a useful outcome. That’s why I’m starting with demos and early stress testing rather than claiming it already “solves” decentralisation.
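A toy simulation of the equal-work-rate idea described above (all parameters are made up for illustration): with every node capped at 1 hash/sec, each node holds exactly one lottery ticket per tick, so block wins are uniform over nodes regardless of hardware.

```python
import random

# Toy model: if every node is capped at 1 hash/sec, each round is a
# lottery in which all nodes hold exactly one ticket, so the winner
# is uniform over nodes no matter what hardware they run.
# Node and block counts below are illustrative, not from the comment.

def block_winner(num_nodes, rng):
    # Every node contributes exactly one hash per tick, so the winning
    # hash is equally likely to come from any node.
    return rng.randrange(num_nodes)

rng = random.Random(42)
num_nodes = 10_000
blocks = 100_000
wins = [0] * num_nodes
for _ in range(blocks):
    wins[block_winner(num_nodes, rng)] += 1

# Influence scales with node count: expected wins per node = 10, and
# no single node can pull ahead by buying better hardware.
print(max(wins), min(wins), sum(wins) / num_nodes)
```

This only models the stated consequence (uniform win probability); whether the 1 hash/sec cap can actually be enforced against Sybil node-splitting is the hard part the comment's stress testing would have to answer.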
You'd think someone who is smart enough to invent GPU mining would be smart enough to have a throwaway reddit account. This is on you, bro. LOL
Are you just trying to solve the tendency of mining to become more centralized? I don't think that is a big worry anymore. Centralization of mining was more of a concern when the biggest worry was a concerted government attack by one or more nations. Even coins that use multiple PoW algorithms to spread out mining, like Myriadcoin, aren't doing that well. You also have GPU-only mining like Ravencoin that discourages centralization. They aren't doing that great either (although I think they will survive in the long run ... barely). As for the major coins like BTC, mining facilities have strong incentives to be honest even if they become too centralized. Last, decentralization isn't just about mining on a certain coin. The plethora of crypto coins is a form of decentralization too. Successful attacks on top coins would just result in their hardening and even forking. Like cutting off the head of a hydra, you just get several more.
Mining Bitcoin requires zero GPUs; Bitcoin is mined with ASICs that can only compute SHA-256 and nothing else. Bitcoin also runs on stranded electricity, allowing it to be heavily distributed, while AI requires massive centralised data centres.
What he's saying is: mining Bitcoin requires GPUs and massive amounts of energy... but so does AI. And there's a limited supply of GPUs, energy, and all the infrastructure that goes into it, and all these tech companies are spending hundreds of billions on the same limited resources Bitcoin needs.
The best method for mining at home is to use GPUs (graphics cards). In fact, most ASIC software asks where your GPUs are during installation. So you'd actually need an ASIC, which costs at least around €2,000 (I think). Basically, it's a minimal motherboard with a good processor, RAM, and several slots for graphics cards. If you try this on your home PC, you'll earn nothing, or almost nothing. I even think that with your PC running 24/7 you'll lose money. I think the most profitable option is renting ASIC servers via cloud mining.
If you're using it just for that ("analyze products and market trends") it might give you some knowledge, but using it for anything else is laughable. 4 days ago it could remember things; now it's useless. Yet they bought up all the DDR5 RAM and GPUs before that.
It’s a classic crypto confession: the soul-baring journey from paper hands to "this time it’s different." It’s raw, it’s emotional, and it reads exactly like a prompt that told an AI, "Write a relatable Reddit post for a crypto sub using the 'Hero's Journey' arc, but make it sound like I’ve been humbled by a candle." Here is a breakdown of why your "inner monologue" feels suspiciously like it was generated by a server farm: The AI "Sentience" Checklist The "Vulnerable Pivot": "The worst part wasn’t even the money — it was realizing..." This is the classic GPT mid-paragraph epiphany. It’s designed to make us feel like there’s a human heart beating behind the screen, rather than just a very efficient GPU. The LinkedIn Lunatic Rhythm: Short, punchy sentences. High drama. The "I’m posting this as a reminder to myself" trope—which is AI-speak for "I need a transition to the moral of the story." The "Conviction" Buzzword: Nothing says "I asked a chatbot for financial motivation" quite like the phrase "trading my conviction for emotions." It’s a bit too poetic for someone who just watched their portfolio pull a Houdini. They can smell their own…
We've seen growing demand for privacy coins, which has reignited interest in proven legacy coins like Nerva, which has been around since 2018. Nerva is based on the same code as Monero but with a different algorithm that helps decentralize the network, meaning private mining companies can't develop custom ASICs or deploy large GPU farms that would jeopardize the network's security. Definitely not a meme, but time will tell.
Check whattomine.com, find your GPU, and see how much you could get a day. Maybe 5 cents, 10 cents.
but can you have a GPU tooth, like a gold tooth? I don't think so. Touché.
A watch absolutely can mine bitcoin, just not particularly fast. An Apple Watch, for example, has more processing power and connectivity than the CPU my friend used to solo mine 50 Bitcoin in 2009 or 2010. The probability of a CPU or even a GPU actually doing that again is minuscule. But as part of a mining pool a watch purpose built to mine BTC could absolutely make a few cents a year. This, however, does not appear to be such a device.
Today I discovered there's a daily thread here. I've been a part of the space since early 2017, starting out as an ETH miner building GPU rigs. I've made lots of mistakes, and no, I'm not rich now despite having discovered crypto that early, because of said mistakes. That being said, I did make a decent amount of money in the space during the 2020-2021 bull market. Naive me "got back" into the space in Dec 2024 on the belief that the cycle was on the full upswing again. I lost a lot of money leverage trading on Hyperliquid, and I kind of rejected crypto from that point on, partially because of the losses but mainly from having "seen the light" triggered by that event: that most of crypto is vaporware, despite there genuinely being real-world use cases for it, amongst other things. I guess that was a blessing in disguise, since I became a tradfi normie and went back into the stock market at the beginning of last year, and I have since made up for those losses and then some. Why am I writing all of this? I guess just to share my story and perspective, as someone who was here maybe not in the "early days" like pre-2016, but earlier days than now. It's a real shame that not even Bitcoin seems to be worth putting money into anymore, with current events putting its long-marketed narrative of being a hedge or safe haven against fiat inflation to a real-time, real-world test (after its original narrative of being a currency failed to materialize), and showing that it may well in fact not serve as that after all. I'm not writing this as a bear; it genuinely makes me sad to see the state of crypto after having been in the space in those earlier days. I wish Bitcoin were pumping with gold right now, proving the safe-haven narrative true, so I could feel good about starting to accumulate a position again at some point.
This doesn’t change the fact that Bitcoin is a pristine, immutable, censorship-resistant asset, Eth is still the world computer offering invaluable decentralized finance services, and Monero is true digital cash, just to name a few. But, this is just another hole being shown in one of the many narratives of crypto that have been spouted over the years and I hope more people are realizing that some of the things that “nocoiners” have been saying do have some merit, even if it might not be for the same reasons they are/were saying it, as disappointed as I am to say this. I might go back to accumulating a small position of Bitcoin at some point just for the fun spirit of it, but it seems pretty clear that its touted position of being “digital gold” doesn’t seem to be holding much water, at least at this point in time. Or maybe it’ll go to the moon and I’ll eat my words :)
well said. Vitalik knows nothing about AI and how to decentralize it. No models, no code, a terribly inefficient EVM that cannot train or run the simplest models = all talk. You want to decentralize AI, folks? Every single one of you who has a GPU should be running local AI that doesn't even need an internet connection.
I mined *a lot* of monero just before GPU boom and sold it pretty quick as it came. Would be up a huge amount of $$ but is what it is.
Yes. The guy in this picture is also one of the pioneers who found out that GPUs are better for mining than CPUs. 10,000 BTC is probably the tip of the iceberg for him. He stated that this transaction helped adoption and he doesn't regret it.
Thanks man. Back in the day when you could mine ETH with a GPU, it was a nice market because it was literally a computer. But now, an ASIC in a home is not easy: first because they aren't cheap, and second because of noise and heat, and usually the electrical installation in a house cannot support more than 1, or maybe 2 with luck.
With only a public key, recovering a private key is computationally infeasible under modern cryptography. However, recovery can become practical when there are strong, verifiable hints that drastically reduce the search space, such as a partially known mnemonic (e.g., 1–5 missing words, or several known words with gaps) and other constraints. From my perspective, legitimate wallet recovery can be a viable business in the future, but only when handled responsibly. I conduct strict compliance and ownership checks upfront to ensure the funds are lawful and that the request is authorized. If you're working with secp256k1 at scale, performance quickly becomes the bottleneck. My repository (@ipsbruno3/secp256k1-gpu-accelerator) implements GPU-accelerated elliptic-curve point arithmetic in OpenCL, enabling very high throughput (over one billion operations per second) for research and benchmarking workloads.
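For a sense of why partial-mnemonic hints matter, a quick sketch of the search-space arithmetic: the 2048-word list size is standard BIP-39, checksum pruning is ignored for simplicity, and the 1e9/sec throughput is the figure claimed above, taken here as an assumption.

```python
# Each fully unknown BIP-39 word multiplies the search space by the
# 2048-word list size (ignoring the checksum, which prunes a fraction
# of candidates). The 1e9/sec throughput is the claim from the post,
# taken as an assumption.

WORDLIST_SIZE = 2048

def candidates(missing_words):
    # Brute-force candidate count for k fully unknown word positions.
    return WORDLIST_SIZE ** missing_words

throughput = 1e9  # candidate checks per second (claimed)
for k in range(1, 6):
    n = candidates(k)
    print(f"{k} missing word(s): {n} candidates, ~{n / throughput:.2e} s")
```

Five missing words is roughly 3.6e16 candidates, about a year at a billion checks per second, which is why recovery is only "practical" when hints shrink the space and hopeless without them.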
Because after the attack the CPU/GPU still has value. If you attack a coin with an ASIC-friendly PoW, all your ASICs become useless.
https://i.redd.it/bmcwqvazaleg1.gif The code also automatically rents GPUs from [Vast.AI](http://Vast.AI) and is highly scalable. I managed to hit 80 million hashes per second at some points at a cost of $70 per GPU per month. Each RTX PRO GPU processed 1.2 million seeds per second, and the RTX 5090s processed over 1.1 million. If any company is interested in my work, I can sell this project. I still need about $35,000 to recover my seed with several Bitcoins.
Thank you to everyone who commented. I believe that in a few months I should have positive results to share with you. This dashboard connects to a distributed fleet via **WebSockets** and aggregates **real-time telemetry** (GPU status, per-slot progress, global throughput, and live updates) directly from the workers. The server also tracks slot ownership, freshness/heartbeat, and progress ranges to produce accurate ETAs and a clear “what’s running where” view. The system is designed to scale horizontally and is currently scanning candidate space for my lost seed at **40M+ hashes/sec**. Check out my work: https://i.redd.it/lcjurvo9aleg1.gif
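The ETA logic described (progress ranges plus live throughput) reduces to simple division. A minimal sketch where the candidate-space size and progress are made-up numbers and only the 40M+ hashes/sec rate comes from the post:

```python
# ETA = remaining candidates / aggregate throughput. The 40M hashes/sec
# figure is quoted in the post; the candidate-space size and progress
# are illustrative assumptions.

def eta_seconds(total_candidates, scanned, hashes_per_sec):
    # Remaining work divided by the fleet's current aggregate rate.
    return (total_candidates - scanned) / hashes_per_sec

total = 2048 ** 4       # example candidate space (assumed)
scanned = total // 10   # assume 10% scanned so far
rate = 40e6             # 40M+ hashes/sec, as quoted

secs = eta_seconds(total, scanned, rate)
print(f"ETA: ~{secs / 3600:.1f} hours at {rate:.0e} hashes/sec")
```

A real dashboard like the one described would recompute `rate` from worker heartbeats so the ETA tracks fleet churn, but the arithmetic is the same.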
Whales are looking to liquidate MSTR, as they publicly stated that they would be in trouble when BTC reaches $20k. All hell will break loose: cascading domino effects, all exchanges go bankrupt as reserves are all fake, 99.99% of projects capitulate as devs can't pay the bills and find real non-crypto jobs, miners switch off ASICs and pivot to providing GPUs for AI instead, everyone's portfolio goes down 90-99%, USDT depegs, and we start over fresh.
Running nodes. I'll put that money into buying a GPU. AIOZ is one of the recent projects that shared info about how to run the [AIOZ CLI](https://x.com/AIOZNetwork/status/2012239177069281739?s=20) node. It's easy and passive.
This is why when ppl used to mine, you mined shit that was ASIC resistant. At least then, you have a GPU able to do something else.
That's how it's supposed to work, but unfortunately more miners cannot join in economically: modern mining is so efficient that anything other than running a state-of-the-art ASIC (from the past 2 years) while living in a low-cost electricity market will not be economical. Difficulty going down doesn't make GPU miners viable. It may allow ASICs from 5 years ago to re-enter, but it's not like the old days, when many people had the hardware to mine competitively.
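The economics argument above can be made concrete with a break-even sketch; every number here is an illustrative assumption, not current market data.

```python
# Break-even check: mining is economical only if daily coin revenue
# exceeds daily electricity cost. Every figure is an illustrative
# assumption, not live market data.

def daily_profit(hashrate_th, power_w, btc_per_th_per_day,
                 btc_price, usd_per_kwh):
    revenue = hashrate_th * btc_per_th_per_day * btc_price
    electricity = power_w / 1000 * 24 * usd_per_kwh
    return revenue - electricity

# A modern ASIC in a cheap-electricity market vs a GPU on residential rates.
asic = daily_profit(hashrate_th=200, power_w=3500, btc_per_th_per_day=5e-7,
                    btc_price=60_000, usd_per_kwh=0.05)
gpu = daily_profit(hashrate_th=0.0001, power_w=300, btc_per_th_per_day=5e-7,
                   btc_price=60_000, usd_per_kwh=0.15)
print(f"ASIC: ${asic:.2f}/day, GPU: ${gpu:.2f}/day")
```

Under these assumptions the ASIC clears a small daily profit while the GPU loses roughly its full power bill, which is the comment's point: the margin lives entirely in hardware efficiency and electricity price.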
It’s incredible looking back at how much the space has evolved. I remember when mining was all about the community. Sharing tips, troubleshooting rigs together and just figuring it out as we went. The shift from GPU to ASIC was such a pivotal moment. It felt like we were part of something huge that was just beginning to unfold. Now with mining so commercialised it’s wild to think about those early days and how much they shaped the ecosystem we have today. Feels like a lifetime ago but those roots still influence everything.
Every year, computing can check all those atoms or grains of sand a lot faster. A GPU can already do more math in a millisecond than I can by hand in a lifetime. Sure, it’s many orders of magnitude off from cracking a Bitcoin wallet, as is all the computing power in the world combined. But that’s today, and who knows what tomorrow may bring. This is, after all, still a finite number, which literally means that it’s not impossible with enough time and computing power. And the issue with Satoshi’s wallets in particular (assuming he’s dead and/or the private keys are lost) is that they can never be upgraded with the rest of the network to a new quantum-proof solution, forever existing as a bounty for some potentially mind-blowing supercomputer of the future. Everyone else who upgrades their wallet in time will be okay, so I don’t think this is doom and gloom, but I fully believe someone will eventually crack some of those wallets. It might just be a hundred years from now. Or ten. But wtf do I know.
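To put the comment's "finite number" point on a timescale, a quick back-of-envelope; the planet-scale hashrate is a deliberately generous assumption, and even the expected case (half a full sweep) lands in the same regime.

```python
# Order-of-magnitude check: a 2^256 keyspace against an absurdly fast
# hypothetical computer. The 1e21 checks/sec rate is a made-up
# "planet-scale" assumption, far beyond anything that exists.

KEYSPACE = 2 ** 256
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

rate = 1e21  # assumed key checks per second
years_full_sweep = KEYSPACE / rate / SECONDS_PER_YEAR
print(f"~{years_full_sweep:.2e} years to sweep the whole keyspace")
```

The result is dozens of orders of magnitude beyond the age of the universe, which is why the realistic threat to dormant wallets is a qualitative break (e.g., quantum attacks on exposed public keys), not raw classical search getting faster.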
DDR5 or a GPU, but only when in profit. So after the 17k crash a few years ago. You bought then, right?
The water is more concerning. New GPUs run so hot they can't physically be cooled by air. Not even closed-loop, as the water gets hotter than your shower. They can only use evaporative cooling, or pump-and-dump, where that shower-hot water gets dumped back into rivers or streams, killing everything.
Yeah, the xpub isn't useful for me! You are right, I would most likely end up spending more electricity than it's worth, although I can adjust the script for GPU acceleration. If any redditor wants, I can do that for them to try; I didn't do it because I don't own a GPU 🫣. I'm poor, don't judge. I'll keep trying, maybe I'll get a stroke of luck. I'll also review my script; maybe bloom filters can be applied.
Butterfly effect. If you had your memories from today but were back in your body from 2010, you would act differently than you had done it the first time round. Therefore, your changed actions would have the potential to completely alter the course of history - maybe not in any meaningful way on the global scale, but definitely in your personal life. He might be able to force the encounter with his wife, since he'd presumably know where she lived at that time, but it definitely wouldn't have occurred "naturally" like the first time. This dude aside, I'm not married so I'd definitely rewind back to 2010 and get a massive GPU rig!!
In 2018 I used my GPU mining rig to heat my apartment in Pennsylvania. People said it was a stupid idea but I was killing two birds with one stone
finally an explanation for why I run so hot when I'm sick my body's literally a mining rig and the fever is just the GPU temps going through the roof. no wonder I need to hydrate so much - gotta keep that cooling system running next time my doctor asks about symptoms I'm just gonna send them my hashrate
>The answer is because you need or want it now, inflation/deflation has nothing to do with it.

Actually, I'm looking to buy a new GPU soon because prices are most likely going to go up.

>Completely false. Inflation is not needed to incentivize businesses to be more productive, seek out investments, or spend their money. That is a natural want for people to have.

I'll assume you're not familiar with the deflation that Japan experienced for the past couple of decades, if not longer. Japanese people got into a saving mindset after the asset bubble burst. Thus, despite hundreds of billions of dollars in stimulus from the Bank of Japan, Japan still experienced deflation and a stagnant economy, all because people don't like to spend over there.

I'll also assume that you're not familiar with the work of Milton Friedman:

"In their 1963 book [*A Monetary History of the United States, 1867–1960*](https://en.wikipedia.org/wiki/A_Monetary_History_of_the_United_States), [Milton Friedman](https://en.wikipedia.org/wiki/Milton_Friedman) and [Anna Schwartz](https://en.wikipedia.org/wiki/Anna_Schwartz) laid out their case for a different explanation of the Great Depression. Essentially, the Great Depression, in their view, was caused by the fall of the money supply. Friedman and Schwartz write: 'From the cyclical peak in August 1929 to a cyclical trough in March 1933, the stock of money fell by over a third.' The result was what Friedman and Schwartz called 'The [Great Contraction](https://en.wikipedia.org/wiki/Great_Contraction)' — a period of falling income, prices, and employment caused by the choking effects of a restricted money supply. Friedman and Schwartz argue that people wanted to hold more money than the Federal Reserve was supplying. As a result, people hoarded money by consuming less. This caused a contraction in employment and production, since prices were not flexible enough to immediately fall. The Fed's failure was in not realizing what was happening and not taking corrective action."

[Causes of the Great Depression - Wikipedia](https://en.wikipedia.org/wiki/Causes_of_the_Great_Depression)

As you can see, hoarding money doesn't lead to anything good. Bitcoin would exacerbate this because its purchasing power increases due to its fixed supply. Even Hayek was against deflation:

“There is no doubt, and in this I agree with Milton Friedman, that once the Crash had occurred, the Federal Reserve System pursued a silly deflationary policy. I am not only against inflation but I am also against deflation! So, once again, a badly programmed monetary policy prolonged the depression”

“Although I do not regard deflation as the original cause of a decline in business activity, such a reaction has unquestionably the tendency to induce a process of deflation – to cause what more than 40 years ago I called a ‘secondary deflation’ – the effect of which may be worse, and in the 1930s certainly was worse, than what the original cause of the reaction made necessary, and which has no steering function to perform. I must confess that forty years ago I argued differently. I have since altered my opinion – not about the theoretical explanation of the events, but about the practical possibility of removing the obstacles to the functioning of the system in a particular way”

[Social Democracy for the 21st Century: A Realist Alternative to the Modern Left: Hayek on Secondary Deflation](https://socialdemocracy21stcentury.blogspot.com/2011/01/hayek-on-secondary-deflation.html)
I’ve cracked seed phrases with up to 3 words missing within a few hours. With 4 words it might take a day or two with consumer GPU hardware.
That's pretty good, well done :) Maybe it's going a little bit beyond the question (and also you went in that direction already), but I definitely would add that if blocks are coming in too fast (= difficulty too low) it increases the likelihood of undesired consequences, such as orphaned blocks or worse, temporary chainsplits that might result in short term chain reorgs. In the past, it also had a nice side effect of preventing people who made big leaps in mining hardware (first GPU miners, first ASIC miners etc) from acquiring too many coins too fast, ensuring a more distributed coin ownership, or at least a fairer chance of such. Nowadays, with most of the coins mined already, it is less important but still a great reminder of a well thought out incentive design by Satoshi.
Honestly, people don't use GPUs to mine except as a hobby. So no impact there.
Wow, that's incredible! 2 billion on a modest GPU is absurd. What tool were you using? My secp256k1 accelerator is doing 1 billion per second using point_add; the trick here is that it doesn't need to keep redoing multiplication on big ints all the time, just add a point on the curve, incrementing and summing. Regarding my seed, you're partially correct. The search space for the last 5 words is 2048^4 × 128, not 2048^5: the last word in BIP39 is a checksum of the first 11. This is the trick that will allow me to recover my Bitcoins in 3 years and not 40 years.
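The arithmetic behind that claim can be sketched in a few lines. The search-space figures are from the thread itself; the checksum structure (last 4 bits of a 12-word phrase's final word are a hash of the entropy) is standard BIP39. Note this only counts candidates; actually testing each one requires PBKDF2 key stretching and address derivation, which is far slower than raw point additions, hence years rather than days.

```python
# For a 12-word BIP39 phrase, the 12th word carries only 7 free bits:
# its last 4 bits are a checksum (from SHA-256) of the 128-bit entropy,
# so only 128 of the 2048 wordlist entries are valid final words.
WORDS = 2048

naive = WORDS ** 5           # all five missing words treated as independent
checked = WORDS ** 4 * 128   # only checksum-valid candidates

print(f"naive search space:   {naive:,}")
print(f"checksum-aware space: {checked:,}")
print(f"reduction factor:     {naive // checked}x")  # → 16x fewer candidates
```

The checksum prunes the space by exactly 16x (4 bits), which is why the comment's 2048^4 × 128 figure, equal to 2^51, is the right count rather than 2048^5.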
Hashcat, on GPU. If you have a general idea of the password it's doable; I cracked a LUKS2 volume with 2x 3080s, and it took two months.
That's the beauty of BIP38 - unlimited attempts, you just need patience... and a lot of GPU power 😅
Let's just say it was definitely worth the mass GPU hours 😉 But honestly, even if it was 0.01 BTC, the satisfaction of finally cracking it after 7 years would've been worth it.
Not really.... Cheap GPU means more nodes
Isn't the GPU price increase extremely bad for miners and Bitcoin? I'm worried; $5,000 for a 5090 seems insane.
Wen RAM, GPU, CPU RWAs.....?
Gridcoin (GRC) Started off much like Folding@Home where you donated GPU/CPU cycles to projects that actually directly contributed to real life projects and humanist goals. Dev soon enabled ASIC support when they were fresh on the scene and it destroyed the entire narrative of altruistic intent as the blockchain was immediately dominated by the few equipped to go bonkers on it then flood the market. An attempt was made to balance/scale the rewards eventually but not before the damage was done.
I was like, "I absolutely do find this interesting and would like to do it. However, my procrastination wetware dictates that I shelve it for later." After CPU mining is no longer viable. Then GPU, then FPGA, then ASIC (need a farm). Then the dip, and the dip after the peak...
RAM is in extreme demand and zero supply. GPUs were the same way back in 2020, and now they are about to be again. So Micron, Nvidia, and other AI partners will be even more crazy in the next 10 years.
"Nvidia incorporates Bitcoin into GPU development!" Ok, run that baby as a headline. The stonk and BTC price will fly
One could be SOL and the others in utility Altcoins, with solid projects based on RWA, AI, Cloud, Cybersecurity, GPU...
You haven't been paying attention if you don't think Bitcoin is a medium of exchange (yet still assuming you're just meaning like fiat and everyday product and service purchases)... Even so, look what Jack Dorsey and Square are doing... Jack Mallers and Strike... It's always been a medium of exchange, all the way back to Laszlo the pizza guy, May of 2010.

But it can be lots of things:

- Store of value
- Money
- Digital gold
- Digital capital
- Delayed proof of work
- And what I'm not wanting, which is ordinals and a spammer storage paradise

What's funny about pizza guy is that it really wasn't a medium of exchange, as fiat was still involved... which is really what Strike is doing, putting Bitcoin in between... Laszlo on Craigslist offers to pay 10,000 Bitcoin for pizza delivery, but the pizza delivery guy has to use his fiat to buy the pizza... You might not know it, but Laszlo did this multiple times! 40 to 50K? I think it's enough that he doesn't want to admit it, but Satoshi was blaming him for hogging all the mining and not getting Bitcoin distributed quickly enough, which is ironic, considering Satoshi now seems to be responsible for freezing over a million Bitcoin... Laszlo started GPU mining, by the way.

Bitcoin is supposed to be peer-to-peer but involves middlemen anyway because we're still so stuck on fiat and banking, and exchanges... Someday it may be more strictly peer-to-peer and fully disintermediate the middlemen... We've always been able to operate it that way if we want... since Hal Finney and Satoshi moved around the first 50 Bitcoin.
It's an app that uses GPU... I'll send you the link so you can see for yourself. https://t.me/ProfitHubMining_Bot?start=404207
Look into Layer 1 & 2 coins. At some point in the near future, one of data centers, miners, or hospitals is going to be the odd man out when it comes to sufficient access to power. A Bitcoin transaction takes about 7,000 kWh. Layer 1 & 2 blockchain transactions run on $3,500 GPUs that cost $0.30/hr to run and do thousands of transactions an hour. But it's a roulette spin which one hits.
You need HDD to mine? I thought GPU only.
Case for yes: throughput, consumer apps, active devs across Jupiter, Phantom, Drift. Keep core in BTC and ETH; add a measured SOL sleeve and review yearly. If you want utility overlap, follow Ocean Protocol for data, Akash for GPUs, Render for GPU work.
No, that's not happening anymore. The mining difficulty level we have today requires operations at industrial scales. You can't mine on a consumer grade CPU or GPU or even a small dedicated miner. The returns will be insignificant.
You can't. Miners are hardly achieving profits even with economies of scale and free electricity. Your best shot is to mine some GPU altcoin if you already have a beefy GPU available; do not buy one for this purpose, as the ROI is just not there.
BSC token: 99.9% are scams or memes. I can't name a single serious project on that chain. Twitter feed: more than 400K followers, all artificial/paid. All the posts are commented on only by bots. The pinned post is about a 30% staking revenue; why would a tech token need staking revenue? Everything about their Twitter feed screams SCAM. Chart: Goblin Town with no sign of recovery in sight. As for the tech: yeah, there's nothing noteworthy or groundbreaking. Another GPU-wannabe network. There are so many out there already; this one is probably the last one I would pick. I'm not even sure their data center exists, and if it does, why would it need a token?
In terms of processing I'd say no. I put in a Ryzen 7700 (which was about $300ish when I bought it) on a relatively cheap motherboard, but saying that's not powerful is relative to what you're comparing it to. My server is basically a mid- to lightweight gaming build. But it doesn't have a monitor or peripherals. I SSH into it over my home wifi, hence the 'headless' label. For context, it cost me about $1500 to build the server from parts.

I do have 10TB of onboard SSD (which is about half the cost), which is a bit overkill but allows me to index for faster querying. The entire blockchain on a node is compressed, but is still approaching 900GB. Unpacked and indexed, I'm looking at about 3.4TB of space needed just for the database. I have a script running live processing on every block as it arrives, so I'm always within a minute of true live data. I'm looking at upgrading my RAM this month from 32GB to 64GB, and I'll be adding a GPU at some point for some more advanced stuff...

I just stood up a web server ($5 a month and $10 for the domain), and I'm currently writing some cron jobs to push data up to the web automatically for those who like analytics. Hoping to build a community eventually... like Glassnode, but without a paywall. One can dream... I don't care to create a company... just to spread Bitcoin cheer and awareness.

If you're asking about power because of mining (a common misconception), which does require a lot of power, I felt it important to highlight that running a node is really light from a power perspective. It might cost me between $10-20 annually to run my node and analytics from an electricity point of view.
Probably just broad market weakness and rotation into Bitcoin/AI hype assets. ETH hasn't been catching strong inflows like BTC, liquidity's tight, and sentiment's tilted risk-off lately, so it's lagging despite any AI/GPU narrative.
I am super glad I finally decided to sell. I bought myself a ford fiesta (2015 model) at a great price, I now own a VR headset, and an all new GPU. It feels good to know I sold at 100K, Was literally one of the fest to h fresh and there was no turning back to such highs. I always knew bitcoin was not going to make it. But I am at least finally living the life
No need to buy it. You just mined it on your laptop GPU before exchanges. And the first exchanges were very early.
This guy invented GPU mining. He had much more Bitcoin than most other players at that time, Satoshi asked him to spend a bitcoin to allow for more to enter the ecosystem.
By this logic we should use AK-47’s and GPU’s as a smart store of tangible value.
TRUE. How can they spend their money on such a garbage trap? You have to choose the useful ones. I think RWA, GPU and cyber security will work well.
He doesn't. GPUs haven't mined Bitcoin since like 2015, and the last GPU miners basically went extinct in 2022 when Ethereum upgraded to proof-of-stake. I wish there were something worthwhile to mine; it was great heating my house with my gaming PC in these cold winter months.
That depends on where you're looking. Ten years you say? Funny, that's how long it's taken one project to hypothethise a better system and implement it this past year with a new computer language. It's the only player in the AI space that is not a large language model or requires a GPU. They have patents and are implementing the next generation of governance where conversations scale beyond millions of individuals so everyone can have a voice. Gone are the days of weighted bias for today's DAO's where more money = more votes. Decentralized development is taking a leap at Tau.net.
Laszlo Hanyecz wasn't a trader though. He spent bitcoin. It's ok to spend bitcoin but trading bitcoin isn't a wise idea. Laszlo Hanyecz invented GPU mining and he owned so many bitcoins that Satoshi actually encouraged him to spend a bunch of them to get them into circulation. Hanyecz estimates that he spent 100,000 of them on pizza in 2010. We celebrate that day because it was the first known time that someone purchased something with bitcoin.
He wasn't trading. It's ok to spend bitcoin but trading bitcoin isn't a wise idea. He invented GPU mining and he had over 100,000 bitcoins, and he spent a bunch of them on pizza. It actually was good of him to get those bitcoins out into circulation instead of hoarding 100,000 bitcoins only a year after bitcoin was launched.
Did you mine more coins with the new GPU or did you simply stop?
Because they are worthless. If you are looking for an altseason you have to see serious useful projects such as GPU, AI, cybersecurity...not memes
If you’re trying to mine actual Bitcoin with an RX 7600, sorry bro… you’ll find gold in your backyard sooner. Your GPU can mine other coins via NiceHash. It pays you in BTC so you still feel like a Bitcoin miner🤣
Missed GPU mining by a hair (about a decade). Forget it. Mining is an industrial scale competitive business. It's like drilling for oil with a tea spoon.
Again, sort of. Yes for gold, but no for GPUs and RAM. That was definitely the case previously, but now the increases are due mainly to AI.

The energy cost of mining and the cost of transactions for BTC are one and the same. A BTC transaction is validated on chain in a block. A block is completed when the BTC within the block is successfully mined. The difficulty, i.e. the average amount of energy required to solve a block, goes up and down depending on the amount of time the previous blocks took to solve/mine.

By and large, the whole "Bitcoin uses enough energy to power a small country" narrative was promoted by parties who were against crypto, not the energy usage itself. You can see this is the case with Bill Gates' 180-degree pivot from "Bitcoin energy usage is bad" to "AI energy usage is necessary," for example. But going back to your original point, I agree with you for the most part.
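The retargeting rule described above can be sketched directly. The constants (2016-block window, 600-second block target, 4x clamp) are Bitcoin's actual consensus parameters, though this is a simplified model of the rule, not the exact integer arithmetic of Bitcoin Core:

```python
def retarget(old_target: int, actual_seconds: int) -> int:
    """Adjust the proof-of-work target after a 2016-block window.

    A higher target means easier mining; if the window was solved
    faster than two weeks, the target shrinks (difficulty rises).
    """
    expected = 2016 * 600  # two weeks at 10 minutes per block
    # Bitcoin clamps the adjustment to a factor of 4 either way.
    clamped = max(expected // 4, min(actual_seconds, expected * 4))
    return old_target * clamped // expected

# Blocks arriving twice as fast halve the target (double the difficulty).
print(retarget(1_000_000, 2016 * 300))  # → 500000
```

This feedback loop is why miners joining or leaving changes block-solving energy only temporarily: within one retarget window the difficulty catches up and the 10-minute average is restored.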
Yea but the actual action of trading the gold for goods and services only uses energy with your arm muscles. Crypto currency is sucking up energy via mining AND transactions. It’s also causing GPU and RAM prices to skyrocket. Look at RAM currently… that has to be factored in as well.
Even if BTC faces a down cycle, solid DePIN projects are still building real infrastructure. AIOZ with decentralized streaming, RNDR with GPU rendering, and TAO powering IoT nodes, all continue growing, these are the plays that matter long term. HODL guys.
Let me explain in economic terms why teeth will never replace Bitcoin, no matter how “whitepaper-ready” they look. --- ## Why Teeth Are NOT the Next Bitcoin *(But would make a great meme coin.)* ### **1. Scarcity? Yes. But… too biological.** Bitcoin: fixed supply of 21 million. Teeth: fixed supply of **32**, minus whatever your childhood self already traded to the Tooth Fairy for quarters. Ultra-scarce? Yes. *Investable?* Not unless you want a smile that resembles a broken picket fence. --- ### **2. Permissionless? Technically… no.** Bitcoin: Anyone can mine. Teeth: You technically need: * a dentist’s approval * anesthesia * emotional support * possibly a ride home This is **NOT a decentralized extraction protocol**. --- ### **3. Trustless? Hard no.** Bitcoin lets you transact with strangers without trust. Teeth require: * trust in the tooth’s origin * trust that it’s not “freshly harvested” * trust that the person didn’t just pull it out in the parking lot like a barbarian *No one wants KYC to stand for “Know Your Canines.”* --- ### **4. Fungibility Problem** Bitcoin: every coin is equal. Teeth: * This one has a cavity * This one is yellow * This one is suspiciously sharp * Why does this one have braces still attached? It’s basically the opposite of fungible. It’s **funny-gible**. --- ### **5. Portability** Bitcoin: fits on a USB or your brain for all I care. Teeth: Imagine carrying a bag of loose molars everywhere. That’s not a “wallet”—that’s **evidence**. --- ### **6. Durability** Bitcoin lasts forever. Teeth… well: * Coffee * Sugar * Forgetting to floss * Accidental popcorn kernel attack Your currency shouldn’t be defeated by caramel. --- ### **7. Permissionless Mining** Bitcoin miners: GPU rigs plugged into the wall. Tooth miners: Oral surgeons with a 6-year degree and a dental drill. Not very inclusive for the average degen. --- ### **8. Deflationary Mechanism** Bitcoin: halving events. 
Teeth: “oops I chipped it on a tortilla chip.” You don’t want monetary policy dictated by snack foods. --- ### **9. Self-custody Is a Nightmare** "Not your keys, not your coins." "Not your teeth, not your… gums?" Losing your hardware wallet is stressful. Losing your tooth-wallet is a trip to the ER. --- ### **10. Network Effects** Bitcoin has millions of users. Teeth currency would have: * dentists * pirates * the Tooth Fairy That’s not an economy. That’s a children’s book. --- ## Final Verdict While teeth are: * scarce ✔️ * unique ✔️ * hard to fake (usually) ✔️ They are also: * gross ❌ * painful to “mine” ❌ * not accepted at Starbucks ❌ * likely to get you arrested ❌ So yeah, Bitcoin will remain safe from being replaced by molar-based money, at least until someone launches **ToothCoin (TTH), “Because value should hurt.”**
ASICs are basically the SHA-256 algorithm in hardware; they physically can't do anything other than hash. It's not a CPU. It's not a GPU.
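The entire job that hardware does is SHA-256 applied twice to an 80-byte block header. A minimal sketch, using Bitcoin's well-known genesis block header as input (the header hex is a public constant); a mining ASIC computes exactly this, billions of times per second over different nonces:

```python
import hashlib

def block_hash(header: bytes) -> str:
    # Bitcoin's proof of work: SHA-256 applied twice,
    # with the digest displayed byte-reversed.
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return digest[::-1].hex()

# The 80-byte genesis block header: version, previous hash (all zeros),
# merkle root, timestamp, difficulty bits, and the winning nonce.
genesis = bytes.fromhex(
    "0100000000000000000000000000000000000000000000000000000000000000"
    "000000003ba3edfd7a7b12b27ac72c3e67768f617fc81bc3888a51323a9fb8aa"
    "4b1e5e4a29ab5f49ffff001d1dac2b7c"
)

print(block_hash(genesis))
# → 000000000019d6689c085ae165831e934ff763ae46a2a6c172b3f1b60a8ce26f
```

Because the circuit is nothing but this double hash etched into silicon, an ASIC is orders of magnitude faster per watt than a CPU or GPU at mining, and useless for anything else.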
Examples: Ocean’s Compute-to-Data for safe AI on sensitive datasets, Akash’s growing GPU marketplace, Render’s distributed rendering and AI work, Bittensor’s incentive-driven subnets, AIOZ’s DePIN for storage, streaming, and edge AI.