DGX
Quest Diagnostics Incorporated
Mentions (24Hr)
0.00% Today
Reddit Posts
Be honest. How many of y'all are just mad and petty you missed the boat on NVDA?
Nvidia earnings to offer first true glimpse of the AI windfall
Tech companies develop edge-computing AI systems to build an AI ecosystem
Seeking Feedback on my Stock Earnings Digest App using ChatGPT!
Wall Street analysts expect Nvidia stock ($NVDA) to surge 25%; are they underestimating?
Unleashing the Hybrid Cloud AI Revolution: Nvidia's DGX, IBM's Ansible, and the Perfect Storm
NVIDIA Co. (NASDAQ:NVDA) Shares Purchased by Polaris Wealth Advisory Group LLC
Nvidia releases a new "nuclear bomb"; Google's chatbot is also on the way; computing-power stocks hit limit-up again
Nvidia: Excellent Quarterly Earnings and On Its Way to Next Trillion-Dollar Company
DGX stock slips as outlook disappoints amid hit to COVID-19 test revenue (NYSE:DGX)
"$CEMI - CHEMBIO DIAGNOSTICS - BUY UNDER .50"
"$CEMI - BUY UNDER .50 BEFORE THE NEXT P.R. TRIPLES THE SHARE PRICE"
"$CEMI - Chembio Diagnostics - Buy Under .50"
"$CEMI RECEIVES $3.25M C.D.C. CONTRACT"
Intel falls 10% after disappointing Q2 results: $0.29 EPS vs $0.70 expected. $15.3 billion in revenue vs $18 billion expected. CEO says third quarter is bottom
Intel falls after disappointing Q2 results: $0.29 EPS vs $0.70 expected. CEO says third quarter is bottom
At 6.2% CAGR, Viral Disease Diagnosis Market Size to Reach US$ 30,046.1 Mn by 2030 | Rising Prevalence of COVID-19 and Advances in Clinical Research and Molecular Diagnostic Technology is Expected to Drive Market Growth $DGX $LH $CODX
Monkeypox declared a global health emergency by the World Health Organization $DGX $LH
$DGX: Quest Diagnostics beats by $0.10, beats on revs; guides FY22 EPS above consensus, revs above consensus (134.84)
Quest Diagnostics (DGX), CDC Sign New COVID-19 Testing Deal
$DGX Big Government Contracts Plus Rise In Monkeypox And Covid meaning more testing…means more contracts coming in going “under the radar”
CDC Newsroom-Wednesday, July 13, 2022 Quest Diagnostics will begin testing for monkeypox. $DGX
$DGX- EPS of $2.36 per share, beating the Zacks Consensus Estimate of $2.26 per share.
Quest Diagnostics Lifts Annual Guidance On Higher COVID-19 Test Revenue Anticipation $DGX
HHS orders additional vaccine, increases testing capacity to respond to monkeypox outbreak $DGX $LH
HHS Expanding Monkeypox Testing Capacity to Five Commercial Laboratory Companies $DGX $LH
Quest Diagnostics ($DGX) Tops Q2 Earnings and Revenue Estimates
Norges Bank (NORWAY) - Potentially Something HUGE Here
$NXOPF - NexOptic - STILL MAKING ITS RUN
$DGX Quest Diagnostics is about to SMASH Expectations
$DGX suddenly a PRIME Short Squeeze candidate DD
$DGX Quest Diagnostics - 6 Figure Nasal Reparations
$DGX Quest Diagnostics Earnings Recap - Go Baby Go
$DGX Quest Diagnostics IV - Earnings Extravaganza
$DGX Quest Diagnostics Pt. 3 - Leading a Redditor to Tendies
Mentions
For context: A 'Kkanbu Alliance' of AI between leading companies from Korea and the United States was formed at a Korean chicken restaurant. Jensen Huang, CEO of NVIDIA, Lee Jae-yong, Chairman of Samsung Electronics, and Chung Eui-sun, Chairman of Hyundai Motor Group, met on the 30th at a chicken restaurant in Gangnam-gu, Seoul, for a three-way 'chimaek' (chicken + beer) gathering. The meeting lasted for about three hours, including the NVIDIA event held nearby on the same day. ● Unprecedented Meeting of Corporate Leaders The meeting was unprecedented. CEO Huang remarked, "Today is the best day of my life." The leaders of global companies, including NVIDIA, Samsung Electronics, and Hyundai-Kia Motors, with a combined market capitalization of approximately KRW 8,300 trillion, visited the 'Kkanbu Chicken' store near Samseong Station in Gangnam-gu, Seoul, and enjoyed a public chimaek in front of hundreds of citizens. CEO Huang entered the chicken restaurant with Chairman Chung around 7:20 PM after arriving in Korea. He wore his signature black leather jacket and a black T-shirt. Chairman Lee arrived about five minutes later and embraced CEO Huang. Chairman Lee and Chairman Chung also wore casual white T-shirts. This gathering was arranged because CEO Huang wanted to experience Korea's chimaek culture. CEO Huang ordered fried chicken, spicy sea snails, and cheese sticks to share with Chairman Lee and Chairman Chung. They drank beer and also tried 'soju tower,' a device for mixing soju and beer, consuming several glasses. The three exchanged drinks in a 'love shot' style. When CEO Huang exclaimed, "Dinner is Free," Chairman Chung replied, "I'll cover the second round." However, it is reported that Chairman Lee actually paid the bill. The total meal cost at the restaurant was approximately KRW 2.5 million. 
● Kkanbu Alliance Continued into the Night Jensen Huang, CEO of NVIDIA, takes a commemorative photo with Lee Jae-yong, Chairman of Samsung Electronics, and Chung Eui-sun, Chairman of Hyundai Motor Group, after a 'chimaek' gathering at Kkanbu Chicken in Gangnam-gu, Seoul, on the 30th. The informal demeanor of the global corporate leaders was broadcast live during the meeting. CEO Huang left his seat to distribute kimbap, banana milk, and chicken to citizens. During this time, Chairman Lee remarked, "It's been about ten years since I last had chimaek," to which Chairman Chung replied, "I eat it often." CEO Huang gifted Chairman Lee and Chairman Chung a bottle of Japanese Hakushu 25-year whiskey worth approximately KRW 7 million and NVIDIA's 'DGX Spark' AI supercomputer. The gifts were signed with the message, "TO OUR PARTNERSHIP AND FUTURE OF THE WORLD!" Lee Jae-yong, Chairman of Samsung Electronics, distributes chicken to citizens during a 'chimaek' gathering with Jensen Huang, CEO of NVIDIA, at Kkanbu Chicken in Gangnam-gu, Seoul, on the 30th. The choice of venue, 'Kkanbu,' which means close friend, was interpreted as a nod to the famous line "We are kkanbu" from the Netflix drama 'Squid Game.' CEO Huang stated, "I enjoy chimaek with friends, so Kkanbu is the perfect place." He repeatedly expressed, "So good. So Happy," at the chicken restaurant. Chairman Lee, leaving the restaurant, commented, "Happiness is nothing special. It's about enjoying good food and drinks with good people." The late-night chimaek gathering, lasting about an hour and twenty minutes until 8:40 PM, continued at the 'GeForce Gamer Festival' hosted by NVIDIA at COEX in Gangnam-gu, Seoul.
I just bought a DGX from NVDA… nah, if they were that sold out why even bother sending me a desktop…
Bought the dip on MSFT and continuing to hold DGX and AZO
I need DGX and AZO to start moving up
50% of Apple's sales are iPhones and they're a $4T company. Nvidia is doing something exponentially greater for humanity. I've been on this roller coaster for 7 years, I've heard every bear argument since then. The reason I bought Nvidia was bc of the DGX, and I thought..wow if they can solve autonomous driving then they will be worth a lot of money. I didn't think they'd have gotten here at that time. There's a lot more to come.
I'm gonna need DGX and AZO to pump
DGX, AZO, no bias
I work in HPC. Why would I spend thousands of my own money on too much hardware, when we have racks of DGX hardware? I wasn't born with money. I earned it, by working hard, saving hard, doing without. And by *not* spending it when stocks are stupidly overvalued. Buffett isn't either.
Autozone (AZO) and Quest Diagnostics (DGX), hoping to buy low after they have taken a beating last week
Medical diagnostics: VCYT, DGX. Legal: like DISCO. Financial services, if you believe mortgage processing is going to face disruption. Customer service: NICE and Verint - great potential, but giant companies. Professional services: Workday, for example.
Digipower X DGX - The next runner
By happenstance, I currently work in HPC and have racks of DGX hardware. AI is in a big fucking bubble. Go "Big Badda Boom" soon.
AI is to be monetized through many promising AI projects; many specialized generative AI models can be sold to many companies. CorrDiff AI extreme weather modeling. The GE partnership, specifically their sonomet AI ultrasound and CT projects. RadImageGAN generative AI medical imaging, along with Deeptek, for AI-augmented radiology projects. Clara Parabricks genome sequencing AI software. Nvidia DRIVE solution for autonomous vehicles. Nvidia Omniverse, partnered with Siemens, for AI industrial production applications. Also some US government contracts/projects using DGX SuperPod and working with DARPA. All of these revolutionize the spaces they've been applied to. The amount of potential revenue derived from these projects could be insane going forward, and they're all just getting started. The scope is broad, and AI is only limited by power consumption at this point. There's an AI arms race going on across the globe and there's no reason to stop unless constricted by power restraints and production timelines.
> NVIDIA has been throwing billions at AI infrastructure companies but they don't have any optical interconnect plays in their portfolio. Nvidia calls them NVLink you regard. For inter-node (server-to-server) GPU communication, NVIDIA also integrates InfiniBand and NVLink Switch Systems (used in Grace Hopper and DGX SuperPODs). These extend NVLink-like performance beyond a single chassis.
Oh and one last thing. This was from RXRX Article itself and they used BioNeMo and guess what QSI proprietary platform is built on in collaboration with Nvidia? well, not a long shot here but just piecing information to all the similarities. QSI: "We are thrilled to collaborate with NVIDIA to make single-molecule proteomics more accessible to researchers," said John Vieceli, Ph.D., Chief Product Officer of Quantum-Si. We have been leveraging AI protein structure prediction tools with [NVIDIA BioNeMo](https://cts.businesswire.com/ct/CT?id=smartlink&url=https%3A%2F%2Fnam04.safelinks.protection.outlook.com%2F%3Furl%3Dhttps%253A%252F%252Fwww.nvidia.com%252Fen-us%252Fclara%252Fbiopharma%252F%26data%3D05%257C02%257Ckatkinson%2540quantum-si.com%257C0f550c9f118049de621408dd05c4bfd3%257C48afde5b18304e18a221f6417a5a1bde%257C0%257C0%257C638673064827649542%257CUnknown%257CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%253D%253D%257C0%257C%257C%257C%26sdata%3D1FrNzFk8l3KEsejU8%252BV87mg2fw79CNi9nCAvD6zCP%252Fw%253D%26reserved%3D0&esheet=54155579&newsitemid=20241120405534&lan=en-US&anchor=NVIDIA+BioNeMo&index=2&md5=c77614581e3d4ab90f107177f38939c0), both in the cloud and on-premises to design new and improved biomolecules. Now, we are excited to apply NVIDIA technology for downstream data processing and interpretation applications for Proteus." 
[Quantum-Si to Develop Acceleration Platform and Advance Core Technologies in Collaboration with NVIDIA](https://finance.yahoo.com/news/quantum-si-develop-acceleration-platform-120000059.html) RXRX: "Recursion plans to utilize its vast proprietary biological and chemical dataset, which exceeds 23 petabytes and 3 trillion searchable gene and compound relationships, to accelerate the training of foundation models on [NVIDIA DGX™ Cloud](https://www.globenewswire.com/Tracker?data=HbfHhJGLFLovux_4GAinPDR2wH9w0m3CGf1Fb9Ct-PV00DWmzZS9HUvNao6gCV8tcLmGKs_X3yLyrEcPn3l86GtKVxsoPehDzmGbHLWQcCbrh2f1T6ms3yrlyJPPSgqR) for possible commercial license/release on BioNeMo, NVIDIA’s cloud service for generative AI in drug discovery. NVIDIA will also help optimize and scale Recursion foundation models leveraging the NVIDIA AI stack and NVIDIA’s full-stack computing expertise. [BioNeMo](https://www.globenewswire.com/Tracker?data=8Un3Nqj2782TnmY-gziLiq0rP2rheA9QTSi5YT8FH4GaI8kTUFZwcrGNhHakA5U9GZF9uKpnaNotEZ4CX6BVB-lb0W86a_aBQM48XFxUQWk=) was announced earlier this year as a cloud service for generative AI in drug discovery, offering tools to quickly customize and deploy domain-specific, state-of-the-art biomolecular models at-scale through cloud APIs. Recursion anticipates using this software to support its internal pipeline as well as its current and future partners." [Recursion Pharmaceuticals, Inc. - Recursion Announces Collaboration and $50 Million Investment from NVIDIA to Accelerate Groundbreaking Foundation Models in AI-Enabled Drug Discovery](https://ir.recursion.com/news-releases/news-release-details/recursion-announces-collaboration-and-50-million-investment)
$rxrx Recursion Pharmaceuticals those calls are printing.. Only matter of time for this to explode Recursion Pharmaceuticals and NVIDIA have a multi-faceted partnership focused on AI-driven drug discovery. NVIDIA invested $50 million in Recursion and provided access to its AI expertise and supercomputing hardware, which powers Recursion's own supercomputer, [BioHive-1](https://www.google.com/search?sca_esv=913ae395eea29b77&rlz=1C5ZNUK_enUS1142US1145&cs=1&sxsrf=AE3TifPX-RSqBlMT-4P1XuSxZMeECRCPJg%3A1759330712825&q=BioHive-1&sa=X&ved=2ahUKEwiNtJXRoYOQAxWQIjQIHeSJIKUQxccNegQIBRAB&mstk=AUtExfAIItCYqVo_vzCNva9h1hGDR5AvOxgytsx4IosaxqdUh4obFGJIfSiPgUk18SnH874hcXWu9oijPL50OWY4ypiEzwgKdjjmuf7rKQESOy3MAfnsMiuTMqSLllgjcZev62RfDDb9WxHGMaF9be8wfIg7uzZVQBXn4eQN_AHrncdOI4uu932FLGkoHsMe2sM0gFq3V02UYWHYwO9ZVIZhSsFLGAlonJJaYrjGrV-Y953jDUEQU54AccsbIRWUNjO2Qo6rYsWLaB3C79SbhrsFy6Uf&csui=3). The collaboration aims to accelerate the development of AI-powered foundation models for drug discovery by leveraging Recursion's vast biological and chemical datasets and NVIDIA's leading AI platform, including its [DGX systems](https://www.google.com/search?sca_esv=913ae395eea29b77&rlz=1C5ZNUK_enUS1142US1145&cs=1&sxsrf=AE3TifPX-RSqBlMT-4P1XuSxZMeECRCPJg%3A1759330712825&q=DGX+systems&sa=X&ved=2ahUKEwiNtJXRoYOQAxWQIjQIHeSJIKUQxccNegQIBxAB&mstk=AUtExfAIItCYqVo_vzCNva9h1hGDR5AvOxgytsx4IosaxqdUh4obFGJIfSiPgUk18SnH874hcXWu9oijPL50OWY4ypiEzwgKdjjmuf7rKQESOy3MAfnsMiuTMqSLllgjcZev62RfDDb9WxHGMaF9be8wfIg7uzZVQBXn4eQN_AHrncdOI4uu932FLGkoHsMe2sM0gFq3V02UYWHYwO9ZVIZhSsFLGAlonJJaYrjGrV-Y953jDUEQU54AccsbIRWUNjO2Qo6rYsWLaB3C79SbhrsFy6Uf&csui=3). This partnership will allow Recursion to create new medicines and also enables them to license AI tools to other drug hunters. Key Aspects of the Partnership
Sorry bears, but if NVDA was just financial-engineering its success, AMD and INTC would be doing it too. They even gave free compute to OpenAI in 2016! A whole DGX! And obviously all that achieved was cooking the books.... Lots of fools revealing themselves, akin to Deepseek day.
Jensen wanted the entire world's AI workload on NVDA's stack-CUDA, H100, DGX, TensorRT. Gyna said no. lol.
Much of AMD's valuation hinges on the hope they will eventually compete with NVDA in the lucrative AI GPU market. But thus far it's been nothing but hopes. AMD's stock price is the same as it was in late 2021. Meanwhile NVDA and AVGO left 2021 in the dust. With AMD, you'd have to select windows of time to make its performance look good, but even then it's not parabolic like other AI winners. AVGO custom chips won't replace NVDA chips; they are just going to handle lower-end workloads that don't require top performance. The problem for AMD is not so much what they have done wrong; it's more what NVDA has done right. NVDA built an ecosystem which has been the AI platform for a decade - since they shipped the first DGX cluster to OpenAI 10 years ago. That's right, 10 years ago. It's tough to catch up with the first mover; you are copying them while they are already working with customers on next-generation improvements, always at least a step ahead. > invested in AMD because I always considered it a cost-leading alternative to NVDA And you have what material data to back up these claims?
It's amusing how the masses think something like AI could just spring up out of nowhere overnight. DeepMind was formed 15 years ago. They made an AI chess engine that could easily outclass the best humans - not even a chance. OpenAI formed 10 years ago and got the first NVDA DGX cluster. It's just a matter of whether you had any relation to the field or any interest in it - it wasn't developed in any sort of secrecy. In the early to mid 2010s, ML had started gaining traction as a potential degree option. I started building my NVDA position in the 2017/18 timeframe after reading an article where several Silicon Valley seed investors were interviewed and asked which publicly traded company they would invest in - the top choice by far was NVDA because of the future of AI.
Actually, GPUs were already being used in machine learning a decade ago. Nvidia went over this at GTC the following year, in 2016, and released DGX.
I remember when Nvidia released the Volta based DGX computer for something like $150k and thinking to myself wow that’s cheap, and Reddit was all LOL can it play crysis. Back then if you want HPC for scientific compute, or do AI research, you need a decent size team to babysit the hardware and software. DGX and the entire Nvidia stack make it much more accessible for smaller companies. They’re running the same playbook for robotics. By the time the industry realizes they can’t build robots without Nvidia, Reddit will once again accuse Nvidia of being lucky while kicking themselves for not buying NVDA at $200.
German tech firm sues Nvidia for patent infringement, seeks to block Nvidia across 18 European countries — ParTec lawsuit alleges DGX AI supercomputer design theft.
When you buy or rent Nvidia DGX you have the networking already. Nvidia can scale thousands of DGX clusters thanks to NVLink for inter-GPU and Nvidia Photonics / InfiniBand / NCCL for intra-cluster connectivity. The Nvidia NVL rack is equipped with the world's most advanced memory-transfer technology on the planet, as well as the highest-performance AI computing power. Google's Ironwood is not even available in Google Cloud. And the previous TPU can't scale in the same way as Nvidia NVL.
someone leaked DGX report early. jumping in afterhours
KO, GPC, DGX puts ong ong 💯 wallahi
Last year, Nvidia made an unusual proposal to Amazon Web Services and other cloud providers that have long been the biggest buyers of Nvidia’s specialized artificial intelligence server chips. Nvidia wanted to lease Nvidia-powered servers in the cloud providers’ data centers so it could turn around and rent the same servers to AI software developers. Those developers included some of the biggest cloud customers in the world. As the discussions progressed, Nvidia’s leverage increased. Demand for Nvidia-powered servers exploded among AI software developers following the launch of OpenAI’s ChatGPT in November, and the cloud providers soon couldn’t keep up. In that delicate moment, Nvidia saw a way to essentially compete with the cloud providers for customers. Nvidia’s trump card? It was about to release a much anticipated new AI chip, the H100, which the traditional cloud providers needed. Microsoft, Google and Oracle agreed to Nvidia’s proposal but AWS did not, according to a person with direct knowledge of the decision. ... For traditional cloud providers, the rise of DGX Cloud risks turning them into intermediaries. For instance, ServiceNow uses DGX Cloud to develop AI that summarizes IT requests and powers customer service chatbots. John Sigler, a senior vice president at the IT software giant, said the Nvidia service makes it easier for ServiceNow to run its new AI software in its own data centers as well as across multiple cloud providers simultaneously because it can use a “single software platform” from Nvidia to manage the process. Not sure if it works, but it's a great gambit to commoditize the CSPs. Coreweave is another example. Nvidia will create its own cloud customers for its products to keep the CSPs in line. Source -
NVDA sells the entire DGX cluster, which is used for both modeling and inferencing. Lower-powered GPUs can be used for inferencing small-scale solutions - I mean, I can run AI models on my desktop, and for me it works fine - but it would take hours/days/weeks if I had to share that load with others. It doesn't really work so well when a large commercial platform is using it at scale.
Yeah, I'm not talking about the current iteration or even generative LLMs in general, but other AI applications, and this is only what we build upon. Artificial general intelligence is coming within probably the next 5 years. Then soon after that comes artificial super intelligence, where not only the top people in their fields are surpassed, but the processing power of the entire human population collectively, including the smartest individuals on Earth, is surpassed by one single AI. At that point we don't know what happens. Things like sonomet AI ultrasound and CT, RadImageGAN generative AI medical radiological imaging, Clara Parabricks genome sequencing AI software, and Evo 2 are much more interesting and make the likes of ChatGPT look boring. Then there are the US government projects using DGX SuperPod working with DARPA, and who knows what they're working on.
Buying myself a DGX spark to celebrate today
Deepseek 671b 4bit quant with a CPU and RAM runs at about 3.5 to 4 tokens per second. Whereas the exact same Deepseek 671b 4bit quant model on a GPU server like the Nvidia DGX B200 runs at about 4,166 tokens per second Tldr this is an insanely regarded take
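Taking the comment's own throughput figures at face value (they are quoted Reddit numbers, not benchmarked here), the implied speedup is easy to sanity-check:

```python
# Sanity-check the CPU-vs-DGX throughput figures quoted in the comment above.
# The tokens/sec numbers come from the comment itself; only the division is new.
cpu_tps_low, cpu_tps_high = 3.5, 4.0   # Deepseek 671b 4-bit quant on CPU + RAM
gpu_tps = 4166.0                        # same model on an Nvidia DGX B200 (as quoted)

speedup_low = gpu_tps / cpu_tps_high    # conservative: against the faster CPU figure
speedup_high = gpu_tps / cpu_tps_low    # optimistic: against the slower CPU figure

print(f"implied DGX B200 speedup: {speedup_low:.0f}x to {speedup_high:.0f}x")
```

On the quoted numbers, the DGX server comes out roughly three orders of magnitude faster, which is the comparison the comment is driving at.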
Quest Diagnostics (DGX) reports earnings Tuesday morning. Decent P/E and dividends. And it is one of the stocks likely to be benefiting from UHC's increase in costs.
DGX is probably one of those responsible for UHC's increased costs.
They would want to buy the Nvidia DGX stuff.
We have a “DGX” In my area, in under a sky scraper. It’s their urban version and honestly it’s quite handy and cheaper than Target for things like toilet paper and other household staples
No, I don't. Here are some of the more promising AI projects I'm way more interested in. CorrDiff AI extreme weather modeling. The GE partnership, specifically their sonomet AI ultrasound and CT projects. RadImageGAN generative AI medical imaging, along with Deeptek, for AI-augmented radiology projects. Clara Parabricks genome sequencing AI software. Nvidia DRIVE solution for autonomous vehicles. Nvidia Omniverse, partnered with Siemens, for AI industrial production applications. Also some US government contracts/projects using DGX SuperPod and working with DARPA. All of these revolutionize the spaces they've been applied to. The amount of potential revenue derived from these projects could be insane going forward, and they're all just getting started. Said this 👇 back in 2017 and it still applies. I said this before the birth of ChatGPT, before AI was a trend. http://stocktwits.com/mikel3113/message/71190466 This is just my opinion and you don't have to agree with it, but nonetheless this is still how I think of it. Stick to your theory if you wish and we can agree to disagree.
* Announced that NVIDIA will serve as a key technology partner for the $500 billion Stargate Project.
* Revealed that cloud service providers AWS, CoreWeave, Google Cloud Platform (GCP), Microsoft Azure and Oracle Cloud Infrastructure (OCI) are bringing NVIDIA® GB200 systems to cloud regions around the world to meet surging customer demand for AI.
* Partnered with AWS to make the NVIDIA DGX™ Cloud AI computing platform and NVIDIA NIM™ microservices available through AWS Marketplace.
* Revealed that Cisco will integrate NVIDIA Spectrum-X™ into its networking portfolio to help enterprises build AI infrastructure.
* Revealed that more than 75% of the systems on the TOP500 list of the world’s most powerful supercomputers are powered by NVIDIA technologies.
* Announced a collaboration with Verizon to integrate NVIDIA AI Enterprise, NIM and accelerated computing with Verizon’s private 5G network to power a range of edge enterprise AI applications and services.
* Unveiled partnerships with industry leaders including IQVIA, Illumina, Mayo Clinic and Arc Institute to advance genomics, drug discovery and healthcare.
* Launched NVIDIA AI Blueprints and Llama Nemotron model families for building AI agents and released NVIDIA NIM microservices to safeguard applications for agentic AI.
* Announced the opening of NVIDIA’s first R&D center in Vietnam.
* Revealed that Siemens Healthineers has adopted MONAI Deploy for medical imaging AI.
I wonder if GB10 Project DGX will be allowed to be sold in China?
HIMS continues on its Quest. Announcement today purchase of Trybe Labs at home lab testing is indeed a big deal. The testing co I'm most familiar with, Quest Diagnostics (DGX $18B cap) made approx $10B+ in testing last year, and that market is expected to keep rising for the foreseeable future. I'd (wild) guess that HIMS, with their aggressive marketing to its rapidly growing user base could potentially add as much as $1B rev from adding this service in the 1st year alone. Up 25% now.
Better jump on that bandwagon quickly. GRAL, DGX, JNJ, CVS, JAZZ are all great picks in the healthcare and consumer staples sector. I just bought GOLD (Barrick), based on how it’s near the bottom, posting increasing profits, and moving forward with a stock buyback program. I disagree with your assessment that gold is overbought. I think it is the opposite. If you have an open mind to other commodities, energy is a great choice, including uranium. I have also bought LIT, and would be okay with oil.
[https://arxiv.org/abs/2408.14158](https://arxiv.org/abs/2408.14158) > we deployed the Fire-Flyer 2 with 10,000 PCIe A100 GPUs, achieved performance approximating the DGX-A100 while reducing costs by half and energy consumption by 40%. We specifically engineered HFReduce to accelerate allreduce communication and implemented numerous measures to keep our Computation-Storage Integrated Network congestion-free. Through our software stack, including HaiScale, 3FS, and HAI-Platform, we achieved substantial scalability by overlapping computation and communication. Our system-oriented experience from DL training provides valuable insights to drive future advancements in AI-HPC.
MYNZ compliance will help now to get more trust. Interesting latest news, with a clinical study and partnerships with DGX + TMO, multibillion-dollar companies.
6 million on model training does not equal a 60 million dollar hardware capex. As far as I know they trained their model on 50,000 H100 Nvidia GPUs. That's over 6,000 fully packed DGX systems, which cost more than 50k each. You're looking at a minimum of half a billion dollars in hardware capex just to get started for those kinds of volumes. Inferencing may well be cheaper, but all that means is you can have a much larger number of players in the market renting the services. Any loss of revenue per user will be more than made up for by the orders-of-magnitude higher number of users that will access the technology now. Demand for Nvidia's chips will only go up, since now there is evidence that even smaller players can enter the game, widening Nvidia's customer base. Also, any tech investor who doesn't know that this is a rug-pull sector should not invest in tech, or should stick to a mutual-fund type of product. If you parade around naked on horseback, of course someone will come out and say that you have no clothes on.
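The system count and capex floor implied by that comment can be reproduced from its own figures. Everything here is an assumption from the comment (50,000 GPUs, ">$50k per system") plus one added assumption of 8 GPUs per fully packed DGX; none of it is an official Nvidia or Deepseek number:

```python
import math

# Capex floor implied by the comment's own figures.
gpus = 50_000          # GPU count claimed in the comment
gpus_per_dgx = 8       # assumption: one "fully packed" DGX holds 8 GPUs
cost_per_dgx = 50_000  # comment's stated lower bound per system, USD

systems = math.ceil(gpus / gpus_per_dgx)
capex_floor = systems * cost_per_dgx

print(f"{systems:,} systems, ${capex_floor/1e6:.1f}M minimum")
```

This gives 6,250 systems, matching the comment's "over 6,000", and a floor around $312M; the comment's "half a billion" figure implies a per-system price closer to $80k, which is if anything still conservative, since DGX-class systems have listed well above $50k.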
MYNZ is definitely one of my top long-term penny stock picks. Their innovative approach to cancer diagnostics, especially with ColoAlert, and partnerships with major players like Quest Diagnostics (DGX) and Thermo Fisher Scientific (TMO) give them a solid foundation for growth. The potential FDA approval is a game-changer, and I'm confident in holding this one for the long haul
Have you noticed MYNZ's recent performance? They were at a 52-week low, but they're doing exceptionally well now following their partnerships with DGX and TMO.
hold those shares until we get that US approval in the first quarter of 2025, sheesh, this is now really making sense. thanks for that news from DGX and TMO yesterday, MYNZ might hit its ATH today!
# Vishay: The Hidden Key Winner in Nvidia’s AI Servers and RTX 50 Series Supply Chain
[https://medium.com/@mingchikuo/vishay-the-hidden-key-winner-in-nvidias-ai-servers-and-rtx-50-series-supply-chain-vishay-3-0-526aec187854](https://medium.com/@mingchikuo/vishay-the-hidden-key-winner-in-nvidias-ai-servers-and-rtx-50-series-supply-chain-vishay-3-0-526aec187854) Ticker: VSH
1. While the tech industry focuses on Nvidia’s AI server and RTX 50 series supply chain, my latest research reveals Vishay as the hidden key winner.
2. Vishay has secured better-than-expected orders for Nvidia’s Blackwell AI servers (GB200 series and DGX/HGX B200) and RTX 50 series graphics cards. The component orders include MOSFET/DrMOS (VRPower), vPolyTan (Polymer Tantalum, such as T55), Current Shunt Resistors, Transient Voltage Suppressors (TVS), and Schottky Barrier Diodes (SBD), etc.
3. Among the components above, MOSFET and vPolyTan deserve special attention.
4. For GB200 NVL72/36’s 8kW 54V-to-12V power supply, Nvidia has switched to a Renesas (controller) and Vishay (MOSFET) combination, replacing power management modules from Flextronics and Delta. Notably, Vishay has replaced Infineon as the current MOSFET supplier.
5. For DGX/HGX’s 2kW 54V-to-12V power supply, Nvidia now uses ADI (controller) and Vishay (MOSFET), replacing MPS and Delta’s power management modules.
6. The GB200 NVLink Switch’s 2kW 54V-to-12V power supply also uses the Renesas (controller) and Vishay (MOSFET) combination.
7. For Nvidia’s upcoming RTX 50 series graphics cards planned for 2025, Vishay has been selected as a new DrMOS supplier, with mass production starting in 1Q25.
8. Based on points 4, 5, 6, and 7, Vishay’s MOSFET production capacity in 2025 is fully loaded, and it is expected to contribute approximately 20–30% of revenue, with a higher-than-average gross profit margin.
9. GB200 heavily utilizes Vishay’s vPolyTan (polymer tantalum capacitors, primarily T55). Vishay’s vPolyTan is already facing supply shortages for 2025 and is expected to contribute a high single-digit percentage of revenue with significantly above-average gross margins.
10. Vishay 3.0 stands to capitalize on robust demand for Nvidia’s AI servers and the new RTX 50 series, with meaningful results anticipated in 2025. The company is also well-positioned to benefit from the automotive sector’s rebound and the burgeoning IoT market. Looking ahead, Vishay’s growth trajectory appears strong, driven by the ongoing success of its Vishay 3.0 strategy implementation.
Is this the part where hardware catches up to software? How’s the NVIDIA DGX Quantum going?
What are you talking about? Blackwell DGX servers have ARM-based processors. NVDA has not dumped ARM.
Nvidia to partner with SoftBank to build Japan's largest AI Factory, Nvidia DGX 25 EF.
I went full regard boiz. I apologize to nana CALLS: GM, DGX, TMUS, UPS, TXRH & SKX PUT: AAL rest in peace
You don't have a clue if you think this. NVDA's customers are the biggest companies in the world who have little to no debt and lots of cash. AAPL may be able to sell iPhones for 1k, but NVDA can sell their “Blackwell” DGX B200 for $500,000 with 70% margins because no one comes close to offering what they can.
Oh, you think I'm just talking about LLMs? No, not at all. There's plenty of data out there, non-specific to language models, that is still being trained on by multiple different AIs specialized for specific design purposes. That's where the money is. ChatGPT and similar rivals aren't at all exciting in the grand scheme of things when I'm referring to AI. Here are some of the promising AI projects I'm talking about: CorrDiff AI extreme weather modeling. The Nvidia and GE partnership, specifically their sonomet AI ultrasound and CT projects. RadImageGAN generative AI medical imaging, along with DeepTek, for AI-augmented radiology projects. Clara Parabricks genome sequencing AI software. The Nvidia DRIVE solution for autonomous vehicles. Nvidia Omniverse, partnered with Siemens, for AI industrial production applications. There are also some US government contracts/projects using the DGX SuperPOD and working with DARPA. All of these revolutionize the spaces they've been applied to. The amount of potential revenue derived from these projects could be insane going forward, and they're all just getting started.
What is the range in your sensitivity model? In particular, what if DGX sales to the data center drop by 80%? What does the DCF model equate to for this risk?
Ask the AI to create a DCF model for NVDA if DGX sales to datacenter fall by 80%.
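A toy DCF sketch of that question. All numbers below are illustrative placeholders (hypothetical segment cash flows, a 10% discount rate, 3% terminal growth), not NVDA's actual financials; it only shows how an 80% sales shock propagates through present value:

```python
# Toy DCF sensitivity sketch: how a hypothetical 80% drop in one
# revenue segment flows through to present value. All figures are
# illustrative placeholders, not NVDA's actual financials.

def dcf_value(cash_flows, discount_rate=0.10, terminal_growth=0.03):
    """PV of explicit cash flows plus a Gordon-growth terminal value."""
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv += terminal / (1 + discount_rate) ** len(cash_flows)
    return pv

# Base case: five years of hypothetical segment cash flow (in $bn)
base = [60, 66, 72, 79, 87]
stressed = [cf * 0.2 for cf in base]   # DGX/data-center sales down 80%

print(f"base PV:     {dcf_value(base):,.0f}")
print(f"stressed PV: {dcf_value(stressed):,.0f}")
```

Because a DCF is linear in cash flows, an 80% cut to the segment cuts the segment's PV by exactly 80%; the interesting modeling work is in how much of total cash flow you attribute to that segment.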
I'm buying up MU. VRAM is the bottleneck in DGX and AI compute. If they want to test the scalability hypothesis, they need more VRAM to scale.
That's not CUDA and DGX Cloud revenue; that's data center revenue. Software can be a plethora of things that Nvidia offers: models, Omniverse, NIMs, Nvidia DRIVE, custom models... all types of things. Remember, Azure's exact-same-chip offering is $20k when the Nvidia DGX offering (which is what most people go with) is $36,999, and that's only H100s. It will be earning even more in licensing fees on the new GB200 SuperPODs, CUDA, and Spectrum-X.
Ben Thompson on Stratechery wrote (behind a paywall):

>*Nvidia under Huang did everything we hope our greatest companies will do: they had a long-term vision, they innovated relentlessly to find new markets and applications for industry-leading technology, and when a world-changing opportunity presented itself with large language models, they were ready to take advantage of it, for a long-term benefit that may forever be unmeasurable. This is the behavior our government apparently wants to punish?*

>*Now granted, the demand for Nvidia’s GPUs came on so suddenly and in such an overwhelming amount that the company had to pick-and-choose who got allocation; that the company may have preferred companies it was invested in, or who only bought from Nvidia, or who were willing to host DGX cloud is hardly a crime: it’s the reward for building a product that people want. It’s nuts to me that the DOJ is even considering this case.*

>*... Huang and Nvidia made a massive bet on the future, they were right, and because they were right — with no other meaningful change in their strategy — our government wants to investigate them.*

I am 100% with him. The narrative of rich = bad needs to die, and we need to stop demonizing people who are actually heroes! Musk has so often been unfairly scapegoated, which only really escalated when he became the richest man on earth.
Broadcom makes the majority of the optical interconnects used for memory transfer for Nvidia, plus other components in DGX. It's one line item and I'm not sure how much of their revenue is from it. But if revenue is showing sub-5% growth, I think hyperscalers aren't buying much more DGX with each quarter. Sales aren't collapsing, but exponential growth has slowed a lot.
Sify becomes First in India to Achieve NVIDIA DGX-Ready Data Center Certification for Liquid Cooling to Enable Breakthrough AI Performance
Not a bubble. It's way more than just LLMs; that's just the tip of the iceberg. Here are some of the promising AI projects I stay updated on: CorrDiff AI extreme weather modeling. The GE partnership, specifically their sonomet AI ultrasound and CT projects. RadImageGAN generative AI medical imaging, along with DeepTek, for AI-augmented radiology projects. Clara Parabricks genome sequencing AI software. The Nvidia DRIVE solution for autonomous vehicles. Nvidia Omniverse, partnered with Siemens, for AI industrial production applications. There are also some US government contracts/projects using the DGX SuperPOD and working with DARPA. All of these revolutionize the spaces they've been applied to. The amount of potential revenue derived from these projects could be insane going forward, and they're all just getting started. Said this back in 2017 and it still applies: http://stocktwits.com/mikel3113/message/71190466 The scope is broad, and AI is only limited by power consumption at this point. There's an AI arms race going on across the globe, and there's no reason to stop unless constrained by power limits.
*announces random buyback* Stock returns to pre-ER price. This shit is not going anywhere. You know how many rural towns literally depend on a single Dollar General for almost everything they need? A lot. DGX was a good idea too. LONG.
> This article literally ends by saying its not mainstream and lists a bunch of impracticalities.

Yeah. GPUs weren't either for datacenters. They had been around for years with little traction. Until they weren't. Now they are the hot thing in datacenters. In more ways than one. That heat needs to be dispelled. Air cooling doesn't cut it anymore.

> The problem is its monstrously more expensive to install

Again, that's the head start that SMCI has. Read that first article I gave you.

> Why would you bother with any of that when you could just make the building 20% bigger and air cooling is still fine?

Because air cooling isn't fine. Because air cooling isn't enough. Liquid cooling will be pretty much required for Blackwell to reach its potential. Otherwise, people will be running their oh-so-expensive Blackwell GPUs in nerfed mode.

"Five times the performance of the H100, but **you'll need liquid cooling to tame the beast**" https://www.theregister.com/2024/03/18/nvidia_turns_up_the_ai/

"Nvidia CEO admits next gen DGX systems **necessitate liquid cooling** - and the new systems are coming soon" https://www.tomshardware.com/pc-components/gpus/nvidia-ceo-admits-next-gen-dgx-systems-necessitate-liquid-cooling-and-the-new-systems-are-coming-soon

"The GB200 is a key component of the NVIDIA GB200 NVL72, a multi-node, **liquid-cooled**, rack-scale system for the most compute-intensive workloads." https://nvidianews.nvidia.com/news/nvidia-blackwell-platform-arrives-to-power-a-new-era-of-computing

"Nvidia and Dell took the wraps off the upcoming Dell PowerEdge XE9680L server with **liquid cooling** and eight Nvidia Blackwell Tensor Core GPUs." https://www.tomshardware.com/tech-industry/artificial-intelligence/dell-will-have-an-eight-way-nvidia-blackwell-server-with-liquid-cooling-coming-later-this-year
Nvidia's DGX uses Intel Xeon. Blackwell uses nvidia's in-house ARM CPU. But that's not the only way people are deploying Nvidia stuff for AI. Meta for example only indicated they bought $10bn in H100 cards, not Nvidia reference servers. Nvidia doing whole servers is relatively new and it's not entirely clear how successful it'll be long-term vs. companies doing quick deployments of entire racks to "catch up" in the AI scramble.
I thought the fast interconnect DGX servers were still using PowerPC
You're trying to extrapolate from their gaming division to their enterprise division. Enterprise is composed of more than just hardware, and that hardware is now in many different verticals, not just the datacenter. There's also more than just the Mag7, but I will leave you to do that research. 2015 was before the DGX-1 was shipping; the transformer was not developed/released until 2017. There's a reason for their explosive growth.
Bro really compared Intel's AI chips to Nvidia's older DGX models 😂
That's not correct. Yes, GPUs need to be paired with CPUs, but most Nvidia servers don't necessarily include Intel CPUs. For example, the DGX GB200 system has 36 NVIDIA GB200 Superchips, which comprise 36 NVIDIA Grace CPUs based on the Arm architecture and 72 NVIDIA Blackwell GPUs connected via NVIDIA NVLink. The real gap here is that Grace + Hopper GPUs support up to 144TB of RAM for inference/training, while HBM on Intel Xeon supports up to 64GB, which is simply inadequate for larger models. That's the reason why Nvidia has nailed it.
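A quick sanity check on why those memory figures matter for model size: at 2 bytes per parameter (FP16/BF16), capacity roughly caps the largest model you can hold. A rough bound that ignores activations, KV cache, and optimizer state:

```python
# Rough upper bound on model size that fits in memory, using the
# figures from the comment above (144 TB vs 64 GB) and assuming
# 2 bytes per parameter (FP16/BF16). Ignores activations, KV cache,
# and optimizer state, which shrink the real budget considerably.

def max_params_billions(mem_bytes: float, bytes_per_param: float = 2.0) -> float:
    return mem_bytes / bytes_per_param / 1e9

print(f"144 TB: ~{max_params_billions(144e12):,.0f}B parameters")
print(f"64 GB:  ~{max_params_billions(64e9):,.0f}B parameters")
```

The gap is roughly three orders of magnitude, which is the commenter's point about larger models not fitting on the Xeon HBM configuration.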
You think F500 companies are dropping $30M a piece on DGX Superpods and have no idea on how to monetize it? 
NVDA looks to be making the new gen products (DGX) entirely themselves instead of going through server companies like SMCI
No one is cutting out NVDA lol. NVDA pushes their proprietary NVLink, which is deeply embedded in their DGX architecture for GPU clusters. NVDA jumping on board now, when they control maybe 90% of the AI GPU market, would just lower the barrier to entry. They'll probably support it at some point in the future when their customers prefer it.
Nvidia AI in the cloud. DGX do the DAMN thing. People don't realize they are the CLOUD
Opinion welcomed, but I would love to know how many people you know who are not dealing with HPC and know about the P100 or V100. The A100 and H100 were made popular by the media recently, but how many have heard of the ancient T4? How many can differentiate between DGX and HGX? That stuff makes more money for Nvidia than any RTX out there.
Think of NVDA as software, DGX, not just tech hardware. Still undervalued. Big names pouring billions into AI space. I would wait to short as well
lol @redditors who think Nvidia is just a company that makes video game GPUs… lol. The Nvidia DGX platform is what nearly every AI, including OpenAI, is built on, and the Nvidia library is the basis for most newborn AI constructs.
Why isn’t anyone mentioning he missed the boat on a NVIDIA partnership for DGX?
>You not able to comprehend how AI can't decrease the speed by a massive amount is not my problem. No wonder people lose money here.

Well, then it should be easy to tell me how, right? Give me a specific example of where AI can be used in nursing where it will save a bunch of time, and explain why it cannot be done without AI.

>But sure, why would businesses possibly need the use of Nvidia's DGX chips if we already have humans to do it by paying them a fat salary and taking all the time in the world to get tasks done?

They can say "AI" 100 times on an earnings call and double their valuation; that is the use case right there. "Spend a billion on chips to increase business valuation by 100 billion": it's an absolute no-brainer to buy the chips and then tell every investor you have them.
You not able to comprehend how AI can't decrease the speed by a massive amount is not my problem. No wonder people lose money here. If your doubts held any merit, certain sectors wouldn't have started to replace their workforce with AI already. The entertainment/arts industry is already compromised as is. But sure, why would businesses possibly need the use of Nvidia's DGX chips if we already have humans to do it by paying them a fat salary and taking all the time in the world to get tasks done?
Correction: training is extremely latency- and bandwidth-sensitive, because you want all the chips to talk to each other and maintain coherency through the computational phase. This is why companies are willing to drop hundreds of thousands of dollars on massive systems like DGX or HGX with NVLink. While it is true that NVDA can design a competitive inference chip (and they've already done that, with Jetson, AGX, A2, L4, etc.), and some customers' workloads absolutely require significantly more computational power than a typical ASIC offers, GPUs in general aren't really the best for inference, especially considering performance/Wh or performance/$. But again, since the barrier to entry for inference is so much lower, it's far easier for companies to jump in, thus pushing margins low.
meta has a lot of ground to cover and is still distracted with metaverse / vr etc. nvda just gave first DGX H200 to OAI and msft has an obvious way to capture value of ai through azure
Thanks. I went with DGX over LH but both are big in the D2C lab market either directly or as backends to other retail lab services. TDOC, TALK, and GDRX are acting like tech companies and buying market share so I expect losses to continue until the market matures and some of the players either exit or are acquired.
You got it!

| Ticker | P/E (Rank) | P/S (Rank) | P/B (Rank) | P/FCF (Rank) | SHYield (Rank) | EV/EBITDA (Rank) | Overall Score | 6-Mo Momentum |
|---|---|---|---|---|---|---|---|---|
| TDOC | 10000.00 (17.15) | 0.92 (77.88) | 1.02 (82.40) | 7.11 (88.07) | -0.46% (16.21) | 61.58 (36.29) | 318.01 | -26.68% |
| TALK | 10000.00 (17.15) | 3.92 (31.41) | 4.94 (25.95) | 10000.00 (16.40) | -0.46% (16.27) | -21.07 (16.44) | 123.61 | 71.43% |
| DGX | 17.17 (71.30) | 1.54 (62.60) | 2.26 (50.13) | 16.47 (64.52) | 3.67% (70.12) | 10.80 (72.53) | 391.19 | 4.73% |
| GDRX | 10000.00 (17.15) | 3.60 (33.74) | 3.53 (34.54) | 19.66 (59.58) | 3.50% (68.72) | 40.85 (38.56) | 252.30 | 28.38% |
| LH | 44.21 (45.76) | 1.34 (67.11) | 2.19 (51.36) | 19.79 (59.40) | 6.26% (84.77) | 14.05 (61.36) | 369.74 | 2.92% |

DGX is the only one that looks interesting... Hell, only 2 of the 5 are profitable.
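Aside: the "Overall Score" in this screener output appears to be just the sum of the six rank columns. That is my inference from the numbers, not the app's documented formula; a quick check:

```python
# Checking whether "Overall Score" equals the sum of the six
# percentile-rank columns, using the DGX and TDOC rows above.
# (Column order: P/E, P/S, P/B, P/FCF, SHYield, EV/EBITDA ranks.)

def overall_score(ranks):
    return sum(ranks)

dgx = [71.30, 62.60, 50.13, 64.52, 70.12, 72.53]
tdoc = [17.15, 77.88, 82.40, 88.07, 16.21, 36.29]

print(round(overall_score(dgx), 1))   # ≈ 391.2 (post shows 391.19)
print(round(overall_score(tdoc), 1))  # ≈ 318.0 (post shows 318.01)
```

Both match to within rounding, so a higher score just means better percentile ranks across the six valuation metrics.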
Very interesting! Could you please try some healthcare? TDOC TALK DGX GDRX LH
Whoever thought EVs could be an issue in 2024? https://www.forbes.com/sites/peterlyon/2024/03/03/bucking-industry-trend-toyota-chairman-downplays-ev-growth-predictions/?sh=58dad0874621 https://www.digitimes.com/news/a20231120PD209/tesla-model-3-fault-rate-ev.html#:~:text=According%20to%20the%20latest%20report,offering%20testing%20and%20certification%20services And read articles about the "Hertz EV fleet". The same could happen for AI; my guesses:

1) The hype is soon over: now you can run your own LLM (a la ChatGPT) on your cheap laptop, fast and as good as ChatGPT. Same for companies; they can make a model with their own knowledge database for their customer service. I use "LM Studio" on a Lenovo Legion 5. The model mistral-7b-instruct-v0.2.Q4_K_S.gguf is only 4GB!! I am satisfied with the speed and the answers :) Yes, everyone can now have their own "ChatGPT" at home: no limits, cheap laptop, no WiFi needed, NSFW, no big brother watching...

2) I used AI to draw a lot in the beginning, but not anymore; only 1-5 a week.

3) Hardware can be used for at least 5 years. As soon as Microsoft, Facebook, Amazon, Google... have invested millions in DGX GH200, it will take some time before they buy again... And at that time, maybe AMD, Intel, Broadcom, TSM... could be an alternative. But Jensen Huang is a dedicated, clever guy; others need to be good (and cheap) to beat him.
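The "~4GB for a 7B model" point above checks out arithmetically: a Q4 quantization stores roughly 4 bits per weight. A rough sketch, ignoring file metadata and per-block scale overhead (which is why real GGUF files run somewhat larger than the bare estimate):

```python
# Rough memory footprint of a quantized model: params * bits / 8.
# Ignores file metadata and per-block quantization scale overhead,
# so a real ~4 GB mistral-7b Q4 GGUF file is a bit larger than this.

def approx_size_gib(n_params: float, bits_per_weight: float) -> float:
    return n_params * bits_per_weight / 8 / 2**30

print(f"7B @ 4-bit:  ~{approx_size_gib(7e9, 4):.2f} GiB")
print(f"7B @ 16-bit: ~{approx_size_gib(7e9, 16):.2f} GiB")
```

The 4x shrink from 16-bit to 4-bit is exactly what makes laptop-class local inference feasible.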
brain dump of trades i am thinking on: - LEO stocks for Nov: partisan meltdown probabilities increasing, losers mad enough for tear gas? maybe.. - rona redux: TX dairy cattle done caught bird flu; TX has a lot of wild swine & them fucks are quite similar to humans. o shit, next step? - PP at a premium: why tf tutes/accredited investors buy in at +100% over last close? bullish, fire up the radar on 'em. hi I-BIO! - guns: you can profit off tragedy if the score is high enough to get pols/MSM yapping - confirmed+banked. no tots n pears lately - MDMA: 1st to be approved by FDA reckon, recall the leaked FDA letter. check the dang news even Utah is hip to melting walls - PFAS: CDC told docs to test for this shit in your blood, it's fucking everywhere. DGX is CLIA-certified n launched a blood test last mo why fixate on one or two stocks/thesis?? there are thousands of them! the angles are damn-near infinite. goodnight gards. zero emotions in the markets.
I didn’t say they had Nvidia on board, but it wouldn’t surprise me. There are integrated devices like the DGX that Tesla uses. Microsoft Azure AI is Nvidia; they don’t have their own chips. OpenAI is using Nvidia. No matter what you touch, Nvidia is at the core of it.
Not that. The new DGX is a game changer imo
AGX - an SoC designed for robotics. Jetson - this is legit, I work on this. The virtual world is Omniverse - it is hosted in the Azure cloud. AI and Omniverse OVX (digital twin). They are providing a platform for robotics buildings/warehouses, and there will be some autonomous systems. This is wild: with the NIMs and DGX Cloud you can virtualize a world that you can train your robots and hardware in. It's how you'll test the hardware running their autonomous robotic stack. You will have to have the digital twins.
AWS and Google is signing up for Blackwell. Oracle is gearing up for Blackwell. Microsoft Azure is gearing up for Blackwell. DGX Cloud is in Azure - The entire industry is gearing up for Blackwell.
2.5x FP8 for training over G Hopper. FP6/FP4: 40 PFLOPS, 5x. Important for inference.

Inference is generation. Old computing is retrieval of data; this is generating tokens. The vast majority of content will not be retrieved but rather generated. This is FP4, and Blackwell is 5x the inference capability.

Seems like enough, but why stop there? It's NOT enough. We NEED A BIGGER GPU. J Huang is cooking. Blackwell is a BEAST. We still NEED MORE. Need another chip: the NVLink Switch chip. 50B transistors. It doesn't even make sense. Build a system to talk to every GPU and chip at the same time, connect over a coherent link: 1 giant GPU.

DGX GB200 NVL72: 1 giant GPU. Training FP8: 720 PFLOPS, 22x. WTF. Inference FP4: 45x. WTF LOL. World's first 1 exaflop machine in 1 rack. Only 2 or 3 on the planet. The world's entire internet in these wires. And it's liquid cooled.
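Dividing the rack-level figure in the post by the GPU count gives a rough per-GPU number. This is just arithmetic on the post's own figures, not an official spec sheet:

```python
# Per-GPU arithmetic from the rack-level figure above:
# 720 PFLOPS FP8 training across an NVL72 rack of 72 GPUs.
rack_fp8_pflops = 720
gpus_per_rack = 72

per_gpu_pflops = rack_fp8_pflops / gpus_per_rack
print(f"~{per_gpu_pflops:.0f} PFLOPS FP8 per GPU")
```

The point of the NVLink Switch is that the rack behaves like those 72 GPUs pooled into one coherent device, so the rack-level number is the one that matters for training.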
GOING IN ON BITDEER! [https://www.globenewswire.com/news-release/2024/03/18/2847918/0/en/Bitdeer-Announces-Completion-and-Successful-Validation-of-NVIDIA-DGX-SuperPOD-H100-System.html](https://www.globenewswire.com/news-release/2024/03/18/2847918/0/en/Bitdeer-Announces-Completion-and-Successful-Validation-of-NVIDIA-DGX-SuperPOD-H100-System.html)
Let me help you out. There may be a secret announcement (an "and one more thing"). Could be the B200 or something else. There may be a foundational model release from OpenAI: GPT-5. Microsoft, for some odd reason, is having an announcement too, which is going to be the Surface PC refresh, and ARM is going to take center stage. Lastly, look for the consumer 5080/90 series to be released too. Oh, and one more thing: look for Nvidia to blow the fucking doors open with a crazy cloud DGX accelerated compute offering, putting the big cloud providers on notice. Stock going up
You're right, I'm talking out my ass a little. [Here is the immense list where you can see all the third party providers](https://www.nvidia.com/en-us/about-nvidia/partners/partner-locator/?competency=DGX%20AI%20Compute%20Systems%2CDGX%20Cloud%2CNVIDIA%20AI%2CNVIDIA%20Omniverse&level=Elite%2CPreferred%2CPartner&page=1&type=Data%20Center%20Provider%2CCloud%20Partner) My main point is NVDA sells because they met customers where they are, which generates the demand that influences DC buildouts, because idle chips are wasted space.