Reddit Posts
Download dataset of stock prices X tickers for yesterday?
Tech market brings important development opportunities; AIGC is firmly No. 1 in the current technology field
AIGC market brings important development opportunities, artificial intelligence technology has been developing
Avricore Health - AVCR.V making waves in Pharmacy Point of Care Testing! CEO interview this evening as well.
OTC: KWIK Shareholder Letter January 3, 2024
The commercialization of multimodal models is emerging; Gemini now appears to exceed ChatGPT
Why Microsoft's gross margins are going brrr (up 1.89% QoQ).
Google's AI project "Gemini" shipped, and so far it looks better than GPT4
US Broker Recommendation with a market that allows both longs/shorts
A Little DD on FobiAI, which harnesses the power of AI and data intelligence, enabling businesses to digitally transform
Best API for grabbing historical financial statement data to compare across companies.
Seeking Free Advance/Decline, NH/NL Data - Python API?
Delving Deeper into Benzinga Pro: Does the Subscription Include Full API Access?
Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration
Aduro Clean Technologies Inc. Research Update
Option Chain REST APIs w/ Greeks and Beta Weighting
$VERS Upcoming Webinar: Introduction and Demonstration of Genius
Are there pre-built bull/bear systems for 5-10m period QQQ / SPY day trades?
Short Squeeze is Reopened. Play Nice.
Created options trading bot with Interactive Brokers API
Leafly Announces New API for Order Integration($LFLY)
Is Unity going to Zero? - Why they just killed their business model.
Looking for affordable API to fetch specific historical stock market data
Where do sites like Unusual Whales scrape their data from?
Twilio Q2 2023: A Mixed Bag with Strong Revenue Growth Amid Stock Price Challenges
[DIY Filing Alerts] Part 3 of 3: Building the Script and Automating Your Alerts
This prized $PGY doesn't need lipstick (an amalgamation of the DD's)
API or Dataset that shows intraday price movement for Options Bid/Ask
[Newbie] Bought Microsoft shares at 250 mainly as see value in ChatGPT. I think I'll hold for at least +6 months but I'd like your thoughts.
Crude Oil Soars Near YTD Highs On Largest Single-Week Crude Inventory Crash In Years
I found this trading tool that's just scraping all of our comments and running them through ChatGPT to get our sentiment on different stocks. Isn't this a violation of Reddit's new API rules?
I’m Building a Free Fundamental Stock Data API You Can Use for Projects and Analysis
Fundamental Stock Data for Your Projects and Analysis
Meta, Microsoft and Amazon team up on maps project to crack Apple-Google duopoly
Pictures say it all. Robinhood is shady AF.
URGENT - Audit Your Transactions: Broker Alters Orders without Permission
My AI momentum trading journey just started. Dumping $3k into an automated trading strategy guided by ChatGPT. Am I gonna make it
The AI trading journey begins. Throwing $3k into automated trading strategies. Will I eat a bag of dicks? Roast me if you must
I made a free & unique spreadsheet that removes stock prices to help you invest like Warren Buffett (V2)
To recalculate historical options data from CBOE and find IVs at the moment of trades, what interest rate should I use?
WiMi Hologram Cloud Proposes A New Lightweight Decentralized Application Technical Solution Based on IPFS
$SSTK Shutterstock - OpenAI ChatGPT partnership - Images, Photos, & Videos
Is there really no better way to track open + closed positions without multiple apps?
List of Platforms (Not Brokers) for advanced option trading
Utopia P2P is a great application that needs NO KYC to safeguard your data !
Utopia P2P supports API access and ChatGPT
Stepping Ahead with the Future of Digital Assets
An Unexpected Ally in the Crypto Battlefield
Utopia P2P has now an airdrop for all Utopians
Microsoft’s stock hits record after executives predict $10 billion in annual A.I. revenue
Reddit IPO - A Critical Examination of Reddit's Business Model and User Approach
Reddit stands by controversial API changes as situation worsens
Mentions
For SaaS, companies usually integrate via API rather than the UI, and API usage is priced by tokens. Skill-specific tools like Photoshop are definitely on the way to being replaced or absorbed. The risk for ServiceNow and the like is not being replaced outright but being folded into a comprehensive tool/platform that handles the whole workflow end to end. In the first stage, SaaS vendors that don't integrate AI will fall behind. Those that do will likely cut prices first, and then a few names will disappear, not because they were replaced but because the business dried up. So for the stock, the real question is who lasts longer and ends up stronger.
Unless the saas you’re using is a web API that tells you whether a number is odd, it’s absolutely impractical. You’re not paying for a few lines of code, you’re paying for proven solutions, for access to marketplaces with other companies, for compliance and security, for not having to maintain a product in an area where you have no domain knowledge and don’t even understand the complexity under the hood, for paying only a fraction of what your own infra might cost and for someone to yell at over the phone when shit breaks.
I wrote this myself; if you don't believe me, that's on you. Just as a defense of my case, look into the links about suppliers. That data is not callable by API; instead you can download an Excel file, from which I then roughly calculated the % of regional exposure to SEA.
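The rough exposure math the commenter describes (download the supplier list, then tally the SEA share of spend) can be sketched like this. The rows and numbers below are hypothetical stand-ins; the commenter's data came from the downloaded Excel file (e.g. via `pandas.read_excel`) rather than a literal:

```python
# Hypothetical supplier rows standing in for the downloaded Excel sheet.
suppliers = [
    {"supplier": "A", "region": "SEA", "spend": 40.0},
    {"supplier": "B", "region": "SEA", "spend": 20.0},
    {"supplier": "C", "region": "EU",  "spend": 25.0},
    {"supplier": "D", "region": "US",  "spend": 15.0},
]

# Share of total supplier spend exposed to Southeast Asia (SEA).
total = sum(s["spend"] for s in suppliers)
sea = sum(s["spend"] for s in suppliers if s["region"] == "SEA")
print(f"SEA exposure: {sea / total:.0%}")  # prints "SEA exposure: 60%"
```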
> Gemini isn't being used as some API call. They're literally using it as a base model on their own hardware.

> More importantly, it's who didn't get the agreement: their competitors.

OpenAI was their first choice, but OpenAI realized that giving away their model for $1 billion is not strategic long term. As a consumer, why would I pay for Gemini when I get it for free on my iPhone? In 5 years, Apple's foundational model will likely be an open-weight model that they don't have to pay license fees for.
I am going to give you a historical reference for why "the big fat panda" will lead in the end. In the 90s, when the Apple / Microsoft / IBM fights happened, IBM stepped out and went in another direction. Apple was forced to bring Steve back and to look at newly designed products, because the slow lumbering ship called Microsoft was turning and refocusing. Microsoft then went on to dominate the software wars that followed. On the same note: Sears, because it lost its focus in the 70s and 80s, fell to earth and is no longer the Sears I grew up with in the 70s. Sears has been replaced by Amazon, eBay, and Alibaba. The only two with slow-moving ships are Google and Microsoft, and they are the biggest embedded-system platforms; they can easily modify their APIs to throttle third parties, like what happened to OpenClaw over the weekend. I used Gemini for this:

## Rankings by Consumer Usage (MAU & Web Traffic)

The top AI tools are generally ranked as follows, based on recent trends and reported site visits:

1. ChatGPT (OpenAI): The market leader with the highest global monthly visits and total user base.
2. Gemini (Google): Rapidly gaining users due to its deep integration with Google Search and the Android ecosystem.
3. DeepSeek: A major breakout in 2025–2026, reaching the #1 spot on the US App Store and gaining massive traction for its high-efficiency reasoning models.
4. Claude (Anthropic): Highly popular among technical users for coding and long-form reasoning.
5. Microsoft Copilot: Extensively used within enterprise environments due to its Microsoft 365 integration.
6. Perplexity AI: Leads the "AI Search" category with steady growth in monthly active users.
They tried. Oh, how they tried. But they failed so hard their head of the program is 'retiring' this year. Gemini isn't being used as some API call. They're literally using it as a base model on their own hardware. I'm not sure you understand how expensive and difficult it is to spin up a viable model from scratch. More importantly, it's who _didn't_ get the agreement: their competitors.
Natively, no, Claude doesn't produce image and video. But you can set it up to use external tools and services and act as an orchestrator. So in theory you could have Claude (1) write the script and dialog, (2) write the prompts for the AI video program (Veo 3, Seedance…), (3) write the prompts for the audio (ElevenLabs…). Then the agent directly prompts the other AI tools using an API, or a native connector if there is one. Outputs could be saved in a folder. I'm not sure about the video editing to assemble everything together. Last I checked, people still needed to edit manually, but maybe there's an AI video-editing tool now. …for a split second I considered trying the workflow myself, but I really need to stop playing with AI, I have stuff to do.
IBKR shows PnL, but it's super clunky, and I hate the UI. Plus it doesn't strip out the extrinsic value remaining in each position. I like to know what remains so I know whether I should roll or cash it in. So I got the AI to make me an HTML file I can open in the browser; it connects to the broker's API via a Python script I double-click, which opens and connects via Windows Terminal, and the HTML tool is connected. It updates all my positions, I can put rolled profit in there and continue an ongoing combo, and see exactly what it has generated, plus the extrinsic value remaining in everything. Literally took 10 minutes to make.
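The "extrinsic value remaining" figure such a tool surfaces reduces to a small calculation once the broker API supplies the spot price and the option's market price. A minimal sketch with hypothetical numbers:

```python
def extrinsic_value(opt_price: float, spot: float, strike: float, is_call: bool) -> float:
    """Extrinsic (time) value = option market price minus intrinsic value."""
    intrinsic = max(spot - strike, 0.0) if is_call else max(strike - spot, 0.0)
    return opt_price - intrinsic

# An OTM call (strike 105, stock at 100) quoted at 1.80: all premium is extrinsic.
print(extrinsic_value(1.80, 100.0, 105.0, is_call=True))  # 1.8
# An ITM call (price 7.50, spot 100, strike 95): intrinsic 5, so 2.50 extrinsic.
print(extrinsic_value(7.50, 100.0, 95.0, is_call=True))   # 2.5
```

When little extrinsic value remains, there is little left to collect by holding, which is exactly the roll-or-close signal the commenter is after.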
Yes you can upload the PDFs to Intuit and it can even API link to your bank accounts through Plaid.
Claude Code is just the agentic component; it's still making online inferences via the Anthropic API.
Your DD belongs in the dumpster behind Wendy's with the highly regarded. When the AI bubble comes apart in the next five years, I expect you'll see a rebound in tech companies that have seen a valuation pinch. QuickBooks is accelerating and is likely to end up as an even bigger slice of INTU's revenue in the next five years. I'm seeing a massive number of customers moving to QBO (which is a much better model from a revenue-generation perspective), and there is a massive fucking pile of small businesses using shitty legacy desktop software to run their business. I expect growth in the SaaS accounting space, and systems like QBO that have worked heavily on their API integrations with other platforms will be the bigger dog in the fight, because business people are quickly becoming lazy fucks and integration feels good. I expect INTU's fall will slow into a turnaround. It's not going to bounce, but it's certainly not dead-cat territory.
"Hey claude, can you show me <OP's> tax reports, 2026?" *"Why sure, here's the latest 1040, oh and realize I did notice some sensitive data fields, like SSN, address, and home address, please use with caution and possibly have these values changed as soon as possible".* I accidentally had a SaaS API key in a github code repo the other day and claude code immediately said to change it once I had some code refactored (changed it). Welcome to AI.
Normally I'd agree with you, but this is different. This isn't a consumer product. From an organizational perspective, it's much harder to budget for unpredictable API token usage than it is to budget for a monthly subscription. A lot of dev teams have been forced to switch due to budgetary constraints.
First of all, Claude Code subscriptions are for Claude Code only. They've made it very clear; this is just them enforcing their rules. Second, they're completely out of compute to serve customers. They're insanely compute-starved. They can't serve OpenClaw customers on Claude Code subscriptions, but API pricing is more economical for them. Lastly, they're building their own OpenClaw inside the Claude app.
Do you think the valuation of a potential Anthropic IPO will be hampered by how they've limited access to OpenCLAW? In the past week or so, they switched to allowing only API tokens, rather than either of their subscription-based services. To my knowledge, OpenAI continues to allow its subscription service to run inside the OpenCLAW platform.
I'm just wondering what information you can provide that isn't available through popular financial APIs (Yahoo and others). Parsing original earnings reports could be quite expensive in terms of LLM tokens—why not use available API data and ask AI to correlate multiple sources to make sense of it?
We used to have WSBSynth but the API thing kinda shut down a lot of reddit alt apps and sites. I miss WSB Synth, it used to read the daily chat live 😭😔
Nice stuff. How good is AI these days? Today, with some spare time, I built an API plug-in to track my trades from the IBKR platform, mainly so it can show me the extrinsic value remaining in each position. But it even fetches live prices and can give me a rough PnL when I am not connected, and updates with live Greeks and PnL when I connect. What a world we live in now. Good stuff, sir.
Meta released a model called Muse Spark.

- It's not open source
- It's not even generally available in an API; apparently it's in something called Meta AI

All benchmarks show it's not as good as Gemini, Opus, or GPT 5.4. On the plus side, its stock went up.

Calls
At this point in 2026, I review posts like this here on Reddit, TradingView, YouTube, etc., and use Claude / OpenAI / Perplexity / Gemini to test and validate these strategies. There are many momentum studies to research, and my only restriction is that my decision point is end of day and on weekends, so the entry is always at market open after the first 30 minutes. Exit is with a stop loss or a trailing stop loss.

Do it yourself. Yahoo Finance's free API gives you price history, and don't forget to include dividends. Consider Massive's API services. Don't be afraid: the AIs take care of everything and will produce the necessary files in CSV so you can use Excel to analyze. Think of this as a simulation of multivariate time series with SPY as the reference series, and instruct the AI that this is a simulation and you need probability distributions. Then it is up to you to review and decide.

I don't do backtesting or anything like that. For me those are necessary for hands-on trading, and I have a day job; this probabilistic trading, even when there are losses (see Trump), suits my style of thinking. Not paying for any services except for AI.
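The trailing-stop exit mentioned above can be sketched in a few lines. The price series and the 5% trail below are hypothetical:

```python
def trailing_stop_exit(prices, trail_pct):
    """Return (index, price) where a trailing stop fires, or None.

    The stop trails the highest price seen so far by trail_pct."""
    peak = prices[0]
    for i, p in enumerate(prices):
        peak = max(peak, p)          # ratchet the peak up, never down
        if p <= peak * (1.0 - trail_pct):
            return i, p              # price gave back trail_pct from the peak
    return None                      # stop never triggered

# Hypothetical daily closes: rallies to 110, then gives back more than 5%.
closes = [100, 104, 110, 106, 104.0]
print(trailing_stop_exit(closes, 0.05))  # (4, 104.0), since 104.0 <= 110 * 0.95
```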
So here's the thing. I've offered to share it with people before; they DM me, then they ghost. If I just gave someone my sheet, it would blow their mind. There's a lot of tuning it would require for your local folders, your API hashes (if you have an API), and most importantly you need to know how to use it. There's no instruction manual. I would probably have to go through every piece and explain "This means X. Most of the time. But sometimes I do Y. And this column of formulas, there's no way it's right, but I quit using it months ago and haven't tied it off." What I would suggest is starting your own spreadsheet. Use ChatGPT or Claude for the VBA coding. Every time you encounter something you need to know, add a helper column for it and create a formula. Upload your sheet and ask Claude "how can I improve this?" You will develop something that works for you.
The “product” in question is literally just an overglorified API, btw
API tests you kind of do. They are pretty black and white: they work or they don't, especially since all requests come with status codes, making it pretty straightforward to tell whether things are working as you expect. Especially when building out a load test, you're basically just making calls and then using thread groups to control the throughput of the requests. Usually you include things like contract testing and status codes.
Yeah, auth is like the worst part of doing API stuff. It's really annoying since there isn't a stack trace or error; it either works or it doesn't. Building out API tests and load tests isn't the worst thing, but it's kind of tough when you don't know what you're doing. I still think the coding aspect of these LLM tools is really great. At least for me, it helps speed up tasks.
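Status codes being "black and white" is why API and load-test reports mostly reduce to bucketing responses by class. A minimal sketch over hypothetical results:

```python
from collections import Counter

def summarize(statuses):
    """Bucket raw HTTP status codes by class, the way a load-test report does."""
    return dict(Counter(f"{s // 100}xx" for s in statuses))

# Hypothetical results from a burst of requests: mostly fine,
# one auth failure, one server error.
print(summarize([200, 200, 201, 401, 500, 200]))  # {'2xx': 4, '4xx': 1, '5xx': 1}
```

Contract tests then just assert that the 4xx/5xx buckets stay empty (or contain only the deliberately negative cases).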
Their revenue is from subscriptions and API usage, not from Microsoft. Investor funding is not counted as revenue. You clearly don't understand how accounting works.
But still, why is a stablecoin superior to any other API provided by a payment processor?
Thanks for the link. I read it. Solid write-up, and I think you might want to read it again, because it actually makes my point. Let's walk through what Alderson is saying:

1. There's a ~10x markup between actual compute cost and API price. Opus 4.6 is $5/$25 per million tokens at the API. Comparable open-source models run at $0.39/$2.34. Alderson uses this to argue Anthropic isn't losing $5k per user; the real compute cost is closer to $500. Fine. But that 10x gap is exactly where the subsidy lives. API customers are paying retail. Subscription customers are not. The API is the profit center funding below-cost subscription plans.

2. His own numbers show the $20 plan roughly breaks even on compute. He says the average user burns ~$6/day in API-equivalent spend, which at 10% true compute cost is ~$18/month, against a $20 subscription. That's not 50%+ gross margin on the subscription product. That's razor-thin at best, before you account for infrastructure, support, and the overhead of serving millions of concurrent users. And that's the average user. The moment someone uses it moderately above average, that unit is underwater.

3. He confirms the gym-membership model. 90% of users consume under $12/day; fewer than 5% hit weekly caps. The whole economic argument depends on most people not using what they're paying for. That is textbook subsidization: light users subsidize heavy users, and API margins subsidize the subscription product as a whole. Gyms work the same way; Planet Fitness would collapse if everyone showed up. In the SaaS world, we call that batch of users the "sleeping dogs" and let them lie. We build our infrastructure with them in mind, knowing it would crumble under 100% usage.

4. He says the losses come from R&D and infrastructure, not token serving. That doesn't help your case. It means the subscription price doesn't reflect the full cost of delivering the product; it only covers marginal inference.
The R&D, training runs, and infrastructure that make the product worth $20/month are being funded by venture capital and API revenue. That's a subsidy by any definition. So yes, Anthropic probably isn't hemorrhaging $5k per user. Alderson is right about that. But "it doesn't cost $5k" and "it isn't subsidized" are two completely different claims. The subscription plans are priced to acquire users, not to generate standalone profit. That's the strategy, and Alderson's own math confirms it.
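The break-even arithmetic in the subsidy argument, spelled out with the figures quoted from Alderson (~$6/day in API-equivalent spend, ~10% true compute cost, a $20 plan):

```python
# Figures from the discussion above; the 30-day month is an assumption.
api_equiv_per_day = 6.00    # average user's API-equivalent spend per day
true_cost_ratio = 0.10      # true compute is ~10% of API list price
days_per_month = 30
subscription = 20.00

true_monthly_cost = api_equiv_per_day * days_per_month * true_cost_ratio
print(f"compute cost ~${true_monthly_cost:.0f}/mo vs ${subscription:.0f} plan")
# compute cost ~$18/mo vs $20 plan: roughly break-even before overhead.
```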
We call that SaaS in tech, I think. Here's a great post on the math: https://www.reddit.com/r/ClaudeAI/s/oRDCQLRsNV

You need to remember that Claude is looking at each implementation of the product as an incidental business unit, effectively a business within the business. Each one has its own P&L, and, of course, at the end of the day you want all of that to add up to a positive number. Businesses operate units at a loss regularly, subsidized by other units and providing some non-monetary value that makes it worth it. YouTube is a great example. I haven't looked recently, but it was a net loser as a unit for years at Google, and may still be. Turns out warehousing near-infinite hours of video is expensive. But it kept users in their ecosystem and allowed them to serve improved ads across the rest.

OpenAI and Claude are subsidizing the subscription plans with API spend as a customer-acquisition strategy. Some percentage of those subscribers, like me, will spend against the API regularly, hopefully increasing LTV. It's a marketing cost, and they probably have precise CAC numbers that are supremely dialed in. The API cost is the true cost of the product at max delivery: no variable compute like on subscription, always delivers consistently, and is not throttled during retraining or peak-use periods.

If you still don't believe me, I think you should still have access to free API tokens on your usage dash. Grab those and see how long they last you vs your $20 plan. Again, I think you might have an enlightening experience. Then, if you're interested in learning how to stretch your tokens further and get even more value out of each, I'd be happy to share some tips. There is quite a learning curve there. Let me know.
Bro, I'm not asking it to show me how to pleasure my GF... This took fking hours. Literally 9 hours. So many missed critical details.

**100TB GDrive Backend**

# I. Firmware & OS Hardening (Headless Optimization)

* **UEFI/BIOS Configuration**:
  * Set `AC Recovery` to `Power On`.
  * Set `POST Behavior` to `Continue on Warnings and Errors`.
  * Disable `SupportAssist OS Recovery` and `BIOSConnect` (UEFI Network Stack dependency).
  * Clear `BIOS Event Logs` to reset pending diagnostic flags.
* **OS Level (Windows 11)**:
  * Disable `Fast Startup` via Power Options (Registry: `HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Power\HiberbootEnabled = 0`).
  * Set the power schema to `High Performance`; disable `PCI Express Link State Power Management`.
  * Configure `Automatic Login` via `netplwiz` for persistent session restoration after power cycling.

# II. Storage Abstraction Layer (Rclone VFS)

* **API Ingress**:
  * Provision a dedicated **OAuth 2.0 Client ID and Secret** via the Google Cloud Console.
  * Enable the `Drive API` and bypass the default `rclone` shared client IDs to eliminate 403 rate-limit errors.
* **VFS Mount Parameters**:
  * **Mount command**: `rclone mount remote: G: --vfs-cache-mode full --vfs-cache-max-age 24h --buffer-size 256M --dir-cache-time 1000h --attr-timeout 1000h`
  * **Logic**: Map `G:` as a virtual filesystem with `--vfs-read-chunk-size 128M` for sequential 4K stream buffering.
* **Persistence**:
  * Implement a `.vbs` wrapper in `shell:startup` to execute the mount in a suppressed background process (WindowStyle 0).

# III. Containerization & Orchestration (Docker)

* **Engine**: Docker Desktop (WSL2 backend).
* **Stack (Docker Compose YAML)**:
  * **Plex**: Enable `/dev/dri` passthrough for **Intel QuickSync (iGPU)** hardware transcoding.
  * **Automation (Radarr/Sonarr)**: Configure binary path mapping (`/data/media`) to ensure **atomic moves** (instant imports) across the VFS mount.
  * **Indexer Logic (Prowlarr)**: Centralize API keys for all Usenet/torrent indexers.
  * **Database Management**: Force all SQLite databases (`AppDir`) onto the local **NVMe SSD** to prevent the latency-induced database locks frequent with cloud-based storage.

# IV. Network Ingress & Automation

* **External Access**: Deploy `cloudflared` (Cloudflare Tunnel) to route **Overseerr** (request management) through a secure HTTPS tunnel, bypassing CGNAT and local firewall port-forwarding.
* **Library Scanning**: Disable Plex 'Auto-Scan' on the VFS mount; implement a **WebSocket-based Plex Autoscan** script to trigger targeted library updates via filesystem change notifications.
And now they require you to use their expensive API for any third party products like Openclaw, Cursor, etc. I don't see myself going back to Claude. GPT 5.4 just performs better and for way cheaper.
OpenAI's models perform better at reasoning, coding, and search in my experience. The big reason Anthropic was successful was that they used to let you use a subscription with third-party products like Cursor and OpenClaw. Anthropic pulled the plug on this and is making people pay via the API now.
Most of Anthropic's run-rate growth is from raising pricing by requiring most customers to go through the more expensive API rather than subscription plans.
1. Make tutorial on how to setup openClaw 2. Tell people to put their API keys in config files 3. Watch as chaos ensues
They are literally repricing all these plans right now. Go have a look at the Codex and Claude subs. These plans will be profitable; it's just a matter of time. Enterprise AI is literally happening as we speak. Enterprises pay API prices for Claude and Codex, which is much more expensive and makes those companies a lot. All of the massive law firms I work with are using Legora, Harvey, OpenAI enterprise, Claude, etc. I was talking to a dev at Goldman the other day; they have a huge contract with Anthropic to provide Claude fine-tuned to their internal custom language. It's literally everywhere.
Imagine if the reason private credit collapsed was that rich people lost money on their friends' unprofitable companies, run by spoiled brats who vibe coded and wired everything to the ChatGPT API.
You were risking 4.8M to make 24k. I wouldn't. I made 1M+ in 0DTE last year (risking 200-500k daily) and gave it back in 4 days of bad trading. I would make 20k/30k many days until the 1000+ point NDX drop hit me hard. You have wider bands, but you can't survive a 2000-point drop.

I am now trading "smarter": taking fewer risks and capping risk, and making good returns selling fewer 0.05-0.1 spreads vs a larger number of 0.03-0.05 spreads. I now have tools to monitor how the spreads are trading every minute so I can make adjustments in a calmer market environment (and a calmer trader environment :)). E.g., delta moving above .25 will make me monitor closely. I moved to risking 80-100k daily; it usually stays below 60-70k. Aiming for smaller gains, 2-4k a day. I use .1 to .05 delta for my short spreads. SPX and RUT always 5 points wide, NDX always 10 points wide.

A spread far OTM will count as lower risk vs a closer spread, and this is reflected as blended risk. The blended risk has a floor to reflect black swan events. I use 4 metrics: peak risk, avg risk, peak blended risk, avg blended risk. I used LLMs to come up with a more detailed formula for calculating blended risk. This is tracked every minute during the trading day, via the Schwab API to get real-time data: positions, orders, quotes, Greeks, etc.

Apr 2, blended risk moved very close to max risk and I decided to take a loss. Your spreads were further OTM and were fully safe; my blended-risk formula would have rated your positions at ~.2 of max risk, about 500k-1M in risk. I can't stomach that type of risk any more. If something unthinkable happens in the war during trading hours (Qatar's LNG plants fully destroyed, major Saudi fields hit, desal plants hit, sinking of a few LNG tankers, ...), NDX will drop 5-10% instantly. You have to be prepared for it (or not, based on your risk tolerance).
Here is my return for today for one of my accounts: 25k max risk, avg risk of ~21k, avg blended risk of 15k, and made 2.5k (5k is in SPX 1DTE IC). One tenth of your Apr 2 return at vastly lower risk. And I was in the market for 2-3 hours (SPX lasted longer as I did not close some very far OTM spreads and let them expire).

INDEX_RISK_SUMMARY
underlying,avg_risk,avg_blended_risk,peak_risk,peak_blended_risk,minutes_open
NDX,10000.00,6233.96,10000.00,8939.57,128
SPX,5847.27,5157.59,10000.00,9307.63,275
RUT,5000.00,3480.44,5000.00,4624.78,177

DAILY NET PROFIT (CASH FLOW):
- SPXW: $1,659.16
- RUTW: $529.04
- NDXP: $341.04
TOTAL NET PROFIT: $2,529.24

DATA FILE 1: TRANSACTION LOG
Columns: Time, Symbol, Type, Short Strike, Long Strike, Qty, Effect, Net Premium
KEY: OPENING positive = credit received; CLOSING negative = debit paid

Time,Symbol,Type,Short,Long,Qty,Effect,Net
06:33:34,SPXW,P,6505.0,6500.0,10.0,OPENING,296.56
06:33:34,SPXW,C,6665.0,6670.0,10.0,OPENING,326.56
06:33:44,SPXW,P,6505.0,6500.0,10.0,OPENING,296.56
06:33:44,SPXW,C,6665.0,6670.0,10.0,OPENING,376.56
06:36:48,RUTW,C,2575.0,2580.0,10.0,OPENING,264.76
06:36:48,RUTW,P,2490.0,2485.0,10.0,OPENING,424.76
07:03:51,NDXP,C,24500.0,24510.0,10.0,OPENING,297.76
07:03:51,NDXP,P,23700.0,23690.0,10.0,OPENING,197.76
08:19:22,SPXW,C,6665.0,6670.0,10.0,CLOSING,-111.64
08:19:22,SPXW,P,6505.0,6500.0,10.0,CLOSING,-111.64
08:32:20,SPXW,C,6665.0,6670.0,10.0,CLOSING,-61.64
08:32:20,SPXW,P,6505.0,6500.0,10.0,CLOSING,-111.64
09:11:32,NDXP,C,24500.0,24510.0,10.0,CLOSING,-62.24
09:11:32,NDXP,P,23700.0,23690.0,10.0,CLOSING,-92.24
09:33:58,RUTW,C,2575.0,2580.0,10.0,CLOSING,-85.24
09:33:58,RUTW,P,2490.0,2485.0,10.0,CLOSING,-75.24
10:24:52,SPXW,C,6630.0,6635.0,10.0,OPENING,297.46
10:24:52,SPXW,P,6525.0,6520.0,10.0,OPENING,78.36
10:41:11,SPXW,C,6630.0,6635.0,10.0,CLOSING,-159.64
10:44:59,SPXW,P,6525.0,6520.0,1.0,CLOSING,-5.96
11:03:56,SPXW,P,6525.0,6520.0,4.0,CLOSING,-23.86
12:59:07,SPXW,P,6480.0,6475.0,10.0,OPENING,206.56
12:59:07,SPXW,C,6695.0,6700.0,10.0,OPENING,366.56
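The commenter's actual LLM-derived blended-risk formula isn't shown, so the following is a purely hypothetical illustration of the general idea: scale max risk by how close the short strike sits to the money, with a floor so black-swan exposure never rates as zero. Every parameter here is an assumption, not the commenter's method:

```python
def blended_risk(max_risk: float, short_delta: float, floor: float = 0.10) -> float:
    """Illustrative only: weight max risk by |short delta| relative to
    at-the-money (|delta| ~ 0.50), clamped to [floor, 1.0]."""
    weight = min(max(abs(short_delta) / 0.50, floor), 1.0)
    return max_risk * weight

# A far-OTM 0.05-delta spread on 100k max risk rates at the 10% floor...
print(f"{blended_risk(100_000, 0.05):,.0f}")  # 10,000
# ...but rates 4x higher once delta drifts to 0.20.
print(f"{blended_risk(100_000, 0.20):,.0f}")  # 40,000
```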
Bluesky's API is also very friendly to real-time scraping with bots, something prohibitively expensive with X/Twitter
Banks file enormous, heavily lawyered 10-Ks covering every conceivable risk across every global market they touch. Citigroup operates in 160 countries. Of course they mention “geopolitical” 40 times, their legal team has to. A small pharma with one China API supplier that mentions “tariffs” twice might actually be more operationally exposed than Citi. Raw mention volume is a measure of filing length & legal conservatism, not actual vulnerability. Progressive is an auto insurer. Netflix is a streaming service. Of course they don’t mention the Iran war. It’s not in their risk universe. That’s not resilience, it’s irrelevance. And I’d argue that Costco’s supply chain IS at risk.
The AI labs operate very differently. Anthropic is bringing a lot of money in enterprise subscriptions and API calls. xAI no idea why would anyone pay for it at all.
My startup idea is an egg that connects to the Lovense API and your Reddit account to activate whenever there is a new comment in the daily WSB thread.
Reddit replaced the mods of top subreddits a few months after Netanyahu was reelected. https://en.wikipedia.org/wiki/Reddit_API_controversy https://en.wikipedia.org/wiki/Steve_Huffman#Personal_life
Your broker's API is often the best
Any oil companies that can supply replacement oil of the same API gravity as what's no longer on the market will do well out of this war and the damage to ME infrastructure
what futures broker has the best API? Please don't lie to me.
Great job! Thanks for sharing this. When you say you use automated entries for your SPX 0DTE trades, are you using a platform like Option Alpha / bots, or did you build your own automation through a broker API? I’m especially curious how you handle the entry logic and risk management when you can’t monitor the open because of your day job. Thank you.
Know any reasonably priced API's to pull live premarket data from?
Agreed, hard to test it. I ended up paying for a solution and it’s been worth the money. I tried making a homegrown solution with the Massive API and quickly learned it’s just best I pay for tools that are already built and have support.
There is no single switch you can flip in SEC filings to bulk-export "retention metrics," because the SEC does not mandate a standard DAU or NRR metric; companies disclose them in different sections, in different formats, and even with changing definitions over time. In practice, you either combine a bit of manual work with a "brain-friendly" workflow, or lean on tools that already structure this kind of KPI data. A practical approach is to identify a small KPI set you care about, then consistently pull those from each company's 10-K and 10-Q MD&A or KPI sections into your own sheet or database, accepting that Snap's DAU or HubSpot's NRR will not be defined exactly like Netflix's or Spotify's. If you want something more automated, your best bet is a financial data API or an LLM-based extractor pointed at the filings or earnings decks, then spot-check the outputs against the original documents so you are not blindly trusting the model.
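One way to implement that "small KPI set, per-company definitions" workflow is to store each pulled metric as a tidy record that keeps the issuer's own label next to your normalized name, so two series never get silently compared as the same thing. This is a minimal sketch; the values are illustrative, not verified filings.

```python
from dataclasses import dataclass

@dataclass
class KpiRecord:
    company: str
    period: str        # e.g. "FY2024 Q3"
    metric: str        # your normalized name, e.g. "active_users"
    value: float
    as_disclosed: str  # the issuer's own label/definition from the filing

# Illustrative records: same normalized metric, different disclosed definitions.
records = [
    KpiRecord("SNAP", "FY2024 Q3", "active_users", 443e6, "Daily Active Users"),
    KpiRecord("NFLX", "FY2024 Q3", "active_users", 282e6, "Paid memberships"),
]

# Group by normalized metric, keeping the disclosed definition for spot checks.
by_metric = {}
for r in records:
    by_metric.setdefault(r.metric, []).append(r)
print(len(by_metric["active_users"]))  # -> 2
```

The point of `as_disclosed` is exactly the spot-check step: when the LLM or API hands you a number, you can always trace it back to what the company actually called it.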
Yah but it becomes a whole new ball game at that point. Right now Microsoft is irreplaceable cuz its OS and interface are entrenched in businesses. With humans out of the way, that's no longer the case. You just need AIs talking to each other over some simple API
Because every dollar a legacy company currently spends on white-collar payroll is about to become an API expense. Traditional companies have massive overhead. To survive, they have to automate. When they fire 100 analysts and replace them with an AI agent, that salary money doesn't just disappear—it gets redirected to OpenAI, Microsoft, and AWS to pay for compute and tokens. AI companies are capturing the payroll of the entire global economy.
I’m sorry I’m not sure I know which API you’re referring to. For retail investors, it’s important to have stable, repeatable policies and procedures for long term success. Especially during times such as now
Through the API they have the full GICS description (all 4 levels), though without the code it's hard to do the hierarchical grouping by intuition from the names rather than from 2030xxxx. That's definitely a plus compared to other APIs. However, it didn't have fund information and doesn't have fund analysis. Since that's a completely different dataset, I don't think fund players are their target market. Finally, pricing data is lacking. I didn't see anything on corporate action adjustments or even simple dividend details to do some kind of manual returns adjustment.
API is forecasting a 10+ million barrel build in inventories, wtf man. Makes zero sense, but they weren't far off last week either.
Brand new idea for the market. Instead of these silly algos buying and selling, why not use the chatgpt API and have it buy and sell instead?
I've used Unusual Whales forever. Their UI is great and I have a lot of custom watch lists and indicators. I am using AI now to code an agent and pull directly from their API. It's pretty simple even for a novice like me, I just need a long weekend to get it done.
😂😂😂 Anthropic's Claude Code has killed the stocks of many software and IT security companies in the past 6 months. But they couldn't even protect their own source code from leaking (someone easily reverse-engineered it from the Claude Code installer): [https://cybernews.com/security/anthropic-claude-code-source-leak/](https://cybernews.com/security/anthropic-claude-code-source-leak/) >Anthropic, the flagship AI company, has inadvertently exposed the source code for its major tool Claude Code. It has already been extracted with thousands of mirror copies published online! The leaked code includes the core engine for LLM API calls, handling streaming responses, tool-call loops, thinking mode, retry logic, token counting, permission models, tools, etc. Exposed internal logic makes it very easy to reverse-engineer the tool, identify security risks, or steal intellectual property. It appears that Anthropic scrambled to remove the npm package. However, it was too late.
Please build a Calendly alternative in a single morning. Remember to include:
- API integration across Google Calendar, Microsoft Graph, and Apple CalDAV — three separate protocols, three separate auth flows, three separate deprecation cycles you don’t control
- Bi-directional sync with real-time conflict detection across all three simultaneously, accounting for propagation delays where a booked event hasn’t yet appeared in the calendar API
- Distributed concurrency control so two people hitting the same open slot at the same millisecond don’t both get a confirmation email
- Booking atomically touches Postgres, Redis, a job queue, a calendar API, a video conferencing API, and an email provider — partial failure at any step leaves corrupted state
- Timezone resolution across every UTC offset on Earth, including half-hour zones, DST transitions, and governments that change their DST rules with weeks of notice
- Availability logic that simultaneously enforces buffer times, minimum notice windows, maximum advance booking, daily meeting caps, per-day custom hours, and date-specific overrides without any combination breaking
- Round-robin assignment with fairness weighting, capacity limits, and account ownership routing that stays correct across cancellations, no-shows, and reschedules
- An embeddable widget that runs sandboxed inside iframes on customer websites
- Zoom, Teams, and Meet auto-link generation with unique credentials per meeting
- Stripe integration with refund logic, failed payment handling, and no ambiguous confirmed/unconfirmed booking state
- Salesforce and HubSpot sync that logs meetings on their API versioning schedule
- Webhook delivery with retry queues, exponential backoff, dead letter queues, and idempotent replay
- Reminder sequences that are atomically cancelled when a meeting is cancelled 30 minutes before the job fires, across a distributed queue
- GDPR deletion covering your primary DB, backups, analytics, third-party logs, and audit trails
- SOC 2 Type II continuous evidence collection, access reviews, and incident response
"Dev". The thing is sure the core basic features of a web app Calendly could be vibe coded to some degree. But there are a lot of back office processes etc which go into an enterprise level app. Not to mention Calendly seems to offer more than just that based on their website footer. If you can vibe code all of this in a morning I'll be impressed. **Product** Scheduling automation Meeting Notetaker Payments Customizable availability Mobile apps Browser extensions Meeting routing Event Types Email & website embeds Reminders & follow-ups Meeting polls Analytics Admin management **Integrations** Google ecosystem Microsoft ecosystem Calendars Video conferencing Payment processors Sales & CRM Recruiting & ATS Email messaging Embed Calendly Analytics API & connectors Security & compliance
Training may be expensive, but inference isn't. Rent a single B200 instance and try serving MiniMax-2.5 to a hundred or so concurrent users, then calculate the cost per 1M tokens and compare to API pricing. You'll see their margins on that are anything but thin, and the big providers use a bunch of other tricks that scale even better on multiple instances (e.g. PD disaggregation, speculative decoding, multi-tier KV caching, etc).
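The per-token arithmetic behind that comparison is easy to sketch. All numbers below are placeholders for illustration, not actual B200 rental rates or MiniMax throughput figures:

```python
def cost_per_million_tokens(instance_usd_per_hr: float,
                            aggregate_tokens_per_sec: float) -> float:
    """Raw serving cost to emit 1M tokens from one rented instance."""
    tokens_per_hour = aggregate_tokens_per_sec * 3600.0
    return instance_usd_per_hr / tokens_per_hour * 1_000_000

# Placeholder numbers: a $6/hr instance serving 100 concurrent users
# at ~100 tok/s each, i.e. 10,000 tok/s aggregate.
cost = cost_per_million_tokens(6.0, 100 * 100)
print(round(cost, 4))  # -> 0.1667
```

Compare that raw-compute figure to a provider's posted $/1M-token API price and the gap is the gross margin before the batching and caching tricks the comment mentions, which only widen it.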
Ha! Got the Schwab API feeding my gsheet. Very cool.
Yeah, even highly profitable companies like Google or Microsoft aren't going to continually light money from other divisions on fire so they can indefinitely subsidize AI. Even at the $200/mo subscriptions, they are likely losing huge amounts of money, especially at some of the token usage rates I've seen. It's why all the big companies are trying to crack down on usage on the subscription plans and move people to per-token API usage plans. All of the big players were already greatly subsidizing AI usage even before the Iran war. None of these companies will continue to subsidize tokens so that AI bros can do their bootleg version of algorithmic trading or create some shitty apps on OpenClaw to try to scam people on the iOS store. Nor will they continue to give away access so that people can have a pretend AI girlfriend or spam social media and GitHub with AI slop. None of that actually makes money. AI has a considerable number of legitimate business uses, so I don't see it disappearing, but the argument that "some AI companies are already very profitable so they are okay throwing money away" makes no sense to me. They will ultimately adjust their ambitions to capture the most valuable business customers and make sure they can keep their models at the cutting edge, but a lot of the data centers for general AI-bro usage are probably not going to materialize.
Potentially use Claude Code for it? I had a slightly different but similar case where I wanted to check specific scenarios for my entries into positions and my exits. So I basically asked Claude Code to develop me an Android app (since I use Android, but it should also work for iOS if you're an iPhone user) that checks exactly the parameters that are relevant for me, in the timeframes that are relevant, etc. Basically I copy/pasted my strategy into it, and after several modifications had my strategy and what I want to achieve in a prompt that I then used to develop the app. The app only pulls data from the data sources I specified (depending on the sources you might need to plug them in via API, which sounds more complex than it actually is but can incur costs depending on the data source). I cannot execute trades etc. from the app, as I would never trust a vibe-coded app with financial logins, but I can easily check it throughout the day and get notifications exactly for the scenarios that matter to me. Overall effort in my case was around 2h. Only thing to keep in mind: for Claude Code you need at least the $20 Pro subscription. However, I really like that I can adapt it exactly to my setup instead of trying to modify an existing (usually paid) tool that only works half-baked.
Yeah that's awesome!! TView was pay to play, but I JUST MOVED TO SCHWAB!!! I have a little money in the account, I should start playing there. I messed with the API a little, but have been distracted by market crashes and playing with Gemini on TradeView. XD
Hilariously, this company was founded in 2010 and was a real AI player before it was cool to be one. But the reason I use them is that their API is pretty great.
This is correct. Also, if you want a 'live' SPX price during global hours, you can use the CME's BTIC contract in an intermarket spread, which on TradingView is !ES1-!EST1. Or, if you're handy with an API, use put-call parity to back out the implied index price from the options themselves.
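The put-call-parity trick reduces to a few lines. This sketch assumes European options and ignores dividends, so it recovers a forward-like level rather than true spot; the quote inputs are made up:

```python
import math

def implied_index(call_mid: float, put_mid: float, strike: float,
                  r: float, t_years: float) -> float:
    """Put-call parity: C - P = S - K*exp(-r*T)  =>  S = C - P + K*exp(-r*T)."""
    return call_mid - put_mid + strike * math.exp(-r * t_years)

# Sanity check with zero rates: C=10, P=5, K=100 -> implied level 105.
print(implied_index(10.0, 5.0, 100.0, 0.0, 1.0))  # -> 105.0
```

In practice you'd average this across a few liquid strikes at the same expiry to smooth out bid/ask noise.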
then give it the rights to query the API to get live data.
I can spin up mocks, requirements, architecture design, and API designs and have it close to ready to build in like a day. Previously that would've taken months and required close collaboration with a bunch of different roles.
Yes please share code I joined Discord as well and posted under #API
I am working on connecting gsheets to Schwab API as we speak. Gemini has been showing me the details and it seems to be almost ready to rock. Just waiting on a final approval from the Schwab developer backroom. I understand what is needed but, honestly, if I didn't have previous coding experience it would be very difficult to set up. Btw, marketdata extension is very easy to implement and it provides reasonable service for a reasonable fee. I have used marketdata for two years and I'm happy with it but I decided to see how the direct API works out.
If you want free and easy, CBOE's delayed chain CSVs are probably the cleanest source and you can import them into Sheets on a timer. Schwab API is nicer but the weekly token refresh is the annoying part.
If you ever get claude code it's very easy to have it work with Schwab's API. I just have it back up my order history to Google Sheets and track/place orders on a web app, but you could have it build something to pull options chains into Google Sheets no problem.
Yeah but the IRGC bots that drive up those API fees are getting bombed, reddit about to be a ghost town
If you're on Schwab you can use their API. You get 120 calls/minute and 1 call will get the options chain for an underlying.
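A minimal pacing sketch for staying under that 120 calls/minute budget when polling a watchlist. Only the throttle is real code here; the `client.get_option_chain` call in the comment is a hypothetical stand-in for whatever client wrapper you use:

```python
import time

class Throttle:
    """Enforce a minimum spacing between API calls (0.5s at 120/min)."""
    def __init__(self, calls_per_minute: int = 120):
        self.min_interval = 60.0 / calls_per_minute
        self._last = 0.0

    def wait(self) -> None:
        now = time.monotonic()
        sleep_for = self._last + self.min_interval - now
        if sleep_for > 0:
            time.sleep(sleep_for)
        self._last = time.monotonic()

throttle = Throttle(120)
# for symbol in watchlist:
#     throttle.wait()
#     chain = client.get_option_chain(symbol)  # hypothetical client call
```

Since one call returns a whole chain per underlying, 120/min is a lot of headroom unless you're sweeping hundreds of tickers.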
0DTE/1DTE buyer here, mostly SPX. I have a Python script polling a GEX/Greeks API (FlashAlpha) and it basically answers all your questions. Stop-loss: I set it at the nearest big gamma wall. It's a structural level, not some random %. Entries: I only go in when price is near a dealer gamma flip zone. No setup, no trade. Overtrading: I check VRP before every session. If implied vol is running way above realized, I know I'm overpaying for premium so I just skip the day. Speed: Usually hold 20min to 2hrs. The gamma data also tells me when decay is about to speed up so I don't sit in a trade too long. Honestly the biggest value is having real reasons to sit out. Pure buying is more about discipline than anything else.
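The "stop at the nearest big gamma wall" rule above reduces to a tiny lookup once you have a per-strike net GEX snapshot. The numbers below are invented for illustration, not FlashAlpha output:

```python
def nearest_wall_below(gex_by_strike: dict, spot: float) -> float:
    """Strike with the largest |net GEX| strictly below spot -- a structural
    stop level for longs, rather than a fixed percentage."""
    below = {k: v for k, v in gex_by_strike.items() if k < spot}
    if not below:
        raise ValueError("no strikes below spot")
    return max(below, key=lambda k: abs(below[k]))

# Toy snapshot: strike -> net gamma exposure (sign/scale are provider-specific).
gex = {6500.0: 2.1e9, 6550.0: 4.8e9, 6600.0: 1.2e9, 6650.0: -3.3e9}
print(nearest_wall_below(gex, 6620.0))  # -> 6550.0
```

The same function run on strikes above spot gives the ceiling for profit targets.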
This. Now that third party clients are shut down, it should be SUPER easy for them to see who's hitting the API from the app or the site vs. somewhere else. If they wanted to shut down the bots, they could.
Google and Amazon, whose infrastructure Claude runs on, already have your data. You have to rely on privacy policies, and if they say the API does not store data, it better not. They have big money on the line and don't want to get sued.
Most people would probably be using paid data subscriptions. I do; it's like $20 a month for an API
I needed a simple solution for automatic document naming of complex documents, including logic to know when to split apart multi-page documents or keep them together. With AI and a couple of hours, I have a Python script (I am not a programmer and never used Python at all until yesterday) that monitors an input folder, goes through every document dropped in there, and outputs the finished product to an output folder. The original documents go to a processed folder so you can review them against the output folder; if anything messes up you can go back.

The names of the documents match the exact naming convention I wanted, and the system is smart enough to figure out what to do with documents that are different. If scanning something like a closing package for a property, it knows to keep that as a single document even though it is really a few separate documents, but if scanning something like a tax document, it will split up multiple forms. Going back to the closing document, I was able to tell it to name the document according to the property involved and not the parties (e.g. law firm), whereas an invoice will specify the vendor. I even had it add to the name of documents if they are preliminary or amended... Took a few tries going back and forth before it gave good results.

It is dead simple to use and gives me exactly what I want. Costs pennies per document to use the Claude API. I can ask AI to modify the script to make any changes I want... For example, if I decide I want it to use a more complex folder structure to keep track of each batch of documents it handles, I am sure that is a few minutes with AI asking it to modify the script. Basically, any feature I want to add to what I have already, I can ask the AI to do and have the result right away, instead of asking a company to implement a feature that I may be the only person who cares about.
This is pretty much my first serious attempt to use AI and I am hooked and only thinking about how it can do more and more things. Why should I spend hours evaluating software that claims to do these things when I can ask AI to do it and if I don't like the first result just ask it to make changes and pretty soon have the exact result I want?
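For anyone curious what the skeleton of such a watch-folder script looks like: a sketch with the Claude API call replaced by a stub classifier, since the plumbing (emit a renamed copy, archive the original for review) is the same either way. The `classify` rules and names here are invented for illustration; the real version would send the document text to the Claude API and parse its answer.

```python
import pathlib
import shutil

def classify(text: str) -> dict:
    """Stub standing in for the LLM call that decides doc type, name, and
    whether a multi-page scan should be split."""
    kind = "closing_package" if "closing" in text.lower() else "tax_form"
    return {"kind": kind, "name": f"{kind}.txt", "split": kind == "tax_form"}

def process_folder(inbox: pathlib.Path, outbox: pathlib.Path,
                   processed: pathlib.Path) -> list:
    """Name each inbound doc, emit it to outbox, archive the original."""
    outbox.mkdir(parents=True, exist_ok=True)
    processed.mkdir(parents=True, exist_ok=True)
    renamed = []
    for doc in sorted(inbox.glob("*.txt")):
        meta = classify(doc.read_text())
        shutil.copy(doc, outbox / meta["name"])       # finished product
        shutil.move(str(doc), processed / doc.name)   # original kept for review
        renamed.append(meta["name"])
    return renamed
```

Run on a schedule (cron, or a `while` loop with a sleep) and the input folder becomes a drop box, which matches the workflow described above.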
Massive API and Claude Code and you should have it easy
One good programmer with AI can do the job of 100 programmers. Layoffs of programmers are only going to accelerate. I myself, with near-zero programming experience, had AI walk me through installing Python and writing a script that ties into Claude's API to do all sorts of document recognition tasks. I have no idea what programming something like this would have been like a few years back, but I imagine it would have required a team of programmers spending a few months to even get close to the result I got in a few hours, with no real programming experience, using AI and spending under $100.
You're 100% right—Schwab (and basically every retail API) doesn't provide 'signed' trade data or dealer-side flagging. That’s the classic hurdle for building these tools. However, **GammaPulse Pro** uses the 'Standard Model' for GEX calculation. It aggregates the Gamma of the total Open Interest (OI) under the core assumption that dealers are the net liquidity providers (short the options to the public). While the $2k/month institutional terminals pay for exchange-direct feeds to get 'signed trades' (guessing buy/sell side based on bid/ask hits), the structural **'King Nodes'**—the massive OI walls where dealers are forced to hedge—remain the same regardless of the feed. For highly liquid tickers like $SPY, $QQQ, and $NVDA, the 'Standard Model' captures \~90% of the structural signal. The goal here isn't to compete with a Bloomberg terminal on tick-level precision, but to give retail traders a high-fidelity map of the dealer floors and ceilings that they’d otherwise be flying blind into. Engineering-wise, it's about the signal-to-noise ratio. For day trading these levels, the OI-based GEX is the signal.
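For reference, the 'Standard Model' aggregation described above is just a signed sum over open interest. The sign convention varies by provider (this sketch uses the common calls-positive / puts-negative convention for dealer gamma), and every number below is invented:

```python
def strike_gex(spot: float, gamma: float, open_interest: int,
               is_call: bool, multiplier: int = 100) -> float:
    """One option line's contribution to dealer gamma exposure under the
    standard-model assumption that dealers are the net liquidity providers.
    Convention here: call OI contributes positive dealer gamma, put OI negative."""
    sign = 1.0 if is_call else -1.0
    return sign * spot * gamma * open_interest * multiplier

# Toy two-line book at one spot price; sum across all strikes/expiries
# to get the net GEX profile and the big OI 'walls'.
total = (strike_gex(450.0, 0.012, 30_000, True)
         + strike_gex(450.0, 0.015, 20_000, False))
print(total)
```

Summing this per strike and plotting it against strike is what surfaces the large-magnitude levels ("King Nodes") regardless of whether the underlying feed is signed or OI-only.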
The Schwab API doesn't provide information to distinguish dealers' positions. Why do you think other GEX providers pay thousands in exchange fees to get this data, and can do it more or less reliably for SPX only?
Perplexity Finance will keep context/awareness of your positions. Just connect your portfolio account over API. And yes, PF has in depth data on everything you mentioned, just need to know where to look.
I'm covering it for now. Might allow users to bring their own API key later.
The lawsuit is because OpenAI built a system with AWS/Amazon that Microsoft alleges violates a contract making Microsoft Azure the exclusive provider of OpenAI's API traffic. Obviously this upsets Microsoft, because they consider OpenAI to be violating an agreement by going with a competitor.
I had no idea API pricing was so expensive lmao. You’re telling me I have to pay hundreds a month for data to lose tens of thousands of dollars?
Yeah, I can’t wait to link my financial accounts into Reddit’s API
Yeah I’m used to sharing Gemini in chrome with the unusual whales dashboard. Honestly after setting all this up I don’t think I’m going to stick with it longer than a month. API 150 Claude 100. I wrote a few prompts that should be repeatable but seem to execute differently each time on the projects as well
Interactive Brokers? The UI is not very modern, I have to admit, but lots of instruments for trading. And if you care about API access, there are lots of implementations (e.g. Python).
Update: my broker, Schwab TOS, offers an API, but I must have administrator rights for desktop Excel, which I don't have, nor do I want to pay for it. Anybody tried LibreOffice to scrape data from TOS?
I'm using Lynx, which is connected to Interactive Brokers. I'm quite happy with it. Same reason for me: I wanted to trade real options (and not only Optionsscheine, i.e. warrants). I even use their API (also Interactive Brokers), which is quite useful if you're interested in that
Mods are bears and they keep deleting the AI 10x play. Below is my AI 10x play: go long 10x-leverage AI plays when OpenAI and Anthropic IPO, so 10x upside with 10x leverage is an easy 100x in a short time. Below is my DD. Calls on AI companies and VCX. 100x tendies.

Everyone says AI is a bubble and it's gonna burst and blow up like the 2000 internet bubble. But I think they are wrong. AI at its current stage is actually undervalued and still has 5x to 10x upside left in the medium term.

**The numbers:** AI is at least good enough to do the work of most desk-based, purely computer-oriented jobs (aka white-collar jobs). Now, let's arrive at the market for these white-collar jobs.

World pop - 8 billion. Let's assume 4 people per family, which gives us 8/4 = 2 billion families, of which on average 1.5 people will be working in some form or other, which is 3bn total jobs. Of this, according to many sources, white-collar jobs are around 1.2 billion (it was around 800mn jobs globally in the early 2010s). Of these jobs, not all are prone to replacement or something AI can do, so let's assume even a conservative 30% of the 1.2bn can be replaced, which gives us 1.2\*0.3 = \~350mn jobs where AI can do at least 80% to 90% of the work.

These are your top white-collar jobs that pay at least $50k annually if you average out geographies, experience of the person, etc. You might wonder how someone working an Asian white-collar job is going to be paid $50k, but they are actually paid anywhere from $20k to $30k if it's a proper computer-based white-collar job, and in most developed countries the average pay for such jobs is $90k+, so on a weighted-average basis, $50k per job per year is a reasonable global average to assume. So the current annual cost to companies employing these jobs is 350mn \* $50k = $17.5tn per year. This is the market that is the prime target of AI companies.
But not all such jobs can be replaced by AI, so let's assume 50 to 60% of jobs are replaced. This gives a TAM of $10.5tn (at 60% of the total possible market).

Now, the second part: no company is gonna pay for AI if it costs the same as a worker. So let's assume that in the long run, after AI companies adjust pricing etc., an AI subscription costs $25k per year (this is actually what AI companies will need to charge if they want to break even on compute costs, translating to $2.1k per month). So our total revenue potential becomes $10.5tn \* 50% = \~$5.25tn (since we are assuming AI companies will charge $25k per year instead of the $50k cost per employee).

Since AI companies are mostly gonna follow the Uber or Amazon model, once they break even, being purely compute-based, their profit margins could go as high as 30% if models reach maturity, but let's be conservative and assume a 15% net margin. So annual income across all these companies will be $5.25tn \* 15% = \~$800bn in pure profits. Since most companies can be expected to trade at 25x to 30x easily, given the high entry barrier and critical nature of AI services, we can assume a P/E multiple of 30x, which gives us 800bn \* 30 = $24tn. Let's round it to $25tn for ease.

So the AI market is conservatively worth around $25tn. As of now, most AI companies put together are valued at only $3 to 4tn, so the undervaluation is massive and people are not looking at the total market it will replace. The only place this could be wrong is that companies might not pay $25k per year, but be realistic: AI companies will hook users, make CEOs addicted to the AIs, and raise monthly fees until they break even no matter what. And ask any serious dev or heavy computer user: your job will require you to use deep-research-type AI at least 10x a month, and it will easily rack up API costs exceeding $1.5k to 2k per month.
Hell, I myself use deep research 5 to 6 times a day for work and our bill is over $50k per year per person. Plus these are business customers, so they will definitely pay the fees, unlike individual subscribers who will cancel if it costs too much or stick to free tiers. Another thing: our numbers don't include any paying casual users, which could easily add $3 to 4tn to the market. Realistically I think there will be a business tax on AI usage to pay for UBI to compensate for job loss, but given how slowly legislation moves, there could be a gap between the government realizing the impact of job loss and UBI, where this value can be traded and profits can be made.
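For what it's worth, the chain of arithmetic in the DD does check out step by step; whether the assumptions behind each input do is another question. Every number below is the post's own assumption, not a sourced figure:

```python
# Re-running the post's back-of-envelope valuation math.
replaceable_jobs = 1.2e9 * 0.30   # ~360mn, rounded down to ~350mn in the post
payroll = 350e6 * 50_000          # $17.5tn annual cost of those jobs
tam = payroll * 0.60              # $10.5tn at 60% replacement
revenue = tam * 0.50              # $5.25tn at $25k/yr (half a salary)
profit = revenue * 0.15           # ~$790bn at a 15% net margin
market_value = 800e9 * 30         # $24tn at a 30x multiple (rounded to $25tn)
print(payroll, tam, revenue, profit, market_value)
```

The conclusion is most sensitive to the 30% replaceability and the $25k/yr price; halve either and the implied market value roughly halves too.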
OpenAI’s current Sora API prices are about $0.10/sec for Sora 2 and $0.30/sec for Sora 2 Pro at 720p, with higher prices for higher-end outputs. That puts revenue at roughly $360/hour of delivered video for the base model and much higher for Pro tiers. By comparison, published 2026 cloud rates for top-end inference GPUs are often around $2–$4 per H100-hour on lower-cost providers and about $4+ per H100-hour on managed providers. Lambda currently lists H100 around $3.29–$4.29/hour, and Runpod advertises H100 from $1.99/hour. That means the break-even bar on raw compute is not crazy high. At $0.10/sec, if OpenAI were using an 8×H100 equivalent cluster at $4/GPU-hour, the raw compute cost would be about $32/hour of wall-clock generation. They would still break even on raw GPU spend as long as generating 1 second of output takes less than about 11.25 seconds of 8-GPU wall time. At $0.30/sec, that threshold rises to about 33.75 seconds; at higher Pro prices, the cushion gets much larger. That is an inference from the pricing math, not a disclosed OpenAI figure. Industry pricing also points the same way. Runway’s API pricing works out to about $0.05/sec for Gen-4 Turbo and $0.10–$0.40/sec for Veo-family models depending on tier and audio, so Sora’s API pricing sits in the same commercial band or above it rather than looking obviously underpriced. So my best read is: Yes, they are likely at least breaking even on the variable cost of serving many API calls, especially Pro tiers.
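The break-even claim above reduces to a couple of divisions. The GPU rate and cluster size are the comment's illustrative figures, not quotes:

```python
price_per_sec = 0.10              # Sora 2 base API price, $/sec of 720p video
cluster_cost_per_hr = 8 * 4.0     # 8 GPUs at $4/GPU-hour (comment's assumption)

revenue_per_hour_of_video = price_per_sec * 3600   # $ per delivered video-hour
cost_per_wallclock_sec = cluster_cost_per_hr / 3600

# Max seconds of 8-GPU wall time per delivered second before losing money:
breakeven_ratio = price_per_sec / cost_per_wallclock_sec
print(revenue_per_hour_of_video, breakeven_ratio)
```

Re-running with `price_per_sec = 0.30` for the Pro tier gives the 33.75x cushion the comment cites; variable cost, not training, is all this covers.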
I want to learn more about pulling options chain data into my spreadsheets. Can you say more about how you do that? Currently I use the marketdata API to pull into Google Sheets. Also beginning to investigate the Schwab API.
> They're not burning cash through the API. The video generation absolutely is burning cash
They're not burning cash through the API. The app was burning cash because it was free. It costs around $2.50-$3.50 per hit on the API, depending on the aggregator site used to generate it.
Throwing out there that I've basically never used GPT; I use others. However, most people use GPT, and the most noise I heard about a model "getting worse" was about one of the GPT-4 branches. "Turbo"? I dunno, but it was silly noise, because they also dropped API token costs by 3x. It was very obviously an efficiency push: they shipped a model they thought had similar "AI oomph" but cost them less money. I don't think there was another hit like that after 4 (I could be wrong). Also I want to say that OpenAI will throttle more than the other models (shadow-switch users to a worse model). https://www.cl.cam.ac.uk/~is410/Papers/dementia_arxiv.pdf The study I am talking about (and that people will throw out all the time) is basically the root of the idea that AI is going to kill itself by training off its own data that gets put on the internet. This isn't the problem people make it out to be; the TLDR is you can't leave it in a box and let it talk to itself, which was never a thing. AI researchers are well aware of the phenomenon. In fact, one of the most effective ways they've made models better is to have them generate their own examples to train the next model (synthetic data). It works remarkably well. The difference is guided vs. blind recursion.
I looked at the API prices without reading that it's a price per second of video, not per 10s video, so nvm. It could still be profitable on the API though. Other API providers for video models charge similar kinds of money for models like Kling v2.6 or Veo 3.1 Fast. The pattern is that API pricing is usually profitable when you look at it in a vacuum, without having to train the model first, but it's rarely profitable enough to recoup model training costs.
I'm pretty sure even their API is priced at a loss.