Reddit Posts
Download dataset of stock prices X tickers for yesterday?
Tech market brings important development opportunities; AIGC is firmly No. 1 in the current technology field
AIGC market brings important development opportunities, artificial intelligence technology has been developing
Avricore Health - AVCR.V making waves in Pharmacy Point of Care Testing! CEO interview this evening as well.
OTC: KWIK Shareholder Letter, January 3, 2024
The commercialization of multimodal models is emerging, Gemini now appears to exceed ChatGPT
Why Microsoft's gross margins are going brrr (up 1.89% QoQ).
Google's AI project "Gemini" shipped, and so far it looks better than GPT4
US Broker Recommendation with a market that allows both longs/shorts
A Little DD on FobiAI, which harnesses the power of AI and data intelligence, enabling businesses to digitally transform
Best API for grabbing historical financial statement data to compare across companies.
Seeking Free Advance/Decline, NH/NL Data - Python API?
Delving Deeper into Benzinga Pro: Does the Subscription Include Full API Access?
Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration
Aduro Clean Technologies Inc. Research Update
Option Chain REST APIs w/ Greeks and Beta Weighting
$VERS Upcoming Webinar: Introduction and Demonstration of Genius
Are there pre-built bull/bear systems for 5-10m period QQQ / SPY day trades?
Short Squeeze is Reopened. Play Nice.
Created options trading bot with Interactive Brokers API
Leafly Announces New API for Order Integration($LFLY)
Is Unity going to Zero? - Why they just killed their business model.
Looking for affordable API to fetch specific historical stock market data
Where do sites like Unusual Whales scrape their data from?
Twilio Q2 2023: A Mixed Bag with Strong Revenue Growth Amid Stock Price Challenges
[DIY Filing Alerts] Part 3 of 3: Building the Script and Automating Your Alerts
This prized $PGY doesn't need lipstick (an amalgamation of the DD's)
API or Dataset that shows intraday price movement for Options Bid/Ask
[Newbie] Bought Microsoft shares at 250, mainly as I see value in ChatGPT. I think I'll hold for at least 6 months, but I'd like your thoughts.
Crude Oil Soars Near YTD Highs On Largest Single-Week Crude Inventory Crash In Years
I found this trading tool that's just scraping all of our comments and running them through ChatGPT to get our sentiment on different stocks. Isn't this a violation of Reddit's new API rules?
I’m Building a Free Fundamental Stock Data API You Can Use for Projects and Analysis
Meta, Microsoft and Amazon team up on maps project to crack Apple-Google duopoly
Pictures say it all. Robinhood is shady AF.
URGENT - Audit Your Transactions: Broker Alters Orders without Permission
My AI momentum trading journey just started. Dumping $3k into an automated trading strategy guided by ChatGPT. Am I gonna make it
The AI trading journey begins. Throwing $3k into automated trading strategies. Will I eat a bag of dicks? Roast me if you must
I made a free & unique spreadsheet that removes stock prices to help you invest like Warren Buffett (V2)
To recalculate historical options data from CBOE to find IVs at the moment of trades, what interest rate should I use?
WiMi Hologram Cloud Proposes A New Lightweight Decentralized Application Technical Solution Based on IPFS
$SSTK Shutterstock - OpenAI ChatGPT partnership - Images, Photos, & Videos
Is there really no better way to track open + closed positions without multiple apps?
List of Platforms (Not Brokers) for advanced option trading
Utopia P2P is a great application that needs NO KYC to safeguard your data!
Utopia P2P supports API access and CHAT GPT
Stepping Ahead with the Future of Digital Assets
An Unexpected Ally in the Crypto Battlefield
Utopia P2P now has an airdrop for all Utopians
Microsoft’s stock hits record after executives predict $10 billion in annual A.I. revenue
Reddit IPO - A Critical Examination of Reddit's Business Model and User Approach
Reddit stands by controversial API changes as situation worsens
Mentions
Then give it the rights to query the API to get live data.
I can spin up mocks, requirements, architecture design, and API designs and have it close to ready to build in like a day. Previously this would've taken months and required close collaboration with a bunch of different roles.
Yes please share code I joined Discord as well and posted under #API
I am working on connecting gsheets to Schwab API as we speak. Gemini has been showing me the details and it seems to be almost ready to rock. Just waiting on a final approval from the Schwab developer backroom. I understand what is needed but, honestly, if I didn't have previous coding experience it would be very difficult to set up. Btw, marketdata extension is very easy to implement and it provides reasonable service for a reasonable fee. I have used marketdata for two years and I'm happy with it but I decided to see how the direct API works out.
If you want free and easy, CBOE's delayed chain CSVs are probably the cleanest source and you can import them into Sheets on a timer. Schwab API is nicer but the weekly token refresh is the annoying part.
If you ever get claude code it's very easy to have it work with Schwab's API. I just have it back up my order history to Google Sheets and track/place orders on a web app, but you could have it build something to pull options chains into Google Sheets no problem.
Yeah but the IRGC bots that drive up those API fees are getting bombed, reddit about to be a ghost town
If you're on Schwab you can use their API. You get 120 calls/minute and 1 call will get the options chain for an underlying.
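Those two numbers (120 calls/minute, one call per underlying's chain) make pacing trivial. A generic sketch; `fetch_chain` here is a stand-in for the actual Schwab request, not a real client:

```python
import time

RATE_LIMIT_PER_MIN = 120                 # Schwab's documented cap
MIN_SPACING = 60.0 / RATE_LIMIT_PER_MIN  # 0.5 s between calls stays under it

def paced(items, spacing=MIN_SPACING, sleep=time.sleep):
    """Yield items no faster than one per `spacing` seconds."""
    last = 0.0
    for item in items:
        wait = spacing - (time.monotonic() - last)
        if wait > 0:
            sleep(wait)
        last = time.monotonic()
        yield item

def fetch_chains(tickers, fetch_chain):
    """One paced call per underlying; fetch_chain stands in for the
    real Schwab option-chain request."""
    return {t: fetch_chain(t) for t in paced(tickers)}
```

At 120 calls/minute, one paced call per underlying covers about 7,200 tickers an hour, which is usually plenty for a personal chain scanner.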
0DTE/1DTE buyer here, mostly SPX. I have a Python script polling a GEX/Greeks API (FlashAlpha) and it basically answers all your questions.
Stop-loss: I set it at the nearest big gamma wall. It's a structural level, not some random %.
Entries: I only go in when price is near a dealer gamma flip zone. No setup, no trade.
Overtrading: I check VRP before every session. If implied vol is running way above realized, I know I'm overpaying for premium, so I just skip the day.
Speed: Usually hold 20 min to 2 hrs. The gamma data also tells me when decay is about to speed up so I don't sit in a trade too long.
Honestly the biggest value is having real reasons to sit out. Pure buying is more about discipline than anything else.
This. Now that third party clients are shut down, it should be SUPER easy for them to see who's hitting the API from the app or the site vs. somewhere else. If they wanted to shut down the bots, they could.
Google and Amazon, which Claude uses, already have your data. You have to rely on privacy policies, and if they say the API does not store data, it had better not. They have big money on the line and don't want to get sued.
Most people would probably be using paid data subscriptions. I do; it's like $20 a month for an API.
I needed a simple solution for automatically naming complex documents, including logic to know when to split apart multi-page documents or keep them together. With AI and a couple of hours, I have a Python script (I am not a programmer and never used Python at all until yesterday) that monitors an input folder, goes through every document dropped in there, and outputs the finished product to an output folder. The original documents go to a processed folder so you can review them against the output folder; if anything messes up, you can go back.

The names of the documents match the exact naming convention I wanted, and the system is smart enough to figure out what to do with documents that are different. If scanning something like a closing package for a property, it knows to keep that as a single document even though it is really a few separate documents, but if scanning something like a tax document, it will split up multiple forms. Going back to the closing document, I was able to tell it to name the document according to the property involved and not the parties (e.g. the law firm), whereas an invoice will specify the vendor. I even had it add to the name of documents if they are preliminary or amended. It took a few tries going back and forth before it gave good results.

It is dead simple to use and gives me exactly what I want. It costs pennies per document to use the Claude API, and I can ask AI to modify the script to make any changes I want. For example, if I decide I want a more complex folder structure to keep track of each batch of documents it handles, I am sure that is a few minutes with AI asking it to modify the script. Basically, any feature I want to add to what I already have, I can ask the AI to do and have the result right away, without asking a company to implement a feature that I may be the only person who cares about.
This is pretty much my first serious attempt to use AI and I am hooked and only thinking about how it can do more and more things. Why should I spend hours evaluating software that claims to do these things when I can ask AI to do it and if I don't like the first result just ask it to make changes and pretty soon have the exact result I want?
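A pipeline like the one described above can be sketched in a few lines. Here `classify` is a stand-in for the Claude API call, and `build_output_name` uses an illustrative naming convention, not the commenter's exact one:

```python
import re
from pathlib import Path

def build_output_name(doc_type, subject, status=None):
    """Compose a filename like 'Invoice - Acme LLC (Amended).pdf'."""
    name = f"{doc_type} - {subject}"
    if status:  # e.g. 'Preliminary' or 'Amended'
        name += f" ({status})"
    # strip characters that are unsafe in filenames
    return re.sub(r'[\\/:*?"<>|]', "_", name) + ".pdf"

def process_folder(in_dir, out_dir, processed_dir, classify):
    """Write a renamed copy of each PDF to out_dir and move the original
    to processed_dir for review. classify(path) -> (doc_type, subject,
    status) is where the Claude API call would go (stubbed here)."""
    for path in sorted(Path(in_dir).glob("*.pdf")):
        doc_type, subject, status = classify(path)
        out_name = build_output_name(doc_type, subject, status)
        (Path(out_dir) / out_name).write_bytes(path.read_bytes())
        path.rename(Path(processed_dir) / path.name)  # keep original for review
```

The split-vs-keep-together decision the commenter describes lives entirely inside `classify`; everything around it is plain file plumbing.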
Massive API and Claude code and you should have it easy
One good programmer with AI can do the job of 100 programmers. Layoffs of programmers are only going to accelerate. I myself, with near-zero programming experience, had AI walk me through installing Python and writing a script that ties into Claude's API to do all sorts of document-recognition tasks. I have no idea what programming something like this would have been like a few years back, but I imagine it would have required a team of programmers spending a few months to even get close to the result I got in a few hours, with no real programming experience, using AI and spending under $100.
You're 100% right—Schwab (and basically every retail API) doesn't provide 'signed' trade data or dealer-side flagging. That's the classic hurdle for building these tools.

However, **GammaPulse Pro** uses the 'Standard Model' for GEX calculation. It aggregates the gamma of the total Open Interest (OI) under the core assumption that dealers are the net liquidity providers (short the options to the public). While the $2k/month institutional terminals pay for exchange-direct feeds to get 'signed trades' (guessing buy/sell side based on bid/ask hits), the structural **'King Nodes'**—the massive OI walls where dealers are forced to hedge—remain the same regardless of the feed. For highly liquid tickers like $SPY, $QQQ, and $NVDA, the 'Standard Model' captures ~90% of the structural signal.

The goal here isn't to compete with a Bloomberg terminal on tick-level precision, but to give retail traders a high-fidelity map of the dealer floors and ceilings that they'd otherwise be flying blind into. Engineering-wise, it's about the signal-to-noise ratio. For day trading these levels, the OI-based GEX is the signal.
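A minimal sketch of that Standard-Model aggregation, with made-up chain rows. The positive-call/negative-put sign convention below is the usual dealer-positioning assumption, not anything derived from signed trades:

```python
def gex_by_strike(chain, spot, contract_size=100):
    """Standard-model GEX: aggregate gamma of total OI per strike.

    Sign convention (an assumption, not signed-trade data): dealers are
    treated as net short the public's options, so call gamma contributes
    positively and put gamma negatively to the dealer book.
    """
    out = {}
    for row in chain:
        call_gex = row["call_gamma"] * row["call_oi"] * contract_size * spot
        put_gex = -row["put_gamma"] * row["put_oi"] * contract_size * spot
        out[row["strike"]] = call_gex + put_gex
    return out

def king_node(gex):
    """The strike with the largest absolute dealer gamma (a 'King Node')."""
    return max(gex, key=lambda k: abs(gex[k]))

# Illustrative chain rows (fabricated numbers, not market data)
chain = [
    {"strike": 500, "call_gamma": 0.02, "call_oi": 10_000, "put_gamma": 0.01, "put_oi": 5_000},
    {"strike": 505, "call_gamma": 0.05, "call_oi": 20_000, "put_gamma": 0.02, "put_oi": 1_000},
]
gex = gex_by_strike(chain, spot=502.0)
```

Feed this the per-strike gamma and OI from any retail chain endpoint and you reproduce the OI-based map the comment describes; what it cannot give you is the signed-flow refinement the institutional feeds sell.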
Schwab API doesn't provide information to distinguish dealers' positions. Why do you think other GEX providers pay thousands in exchange fees to get this data, and can do this more or less reliably only for SPX?
Perplexity Finance will keep context/awareness of your positions. Just connect your portfolio account over API. And yes, PF has in depth data on everything you mentioned, just need to know where to look.
I'm covering it for now. Might allow users to bring their own API key later.
The lawsuit is because OpenAI built a system with AWS/Amazon that Microsoft alleges violates a contract dictating that Microsoft Azure be the exclusive provider of OpenAI's API traffic. Obviously this upsets Microsoft, because they consider OpenAI to be violating an agreement by going with a competitor.
I had no idea API pricing was so expensive lmao. You’re telling me I have to pay hundreds a month for data to lose tens of thousands of dollars?
Yeah, I can't wait to link my financial accounts into Reddit's API
Yeah, I'm used to sharing Gemini in Chrome with the Unusual Whales dashboard. Honestly, after setting all this up I don't think I'm going to stick with it longer than a month: API $150, Claude $100. I wrote a few prompts that should be repeatable, but they seem to execute differently each time on the projects as well.
Interactive Brokers? The UI is not very modern, I have to admit, but there are lots of instruments for trading. And if you care about API access, there are lots of implementations (e.g. Python).
Update: my broker (Schwab TOS) offers an API, but I must have administrator rights for desktop Excel, which I don't have and don't want to pay for. Has anybody tried LibreOffice to scrape data from TOS?
I'm using Lynx, which is connected to Interactive Brokers. I'm quite happy with it. Same reason for me: I wanted to trade real options (and not only Optionsscheine, i.e. warrants). I even use their API (also Interactive Brokers), which is quite useful if you're interested in that.
Mods are bears and they keep deleting the AI 10x play. Below is my AI 10x play: go long 10x-leverage AI plays when OpenAI and Anthropic IPO; 10x upside with 10x leverage is an easy 100x in a short time. Below is my DD. Calls on AI companies and VCX. 100x tendies.

Everyone says AI is a bubble that's going to burst like the 2000 internet bubble, but I think they're wrong. AI at its current stage is actually undervalued and still has 5x to 10x upside left in the medium term.

**The numbers:** AI is at least good enough to do the work of most desk-based, purely computer-oriented jobs (aka white-collar jobs). Now, let's arrive at the market for these white-collar jobs.

World population: 8 billion. Assume 4 people per family, which gives 8/4 = 2 billion families, each with on average 1.5 working people, i.e. 3 bn total jobs. Of these, according to many sources, white-collar jobs number around 1.2 billion (up from roughly 800 mn globally in the early 2010s).

Not all of these jobs are prone to replacement, so assume even a conservative 30% can be replaced by AI: 1.2 bn × 0.3 = ~350 mn jobs that AI could do at least 80-90% of. These are your top white-collar jobs paying at least $50k annually once you average across geographies and experience. You might wonder whether someone working a white-collar job in Asia is going to be paid $50k, but they actually earn anywhere from $20k to $30k if it's a proper computer-based role, and in most developed countries the average pay for such jobs is $90k+, so on a weighted-average basis $50k per job per year is a very reasonable global assumption.

So the current annual cost to companies employing these jobs is 350 mn × $50k = $17.5 tn per year. That is the market that is the prime target of AI companies.
But not all such jobs will actually be replaced by AI, so assume 50-60% are. That gives a TAM of $10.5 tn (at 60% of the total possible market).

Second, no company is going to pay as much for AI as it pays a worker. So assume that in the long run, after AI companies adjust pricing, an AI subscription costs $25k per year (roughly what AI companies would need to charge to break even on compute costs, translating to $2.1k per month). Total revenue potential becomes $10.5 tn × 50% = ~$5.25 tn (since we assume AI companies charge $25k per year instead of the $50k cost per employee).

Since AI companies will mostly follow the Uber or Amazon model, once they break even, being purely compute-based, their profit margins could go as high as 30% at model maturity; but let's be conservative and assume a 15% net margin. So annual income from all these companies will be $5.25 tn × 15% = ~$800 bn in pure profit. Since most such companies can be expected to trade at 25x to 30x easily, given the high barrier to entry and critical nature of AI services, assume a P/E multiple of 30x: $800 bn × 30 = $24 tn. Round it to $25 tn for simplicity.

So the AI market is conservatively worth around $25 tn. As of now, most AI companies put together are valued at only $3-4 tn, so the undervaluation is massive; people are not looking at the total market AI will replace. The one place this could be wrong is that companies might not pay $25k per year, but be realistic: AI companies will hook users, get CEOs addicted to the AIs, and raise monthly fees until they break even no matter what. And ask any serious dev or heavy computer user: your job will require deep-research-type AI at least 10x a month, and that easily racks up API costs exceeding $1.5k-2k per month.
Hell, I myself use deep research 5 to 6x a day for work and our bill is over $50k per year per person. Plus these are business consumers, so they will definitely pay the fees, unlike individual subscribers who will cancel if it costs too much or stick to free tiers. Another thing: our numbers don't include any paying casual users, which could easily add $3-4 tn to the market. Realistically, I think there will be a business tax on AI usage to pay for UBI to compensate for job loss, but given how slowly legislation moves, there could be a gap between the government realizing the impact of job loss and UBI, and in that gap this value can be traded and profits can be made.
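For what it's worth, the arithmetic chain in the DD above is internally consistent. Replaying it step by step (every input is the poster's assumption, not data):

```python
# Replaying the post's back-of-the-envelope math; all inputs are assumptions.
families = 8e9 / 4                  # 2 bn families
total_jobs = families * 1.5         # 3 bn working people
white_collar = 1.2e9                # claimed global white-collar jobs
replaceable = white_collar * 0.30   # ~360 mn; the post rounds to ~350 mn
cost_to_companies = 350e6 * 50_000  # $17.5 tn/yr at a $50k global average
tam = cost_to_companies * 0.60      # $10.5 tn if 60% actually replaced
revenue = tam * 0.50                # $5.25 tn at $25k pricing vs $50k pay
profit = revenue * 0.15             # ~$790 bn; the post rounds to $800 bn
valuation = 800e9 * 30              # $24 tn at a 30x multiple
```

Every multiplier (30% replaceable, 60% adoption, 15% margin, 30x P/E) is a free parameter; halving any one of them halves the headline number, which is the real fragility of the DD.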
OpenAI’s current Sora API prices are about $0.10/sec for Sora 2 and $0.30/sec for Sora 2 Pro at 720p, with higher prices for higher-end outputs. That puts revenue at roughly $360/hour of delivered video for the base model and much higher for Pro tiers. By comparison, published 2026 cloud rates for top-end inference GPUs are often around $2–$4 per H100-hour on lower-cost providers and about $4+ per H100-hour on managed providers. Lambda currently lists H100 around $3.29–$4.29/hour, and Runpod advertises H100 from $1.99/hour. That means the break-even bar on raw compute is not crazy high. At $0.10/sec, if OpenAI were using an 8×H100 equivalent cluster at $4/GPU-hour, the raw compute cost would be about $32/hour of wall-clock generation. They would still break even on raw GPU spend as long as generating 1 second of output takes less than about 11.25 seconds of 8-GPU wall time. At $0.30/sec, that threshold rises to about 33.75 seconds; at higher Pro prices, the cushion gets much larger. That is an inference from the pricing math, not a disclosed OpenAI figure. Industry pricing also points the same way. Runway’s API pricing works out to about $0.05/sec for Gen-4 Turbo and $0.10–$0.40/sec for Veo-family models depending on tier and audio, so Sora’s API pricing sits in the same commercial band or above it rather than looking obviously underpriced. So my best read is: Yes, they are likely at least breaking even on the variable cost of serving many API calls, especially Pro tiers.
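The break-even claim above reduces to two divisions; the 8×H100-at-$4/GPU-hour cluster is the comment's assumption, not a disclosed figure:

```python
# Revenue per wall-clock hour of delivered video at the quoted API prices
revenue_base = 0.10 * 3600  # $360/hr at $0.10/sec (Sora 2, 720p)
revenue_pro = 0.30 * 3600   # $1080/hr at $0.30/sec (Sora 2 Pro)

# Assumed serving cluster: 8 x H100 at $4/GPU-hour (commenter's assumption)
cluster_cost = 8 * 4.0      # $32 per wall-clock hour

# Break-even: seconds of 8-GPU wall time allowed per second of output
breakeven_base = revenue_base / cluster_cost  # 11.25 s of compute per 1 s of video
breakeven_pro = revenue_pro / cluster_cost    # 33.75 s
```

As long as generating one second of video takes less wall time than the break-even ratio, the raw GPU spend is covered; training costs are outside this calculation entirely.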
I want to learn more about pulling options chain data into my spreadsheets. Can you say more about how you do that? Currently I use marketdata API to pull into Google sheets. Also beginning to investigate Schwab API .
> They're not burning cash through the API. The video generation absolutely is burning cash
They're not burning cash through the API. The app was burning cash because it was free. It costs around $2.50-$3.50 per hit on the API, depending on the aggregator site used to generate it.
Throwing out there that I've basically never used GPT; I use others. However, most people use GPT, and the most noise I heard about a model "getting worse" was about one of the GPT-4 branches. "Turbo"? I dunno, but it was silly noise because they also dropped the API token costs by 3x. It was very obviously an efficiency push: they shipped a model they thought had similar "AI oomph" but cost them less money. I don't think there was another hit like that after 4 (I could be wrong). Also, I want to say OpenAI will throttle more than the other models (shadow-switch users to a worse model). https://www.cl.cam.ac.uk/~is410/Papers/dementia_arxiv.pdf

The study I am talking about (and that people will throw out all the time) is basically the root of the idea that AI is going to kill itself by training off its own data that gets put on the internet. This isn't the problem people make it out to be; the TLDR is you can't leave it in a box and let it talk to itself, which was never a thing. AI researchers are well aware of the phenomenon. In fact, one of the most effective ways they've made models better is to have them generate their own examples to train the next model (synthetic data). It works remarkably well. The difference is guided vs blind recursion.
I looked at the API prices without reading about it being a price per second of a video, not for a 10s video, so nvm. It could still be profitable on the API though. Other API providers for video models charge similar kinds of money for models like Kling v2.6 or Veo 3.1 Fast. The pattern is that usually API pricing is profitable when you look at it in the vacuum without having to train the model first, but it's rarely profitable enough to recoup model training costs.
I'm pretty sure even their API is priced at a loss.
The Cursor thing is wild - getting caught red-handed using someone else's model while claiming it's your own flagship release is next level embarrassing. When you're valued at almost $30B and can't even bother to scrub the model IDs from your API calls, that's just sloppy. Moonshot being private kills any immediate play though, which is probably why this isn't getting more attention here.
Why would they only shut down the app, but let people burn cash through the API for it? Even Altman isn't *that* stupid. If you were just going to keep the API going, there wouldn't be a timeline for anything—you don't share a timeline on something staying around.
The article states the app is shutting down. The only mention of the API is in a direct quote from Altman: "We'll share more soon, including timelines for the app and API and details on preserving your work". So maybe the timelines are different? We'll see.
Oil to $0.05 a barrel. My insight? I have an API connected directly to 🥭's brain and it acts like a chatgpt API. Thing is, this API is giving me insight to what 🥭 is about to post on Lies social. Direct from the source before he even posts.
A single video on their API is $0.1-$0.3, and API pricing always provides a margin. So you're probably off by 20-30x. Training the model was super expensive, but serving is not.
You're not running an LLM in one HTML file. If you're hitting the Anthropic API, then you're using *someone's* account, and that data is going to Anthropic, which is fine, but the data ain't staying on your device.
Do we care about API or EIA reports anymore or is it just tweets and vibez until the war is done?
I disclaimed that I don't doubt there's insider trading going on. Obviously this admin is not above that. But also, yeah, there probably are AI bots that are quicker on the draw than Truth Social's server infrastructure. If it turned out that someone reverse-engineered an API call to pull posts before they're fully published/processed, that wouldn't be wildly shocking. By 15-20 min? No, probably not, but a couple of minutes is all you need when that kind of money is on the line. Maybe that sounds like a crackpot theory, but given he uses Truth Social like a press secretary, it'd be more surprising if nation states *hadn't* tried to unearth those kinds of exploits.
I use Unusual Whales. I found it really easy to customize and create my watchlist. It has the most advanced features in my opinion. Although I'm currently trying to switch from their UI to their API so I can track everything automatically.
Here's Claude's response:

Fair point — you should always review code before trusting it with credentials. In this case:
- The refresh token is read-only scope (can't place trades or move funds)
- It stays in a local .env file that's gitignored and never uploaded
- The server runs on localhost — all API calls go directly from your machine to Tastytrade, no third-party servers
- server.js is ~80 lines and fully auditable — you can verify in 2 minutes that it doesn't send your credentials anywhere
Nice one, buddy. I've built only a dashboard and a trading app with the Tastytrade API to be able to day trade options faster - without having to select contracts and go through the confirmation screens before submitting. Well done exposing the URL to the internet and accessing it on your phone.
Okay, but that may be a hallucination, and the way you ask is also very important. It isn't nefariously trying to mislead you, nor is it being told to do so to harm you. Get a spreadsheet with several examples of the problem solved and show that to it. Then tell it what problem you are trying to solve and ask it to make a program for you. If you feed it an API key from IBKR, that program can check for you every time you ask it.
If you’re looking for API that actually calculates gex levels for any stock and index have a look here https://flashalpha.com/docs/playground Docs : https://flashalpha.com/docs/lab-api-gex And this requires login, but it’s free. Works during market hours as it’s not cached : https://flashalpha.com/stock/tsla
Bloomberg's BQL is great but yeah, the cost is brutal for anyone not at a big firm. Options Metrics is worth checking out. They let you filter by the usual stuff like strikes, expirations, Greeks, IV, and you can save your favorite scans which is super handy when you're running the same strategies repeatedly for clients. The free tier gives you delayed data so it's fine for research and backtesting, but for live trading you'd need the Pro tier to get real-time numbers. Is that what you're after or are you wanting an API or something?
/u/therpgrad hooked me up with some GPUs, and I bought a fancy rig to go under them, so there's no API costs or anything. will just be keeping that rig alive and managing the vector db longterm.
I use Unusual Whales for all this data. I actually get it through their API but I know they have a UI with custom filters. You can get institutional flow there as well.
Unusual Whales is the best at this imo. You can set up watchlists and monitor for whatever unusual activity you'd like. I like their API a lot too. I automated alerts and send them right to my phone. Next step I want to train my agent to analyze these signals and tell me what to do.
Yeah no doubt, I probably could have saved myself some grief early on if I'd gotten more hands on with the research. I think Schwab is lacking documentation on their websocket/streaming data though. And it took some experimenting to figure out what we could get away with in terms of string/url lengths. But the API stuff seems mostly well documented in Schwab's dev portal
I've had good luck just asking Claude code to do the research itself. It was able to figure out how to get everything working no problem, just need to specify to ignore any source that refers to TD Ameritrade's API.
As noted separately - register as a Schwab developer and renew the token weekly. Note - Claude doesn’t know API details from Schwab. With your developer account you will need to copy/paste the details for the api endpoints of interest so Claude isn’t guessing and you’re getting frustrated. :) It will work quickly and consistently this way.
Great service - you need an account with holdings (no minimum), register as a Schwab developer, approve token use, provide Claude with API details, and create scripts to run daily or twice daily. You can use Polygon/Massive for other info (e.g. news) and free Yahoo for sectors, since some are missing from Schwab. This is the direct Schwab developer API - separate from ThinkorSwim, although you can also track with scripting there, but with less control than the API directly.
Unusual Whales has all 13F filings for tracking these guys. I think they added it to the API as well. I'm going to pull it into my model with their MCP server.
I have no programming experience so I asked claude code to explain the API a little bit. If you want to know something specific let me know and I'll pass it along. This stuff is very easy to vibe code for personal use.

> Schwab's Trader API (formerly TD Ameritrade's API) gives you programmatic access to your brokerage account — positions, balances, market data, option chains, and order placement. Authentication uses OAuth 2.0, which is the biggest hurdle to getting started. You register an app on Schwab's developer portal to get a client ID and secret, then go through a browser-based login flow where the user authorizes your app. This gives you an access token (valid ~30 minutes) and a refresh token (valid 7 days). The refresh token is the lifeline — if it expires because your app was offline for a week, the user has to manually re-authenticate through the browser. You'll want to store these tokens somewhere persistent (we use PostgreSQL) and build automatic refresh logic so your access token stays valid during market hours.
>
> The data available is comprehensive. Account endpoints give you positions, balances, and order history. The quotes endpoint lets you batch up to ~245 symbols per request for real-time quotes (price, bid/ask, volume, etc.). The option chain endpoint is particularly powerful — you pass a ticker and get back every available strike and expiration with greeks, bid/ask, open interest, and volume. Strike keys come back as decimal strings like "150.0" rather than integers, which is a quirk you need to handle. For order placement, you can submit, replace, and cancel equity and options orders with full control over order type, duration, price, and quantity. There's also a WebSocket streaming service (ACCT_ACTIVITY) that pushes real-time account events — order fills, cancellations, and status changes — with execution details like the exchange venue, route, and commission that you won't get from the REST API alone.
> The main restrictions are rate limiting and token management. You're capped at 120 API requests per minute, which matters when you're iterating across many tickers (fetching option chains, quotes, etc.) — we use 200ms delays between calls to stay under the limit. The quotes endpoint has a URL length limit that caps you at roughly 245 symbols per batch. The streaming WebSocket requires a separate authentication flow using "user preferences" to get the socket URL and credentials, and it needs periodic reconnection since Schwab rotates the session. The 7-day refresh token expiry means you can't just deploy and forget — if your server goes down for a week, someone has to log in again manually.
>
> For infrastructure, at minimum you need a server that can run persistently during market hours (not just serverless functions) if you want streaming data or scheduled jobs. We use a Node.js process on Railway for the WebSocket stream service and cron jobs, with Next.js on Vercel for the web UI and REST API routes. A database is essential for storing OAuth tokens, order records, and any historical data you want to track — we use Neon PostgreSQL with Prisma. If you're just doing basic things like pulling quotes or account data, you could get away with a simpler setup, but once you start placing orders or monitoring fills in real-time, you need the persistent server, reliable token storage, and a database to track state. The streaming service in particular can't run in a serverless environment since it maintains a long-lived WebSocket connection.
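The two hard numbers quoted above (the ~245-symbol quotes cap and the ~30-minute access-token lifetime) reduce to small helpers; the function names here are illustrative, not part of any Schwab SDK:

```python
import time

MAX_SYMBOLS_PER_QUOTE_CALL = 245  # URL-length cap on the quotes endpoint

def batch_symbols(symbols, limit=MAX_SYMBOLS_PER_QUOTE_CALL):
    """Split a watchlist into quote-endpoint-sized batches."""
    return [symbols[i:i + limit] for i in range(0, len(symbols), limit)]

def needs_refresh(issued_at, lifetime_s=30 * 60, margin_s=120, now=None):
    """True when the ~30-minute access token is inside the refresh margin,
    so the refresh-token flow should run before the next request."""
    now = time.time() if now is None else now
    return now - issued_at > lifetime_s - margin_s
```

Checking `needs_refresh` before every batch, rather than waiting for a 401, is what keeps a long-running scanner from stalling mid-session.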
Polygon is massive now, and it's just one service. Some limited use is free, but you'd want to use your broker's API to get the best results. The Yahoo Finance API is also quite useful. If your broker has a good API you can vibe code quite a ways with that and yfinance.
DLocal operates a one-stop payments platform enabling companies like Amazon, Microsoft, and Spotify to process transactions across more than 40 emerging markets. Its strength lies in simplifying complex local payment networks through a single API.
API Error: 500 {"type":"error","error":{"type":"api_error","message":"Internal server error"},"request_id":"req_011CYV8nyJ6D8LcyoKbGNSTv"}
Agree. This is the top comment of the post:

> If you listened closely Powell said that the economy would be in a really good spot if Trump didn't do tariffs and going to war with Iran.

I can tell you in financial services, it's an absolute bloodbath. Full stop. Massive outsourcing/offshoring for 5 years. Non-stop re-orgs. The tech is borderline failing at this point. Leadership's only move is to offshore and let go of workers, but they have to turn over after 4 years because you can literally see in the general ledger that they're destroying the business line. It's obvious the offshore workers have zero background in any of this, and a significant proportion have fake resumes.

All the work is through contractual agreements with IT consultancies in India, so the bureaucracy is insane. For example, offshore workers aren't allowed access to certain databases. But there's a mandate to offshore tons of the data analyst/reporting work. The question is: how will they build the reports if they can't access the databases? Well, we build a new staging area. Then build the ETL pipeline. Then load into a new database they can access. Then they query from that database to return CSVs in their local environment. Then run the Python scripts. We basically spent 2 years building out the jankiest data pipelines in history for them to get all the reporting wrong. This is the "automation" the MBAs have been preaching about.

I'm convinced it's some sort of corporate raid. The only thing holding the company together are COBOL mainframes from the 1980s, which a rogue API from the offshore dev teams recently took down for half a day...
WSB take: if AI agents really replace apps, then the money is in whoever owns distribution (the OS layer) and whoever owns the "agent to API" marketplace. Everyone else becomes a commoditized backend.

Non-meme take: reliability and permissions are going to decide who wins. An agent that can do 80% of tasks but sometimes clicks the wrong thing is dead on arrival. I've been following some practical agent reliability/evals stuff here: https://www.agentixlabs.com/blog/
You could try pulling delayed market data via Google Finance or an API like Alpha Vantage, then update Sheets automatically.
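For what that looks like in practice, an Alpha Vantage request is just a GET with a few query parameters; you could feed the resulting URL to Apps Script's `UrlFetchApp` or to a small Python job that writes into Sheets. `GLOBAL_QUOTE` is a real Alpha Vantage function for a delayed price snapshot, but the helper name below is my own:

```python
import urllib.parse

def alpha_vantage_url(symbol, api_key, function="GLOBAL_QUOTE"):
    """Build an Alpha Vantage request URL. GLOBAL_QUOTE returns a
    delayed price snapshot you can write into a Sheets cell."""
    params = {"function": function, "symbol": symbol, "apikey": api_key}
    return "https://www.alphavantage.co/query?" + urllib.parse.urlencode(params)
```

The free tier is rate-limited to a handful of requests per minute, so this suits a periodic refresh of a watchlist, not tick-by-tick updates.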
Interesting project! Some might prefer raw sentiment or positioning data over just news, could complement your API.
Honestly I think a lot of people are still underestimating how much the infra side of finance is changing, not just stocks. I've been following a few industry reports lately (especially from FinanceFeeds) and a big theme is how brokers are quietly moving towards multi-asset, API-driven platforms. Retail still sees “stocks vs crypto”, but the backend is becoming way more unified. If that plays out, companies enabling infra (liquidity, APIs, compliance automation) might outperform typical retail-facing apps. Curious if anyone else is tracking this angle or still focused only on equities?
Value play. This is a 1-year+ hold at least. The biggest potential, as stated by the CEO on the latest conference call, is the onshoring of API manufacturing into the US, which the CEO said is run and approved at the White House level...
I run a somewhat similar correlation strat for NQ. The issue usually isn't the code, it's the cost of the data feeds. Getting reliable, granular options flow and real-time Greeks via API is expensive af.
Still waiting on the guy to have a conversation with this brokerage integrated API LLM agent and accidentally blow up his account due to an AI hallucination. 🤡
Exactly, just take a look at the pricing of just bare-bones H100s and go all the way up to serverless/model API providers; it literally goes from $1.50/hr on vast.ai to $10/hr for HF/AWS serverless
Now the platforms are not only rule-based. They have MCP + LLM, which can easily do that kind of event-driven analysis. Nice ones need descriptions, though. The API you are talking about is more of an MCP layer
Public Brokerage has a decent API, and pretty good rebate
When people compare CRM - or other CRM/ERP products and companies within this field - to ServiceNow and praise ServiceNow.. Have these people worked with ServiceNow before, or do they mostly view ServiceNow as this magnificent workflow automator? ERPs and CRMs appear as slow because they are handling immense amounts of data, corresponding to the sometimes data-heavy loads and requests users make. I have worked with ServiceNow before. All I see is a glorified ticketing system that could just as easily be threatened by competitors. Their API structure is.. decent but not more than that. Are people willing to bet on a general plug and play API request software company that doesn’t have any specific moat - at least as far as I can tell? Sorry, I just don’t buy it. My humble opinion, I really don’t get the craze about ServiceNow so please enlighten me
> I lose money through API keys trying to vibecode a trading bot

somehow this puts everything else you said to shame lmao
I lose money in the market. I lose money through API keys trying to vibecode a trading bot. I lose money shorting oil. I lose money on Polymarket bets.
For options specifically, there’s no clean copy-trading solution like forex/futures. Most trade copiers are built for MT4/MT5 or futures platforms, not options chains with strikes/expiries. So in practice, options traders either use broker-level multi-account tools (limited) or manually execute. Even prop setups like FundingRock don’t support options, so you won’t get copier-style scaling there either. If you need true mirroring, you’re basically looking at custom API builds, not plug-and-play tools.
Memes aside, they can easily achieve profitability by 2030. Ads are a potential $250+ Billion a year in annual revenue that they are only just starting to explore, and their enterprise API usage is growing rapidly.
To grow, they still need new customers. The switching cost for Salesforce goes both ways: it's expensive to switch from, but you could also argue that it's even more expensive to switch to. Source: I just led a full setup and rollout of Sales Cloud a few years ago. It cost about 5x what was originally estimated by our Salesforce account manager once you factor in consultants and the new team members that are necessary for a Salesforce product to work properly and integrate into our existing ERP software. Leadership was blindsided and the rollout scope became extremely narrow in order to get a working product to users. As soon as the contract is up, we're out and will never be coming back.

I will say, their APIs are a little crazy the way they're structured (so many separate APIs), but the one thing they do really well is roll out changes in a way that makes it easy to maintain existing integrations. My ERP vendor maintains 1 live API version, so when they change something it instantly breaks existing connections. At least SF rolls them out in multiple versions and deprecates old versions with plenty of warning, so you have time to fix them before they break.
Thank you, would this be the right approach?

**1. Barchart OnDemand (15-min delayed / paid real-time)**

Barchart Premier gives you access to the site's tools, but the OnDemand API is technically a separate service (though they often have free tiers for small volumes).

- Endpoint: for options, you typically use getFuturesOptions or a custom getData query.
- The script: Google Sheets cannot natively scrape Barchart's web interface easily because the site uses scripts to load data (which breaks IMPORTXML). You must use Apps Script.

Basic setup:

1. In Google Sheets, go to Extensions > Apps Script.
2. Use the UrlFetchApp.fetch command to call the Barchart API URL.
3. Parse the JSON and write it to your cells.

**2. The Tradier API method (real-time)**

If you have a Tradier brokerage account, you can get unlimited real-time data for free. This is often the preferred "clean" method for Sheets power users.

- Endpoint: https://api.tradier.com/v1/markets/options/chains
- Request logic: you send a GET request with your API key in the header, specify the symbol and expiration, and the API returns clean JSON of the entire chain.
> They are losing $2B PER MONTH and their losses are increasing each month

Cash burn is not concerning when you consider how fast they are growing and the value of the technologies they are building. They are raising hundreds of billions of dollars; they can afford to burn cash.

> while their models are starting to get beaten by both anthropic and google, with zero moat.

GPT 5.4 is still by far the best model on the market for most use cases; it's not even close. The big thing is that Google and Anthropic are heavily focused on overfitting to benchmarks (especially Google), whereas OpenAI focuses on real-world performance. OpenAI has a strong moat both from brand recognition and from having superior technology.

> They also saw a 5% drop in users this month.

Simply not true. The only evidence I can find justifying this claim is a website claiming 1.5 million people signed a petition saying they will leave ChatGPT. But this is only 0.2% of users, and that petition does not verify:

- How many of those people actually used ChatGPT, or were paying customers
- Whether they actually followed through with their pledge (people rarely follow through with boycott threats)
- If the signatures are duplicates/bots

OpenAI's own numbers suggest users are growing 10% monthly, not declining. Lastly, IMO consumer usage isn't the main opportunity for OpenAI long term. It's their agentic products and API.
For real-time options data into Google Sheets, the cleanest free method is the **Tradier API**. Free tier gives you delayed data, paid tier is real-time. You call the `/options/chains` endpoint, parse the JSON, and pull it into Sheets via `IMPORTDATA` or Apps Script. For 15-minute delayed data, **Barchart's OnDemand API** has a free tier that works for small volumes.
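That Tradier flow can be sketched with only the Python standard library. The endpoint and the `symbol`/`expiration` parameters are from Tradier's public docs; treat the exact field names flattened in `to_rows` as illustrative, and check the response against the docs before wiring it into Sheets:

```python
import json
import urllib.parse
import urllib.request

TRADIER_URL = "https://api.tradier.com/v1/markets/options/chains"

def fetch_chain(api_key, symbol, expiration):
    """GET the option chain; the bearer token goes in the Authorization header."""
    qs = urllib.parse.urlencode(
        {"symbol": symbol, "expiration": expiration, "greeks": "true"})
    req = urllib.request.Request(
        TRADIER_URL + "?" + qs,
        headers={"Authorization": f"Bearer {api_key}",
                 "Accept": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

def to_rows(payload):
    """Flatten the chain JSON into rows ready for a Sheets range."""
    opts = (payload.get("options") or {}).get("option") or []
    return [[o["symbol"], o["strike"], o["bid"], o["ask"]] for o in opts]
```

From Apps Script the equivalent is a single `UrlFetchApp.fetch` call with the same `Authorization` header, then `JSON.parse` on the response body.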
It uses the GDELT API (https://www.gdeltproject.org/) and runs 2 parallel themed queries against GDELT on each context refresh, focused on events most relevant to financial markets. Things like trade disputes, energy policy, regional conflicts, and central bank actions.
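For reference, a themed query against GDELT's public DOC 2.0 API is just a GET with a handful of parameters; the endpoint below is GDELT's documented one, while the helper name and the example query string are my own illustration:

```python
import urllib.parse

GDELT_DOC_API = "https://api.gdeltproject.org/api/v2/doc/doc"

def build_query_url(query, max_records=50, timespan="1d"):
    """Build a GDELT DOC 2.0 API URL for an article-list query
    (mode=ArtList, JSON output) over the last `timespan`."""
    params = {
        "query": query,
        "mode": "ArtList",
        "format": "json",
        "maxrecords": str(max_records),
        "timespan": timespan,
    }
    return GDELT_DOC_API + "?" + urllib.parse.urlencode(params)
```

Running two themed queries in parallel, as described above, is then just two of these URLs fetched concurrently and merged on each context refresh.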
You can do this with a small Python script + Google Sheets API. A simple approach is to use **Yahoo Finance options data (free, ~15 min delayed)** and push the results into Sheets. Libraries like yfinance let you pull the full option chain, including bid/ask and IV.
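A minimal sketch of that approach, assuming the third-party yfinance package is installed (`pip install yfinance`). The `rows_for_sheets` helper and its column list are my own illustrative names, not part of yfinance; `Ticker.options` and `Ticker.option_chain` are real yfinance APIs:

```python
COLS = ["strike", "bid", "ask", "impliedVolatility"]

def rows_for_sheets(records, cols=COLS):
    """Turn option-chain records (list of dicts, e.g. from df.to_dict('records'))
    into a header row plus value rows, the shape the Sheets API expects."""
    return [cols] + [[r.get(c, 0) or 0 for c in cols] for r in records]

def pull_chain(symbol, expiration=None):
    """Fetch the ~15-minute-delayed chain for one expiration via yfinance."""
    import yfinance as yf  # assumption: third-party package, imported lazily
    tk = yf.Ticker(symbol)
    exp = expiration or tk.options[0]        # default to nearest expiration
    chain = tk.option_chain(exp)             # named tuple with .calls / .puts
    calls = rows_for_sheets(chain.calls.to_dict("records"))
    puts = rows_for_sheets(chain.puts.to_dict("records"))
    return calls, puts
```

From there you'd hand the row lists to gspread or the Sheets API's `values.update` to land them in a tab on a schedule.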
Hey OP - this is pretty cool. Wondering if it's compatible with the IBKR API?
Can't argue with that- we don't offer API access at this time. Will have to resolve concerns about exposing our IP
This is very helpful; many thanks! I am waiting on a reply from volsignals about API access. ODs data subscriptions might be the most effective long term use of my money for my purposes. I can easily see how i can use their feed to manage my trades based on my current strategy automation.
Nah, it's lobbying I'm sure. He seems to be building up the infrastructure for age verification in the US that requires an API; I'm sure he will charge people for it.
Today everyone is confused about the AI bubble ("will it pop," "when is it going to pop") because the market is correcting itself right now. If we look back 4 years, when everything was new and ChatGPT 3 was released, there were many new startups opening every single day which were essentially just a wrapper around a ChatGPT API key, marketed at some niche problem that ChatGPT itself would do much better (which also holds true today; I am only talking about small startups). There were also some unicorns among them which genuinely solved real problems.

But the problem now is that as the industry grows, like any industry, with more and more hype every day, the competition grows and it gets harder to compete; even the better tools are being replaced. For example, Blackbox AI was once best for coding, but now people use Claude Code; even ChatGPT has been replaced by Claude for coding now.

The problem is not whether there is a bubble or whether it is going to pop. The main thing we all need to focus on is what it is going to change once everything settles down. What new opportunities will it create? There is a storm right now where even the most uneducated people are creating startups, people who don't know a thing about computers or software, and the fun thing is they also manage to get investors, because investors want to win; they care about winning. And it's not that the investors are stupid; it is just that the technology is very new and it is hard to pick a winner, so people with even a decent pitch get the capital, even if after 6 months that startup shuts down and the investors lose all their money. The investors' mindset is: fund the crowd, and if even one or two become winners, it's a gold mine for them.
Wish Apple would dump that shitty Yahoo Finance stocks API
Custom software I built. API to Tradier
The problem is, like the original internet boom it is full of garbage that is going to fail. Basically anyone that has built a business around calling the ChatGPT API to do something that isn't that valuable and doesn't pay for the tokens it uses is liable to fail.
They have revenue, use cases, customers, and are getting into military work. Now, OpenAI might be overleveraged, but Anthropic is already turning a profit on API calls and is projected to become profitable in a couple of years. Way different situation than pets.com getting a $300 million valuation by selling litter online at a loss.
US to release emergency reserves... starting with almost 200m barrels immediately, more to follow. Pipeline flows out of the Bakken and Permian basins picked up overnight according to backend API flow data.
The worst part is it's like 19 API, and it's not even that aromatic/heavy. So you don't get volume swell and good diesel yield. Just cracks to light ends in your coker. We valued it at ANS-20.
I want to make sure this works and catches edge cases before creating API automation. Not gonna waste my time if it sucks
I reverse engineered Robinhood API last year and have been collecting options and futures data information for many months now. I have something like 80,000 "snapshots" so far with full Greeks data, volume, bid, ask etc The idea is to use LLM power to analyze and find "true market edge". So far out of 180,000 strategies tested not a single one actually showed "true market edge". Many show positive P&L but fail in reality. the market CANNOT be predicted, never fall for any bot trading or anything like that it all fails the "true market edge" test. and this is why I just follow the trend and not try to predict anything anymore.
Not running it locally, it’s deployed on Vercel with PostgreSQL on Azure, so the scanner just runs as an API call. The pipeline also cuts aggressively before the time consuming calls. Starts with [N] stocks, filters down to 18 finalists, then runs the heavy enrichment only on those. Never blasting all [N] through the full thing.
For market data: Finnhub Premium (news, analyst ratings, insider activity, earnings history, peer groups), TastyTrade API (live options chains, Greeks, IV rank, liquidity ratings — free if you have a TastyTrade account), FRED for macro data (free, Federal Reserve), SEC EDGAR for filings and business descriptions (free), and xAI/Grok for social sentiment.
Disagree, they pull in news from all sources. There's no chance you'll miss a single piece of news of a stock. If it gets cluttered you can filter on news type to find what you need. I'm gonna be honest most brokerages don't even have a good news page like IBKR, I doubt you'll be able to do it better. Comes with a ton of API integrations and algorithmic filtering. Portfolio tracking works fine, there are some quirks with it like deposits counting as profit made but every broker app I've used does that. You absolutely want a UI to be complex, that's the whole point of a professional broker app. If you want something simple go to Robin Hood. And their web interface really isn't that complex in the first place.
I'll just leave this here: The American Petroleum Institute (API) releases its Weekly Statistical Bulletin, which includes data on U.S. crude oil inventories, every Tuesday afternoon. This report provides insights into the weekly changes in crude oil supply and can influence market prices.