Reddit Posts
Download a dataset of stock prices for X tickers for yesterday?
The tech market brings important development opportunities; AIGC is firmly No. 1 in the current technology field
The AIGC market brings important development opportunities as artificial intelligence technology keeps developing
Avricore Health - AVCR.V making waves in Pharmacy Point of Care Testing! CEO interview this evening as well.
OTC : KWIK Shareholder Letter January 3, 2024
The commercialization of multimodal models is emerging, Gemini now appears to exceed ChatGPT
Why Microsoft's gross margins are going brrr (up 1.89% QoQ).
Why Microsoft's gross margins are expanding (up 1.89% QoQ).
Google's AI project "Gemini" shipped, and so far it looks better than GPT4
US Broker Recommendation with a market that allows both longs/shorts
A Little DD on FobiAI, which harnesses the power of AI and data intelligence, enabling businesses to digitally transform
Best API for grabbing historical financial statement data to compare across companies.
Seeking Free Advance/Decline, NH/NL Data - Python API?
Delving Deeper into Benzinga Pro: Does the Subscription Include Full API Access?
Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration
Aduro Clean Technologies Inc. Research Update
Option Chain REST APIs w/ Greeks and Beta Weighting
$VERS Upcoming Webinar: Introduction and Demonstration of Genius
Are there pre-built bull/bear systems for 5-10m period QQQ / SPY day trades?
Short Squeeze is Reopened. Play Nice.
Created options trading bot with Interactive Brokers API
Leafly Announces New API for Order Integration($LFLY)
Is Unity going to Zero? - Why they just killed their business model.
Looking for affordable API to fetch specific historical stock market data
Where do sites like Unusual Whales scrape their data from?
Twilio Q2 2023: A Mixed Bag with Strong Revenue Growth Amid Stock Price Challenges
[DIY Filing Alerts] Part 3 of 3: Building the Script and Automating Your Alerts
This prized $PGY doesn't need lipstick (an amalgamation of the DD's)
API or Dataset that shows intraday price movement for Options Bid/Ask
[Newbie] Bought Microsoft shares at 250, mainly as I see value in ChatGPT. I think I'll hold for at least 6 more months, but I'd like your thoughts.
Crude Oil Soars Near YTD Highs On Largest Single-Week Crude Inventory Crash In Years
I found this trading tool that's just scraping all of our comments and running them through ChatGPT to get our sentiment on different stocks. Isn't this a violation of Reddit's new API rules?
I’m Building a Free Fundamental Stock Data API You Can Use for Projects and Analysis
Fundamental Stock Data for Your Projects and Analysis
Meta, Microsoft and Amazon team up on maps project to crack Apple-Google duopoly
Pictures say it all. Robinhood is shady AF.
URGENT - Audit Your Transactions: Broker Alters Orders without Permission
My AI momentum trading journey just started. Dumping $3k into an automated trading strategy guided by ChatGPT. Am I gonna make it
The AI trading journey begins. Throwing $3k into automated trading strategies. Will I eat a bag of dicks? Roast me if you must
I made a free & unique spreadsheet that removes stock prices to help you invest like Warren Buffett (V2)
To recalculate historical options data from CBOE to find IVs at the moment of trades, what interest rate should I use?
WiMi Hologram Cloud Proposes A New Lightweight Decentralized Application Technical Solution Based on IPFS
$SSTK Shutterstock - OpenAI ChatGPT partnership - Images, Photos, & Videos
Is there really no better way to track open + closed positions without multiple apps?
List of Platforms (Not Brokers) for advanced option trading
Utopia P2P is a great application that needs NO KYC to safeguard your data!
Utopia P2P supports API access and ChatGPT
Stepping Ahead with the Future of Digital Assets
An Unexpected Ally in the Crypto Battlefield
Utopia P2P has now an airdrop for all Utopians
Microsoft’s stock hits record after executives predict $10 billion in annual A.I. revenue
Reddit IPO - A Critical Examination of Reddit's Business Model and User Approach
Reddit stands by controversial API changes as situation worsens
Mentions
I gather it from a variety of sources, including Schwab APIs, FINRA, etc.
I wouldn't. This is feeding into a software model, so why would I go software -> natural language -> LLM -> structured data embedded in natural language -> structured data -> software when I could just go software -> fundamentals API -> structured data -> software? LLMs are useful for exploratory research where natural language is fastest. But I'm not interested in incorporating natural language and all the associated risks of an LLM into analytical software, that would be nuts.
Getquin's pretty good, has an app and API access to a lot of major trading platforms
I didn't say "software lock" was the only reason, if you want to term it that way. As an ML engineer you most likely aren't involved in infra and hardware architecture; you are just the consumer of it. It's just a "tool" provided to you. If I am a large SaaS software provider, I can deploy to AMZN AWS, MSFT Azure, GOOGL GCP or ORCL OCI using Kubernetes and deal with the small nuances of different cloud providers' APIs. Not ideal, but achievable. What you're talking about is nearly complete rewrites of code. Substantially less than ideal.
Thanks! We’ve tried with Silexx before but they were looking for a more institutional-style API vs what we offer (which is more consumer friendly), though we’ll follow up with them. We can also check with Sterling.
Now that the API is live. Which front ends can we connect to place trades from?
I have made $300 across 7 RKLB transactions this morning. Peak efficiency. Need to look at automating this shit. Anyone have experience automating trading with the Schwab API/dev tools?
Curious… how did you get a bot to work? I'm super interested but not very technically inclined (to the degree of coding, APIs, etc). I tried getting a Schwab Trading API to work, but many failed attempts later made me hopeless lol.
I've only casually looked at Polygon's data API and I've never looked at their sentiment data. I'm just aware that it exists. Assuming that you are using that data for back-testing or algotrading, you could look in r/algotrading and r/quant to see if those communities have favorites. There are also backtesting tools like QuantConnect that have integrations to popular datasets. Some of those datasets offer sentiment data but I'm unfamiliar with them - [https://www.quantconnect.com/datasets/](https://www.quantconnect.com/datasets/) - that may give you some ideas of other data sources.
What API do you use for fundamentals?
This is the 4th or 5th time I’ve seen something like this in a week; is this the new meme? ChatGPT is amazing, and as everyone seems to be finding out this week, you need to provide it with the data; it doesn’t fetch it itself. To do that you may have to learn Python to download metrics from places like Yahoo Finance via APIs. But that is the only “difficult” part! Every day, give ChatGPT the last 2 years of historic pricing on a stock, give it the last year of options tables (ibstock API I think?), and make sure you give it all the metrics it could ever want: VIX, RSI, EPS… all of them! Upload it and then ask for its advice on options. Backtest to your heart's content and so much more. Something prompted me to learn about ChatGPT’s relationship with stock options a couple of weeks ago, so maybe it’s the new meme?
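For anyone wondering what that "download metrics via APIs" step looks like in practice, here is a minimal sketch assuming the third-party yfinance package; the ticker, lookback window, and indicators are illustrative only, not the commenter's actual workflow:

```python
# Minimal sketch: pull two years of daily prices with yfinance, add a couple of
# simple indicators, and dump everything to CSV for uploading into a ChatGPT chat.
# Assumes: pip install yfinance
import yfinance as yf

ticker = yf.Ticker("MSFT")
hist = ticker.history(period="2y")          # daily OHLCV for the last two years

# Simple derived metrics (14-period RSI using simple moving averages, 50-day SMA)
gains = hist["Close"].diff().clip(lower=0).rolling(14).mean()
losses = (-hist["Close"].diff().clip(upper=0)).rolling(14).mean()
hist["RSI_14"] = 100 - 100 / (1 + gains / losses)
hist["SMA_50"] = hist["Close"].rolling(50).mean()

hist.to_csv("msft_history.csv")             # upload this file to the chat
print(hist.tail())
```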
Would you share what data source it’s using for the live updates? Like on the backend where is it drawing the data from. Some package or API? Which one
It is intriguing to see where Meta is putting its money, for sure! On a related note, I have been working with Filter B0unce for my email verifications while doing cold outreach, and it has been solid. They have kept my bounce rate under 1%, which has been crucial for my deliverability. The real-time verification API is helpful for tricky domains too. If you are doing cold emailing, do not overlook this; it is just $10/month and includes 300 free verifications monthly!
I use an API to get option chain data and calculate greeks++ myself... Check out my Substack and Twitter: Theta Relay
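For readers curious what "calculate greeks++ myself" typically involves, here is a hedged Black-Scholes sketch, not the commenter's actual pipeline; the spot, strike, rate, vol, and expiry inputs are placeholders:

```python
# Sketch: Black-Scholes Greeks from option-chain style inputs.
from math import log, sqrt, exp, pi, erf

def norm_pdf(x):
    return exp(-x * x / 2) / sqrt(2 * pi)

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def bs_greeks(S, K, r, sigma, T, call=True):
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    delta = norm_cdf(d1) if call else norm_cdf(d1) - 1
    gamma = norm_pdf(d1) / (S * sigma * sqrt(T))
    vega = S * norm_pdf(d1) * sqrt(T) / 100               # per 1 vol point
    carry = (r * K * exp(-r * T) * norm_cdf(d2) if call
             else -r * K * exp(-r * T) * norm_cdf(-d2))
    theta = (-S * norm_pdf(d1) * sigma / (2 * sqrt(T)) - carry) / 365  # per day
    return {"delta": delta, "gamma": gamma, "vega": vega, "theta": theta}

print(bs_greeks(S=100, K=105, r=0.05, sigma=0.25, T=30 / 365))
```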
Pattern day trading point and API access are huge ones. Super helpful, thank you
Interested in the link too. What financial API are you using?
The most important piece, the API code to collect and save the data, is not here 😅
Appreciate the kind words — I’m glad to hear OptionColors was valuable to your workflow. While platforms like OptionVue and OptionNET Explorer each serve a niche, the goal behind OptionColors has always been to push volatility and statistical modeling to a level typically reserved for institutional desks and proprietary tools. The platform was built with prop firms, advanced retail traders, and professional strategists in mind — particularly those working with time spreads and multi-leg structures, where term structure, skew, and precise volatility modeling are critical. We’ve recently integrated the Schwab API to support TOS users more directly and continue to evolve the tools to help close the gap between institutional analytics and what's available to independent traders. If you ever want to revisit or see what’s been added since your subscription, we’d be happy to walk you through the updates.
Going to vibe one out real quick with your bullet points. I'm using finnhub for the API. Will let you know what monster comes out of it.
If someone copy & pastes your strategy, Deep Research is going to ask follow up questions, we can't just copy and paste and go to town. For example, this is what it told me: **Thanks for sharing this detailed strategy blueprint. To help you further, could you clarify what kind of research or assistance you need right now? For example:** * **Do you want help identifying the best data sources/APIs for each module?** * **Are you looking for implementation code examples for the AI models or trading strategies?** * **Would you like help selecting a broker API and evaluating its features?** * **Are you asking for tools or platforms that can support this kind of architecture?** **Let me know what part you'd like to dive into next.** So how should I proceed?
It's like this whole site. Ever since the API change... Seems like all the not-insane mods left and Reddit hired the worst of those still around to do it full time.
Public just launched their API: [https://public.com/api](https://public.com/api) For options it pays a rebate of up to $0.10 per contract traded Docs here: [https://public.com/api/docs](https://public.com/api/docs)
Gemini sucks. Have an API fetch real time data into your custom GPT
Collecting your own intraday volatility stats is simpler than it looks. Pull 1-min bars for the tickers you scalp and bucket them into 30-min slots, then calculate average true range or ATM straddle cost for each slot. Over 60 trading days you’ll see the spike at the open flatten fast and a smaller bump in the last hour. From there you can size positions by expected cents per minute instead of gut feel. I grab data through Interactive Brokers’ TWS API, clean it in pandas, and pipe the plots into a Thinkorswim watchlist; QuantConnect works too if you’d rather stay cloud-side. APIWrapper.ai let me stitch option chain Greeks into the same frame without writing extra endpoints, which makes the straddle math painless. Real numbers beat rules of thumb and stop you from forcing trades when the juice just isn’t there.
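A rough pandas sketch of the bucketing step described above, assuming 1-minute bars have already been pulled from a broker feed (e.g. the IBKR TWS API) into a DataFrame with a DatetimeIndex and `high`/`low`/`close` columns; the column names and 30-minute slots are assumptions:

```python
# Bucket 1-minute bars into 30-minute slots and compute average true range per slot.
import pandas as pd

def intraday_atr_profile(bars: pd.DataFrame) -> pd.Series:
    prev_close = bars["close"].shift(1)
    true_range = pd.concat([
        bars["high"] - bars["low"],
        (bars["high"] - prev_close).abs(),
        (bars["low"] - prev_close).abs(),
    ], axis=1).max(axis=1)

    # Assign each bar to its 30-minute slot (09:30, 10:00, ...) and average
    # across every day in the sample.
    slot = bars.index.floor("30min").time
    return true_range.groupby(slot).mean()

# profile = intraday_atr_profile(one_minute_bars)
# profile.plot(kind="bar")   # the open-hour spike should stand out
```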
This is all very well-known. Volatility is greatest at the open and declines steadily throughout the day, as you can see from the price of the ATM Straddle, which is a proxy for volatility. This has important implications. Yes, you can trade later in the day, but the same move at 3:00 pm EDT will make you much less than if it occurred at 9:45 am EDT. The best way to see this is to collect data for yourself and plot it in Excel. You can do this through your brokerage’s API, if you’re a developer. Use an LLM to help you.
A couple STEM degrees: I have 2 in mathematics and comp science with minors in physics and chemistry. Then familiarity with lots of regulations, like NEC for electrical for things like Class 1 Div 1 fitting ratings, or things like the API (American Petroleum Institute) 751 recommendation that you don't use silicon in HF Alkylation units. Alongside other craft specialties like what TC boilers need, what metallurgy is used for high nitric acid, pumps and cavitation, BFW (boiler feed water) treatments. When purity of chemicals matters. HIC (hydrogen induced cracking) and our friend hydrogen permeation. Reaction batch processes vs continuous processing. FAA lighting requirements for elevated towers. All the railroad laws. Then you get into control and programming and questions like which flow meter is best for this process: pitot tube, annubar, orifice plate, thermal mass, coriolis, vortex, mag flow, etc. Then you get into analyzers and stupid one-off shit like why the simplest-ever Reid vapor pressure analyzer doesn't work at this plant and it's costing them millions of dollars because they feed the Denver area and the EPA change is going into effect and they can't meet spec. All the safety standards. It's an insane amount of cross-craft / specialized knowledge. But the pay is fucking awesome, the problems are always things teams of people couldn't tackle, and every day is interesting. As to how you get into doing what I do, I'm not sure I could navigate into this role if it had been my goal all along. A lot of it was luck, and being the guy that solved the problems before so you get the call next time or get your name dropped, as in [you should hire X]. I'm sure there are smarter, more experienced, and more educated people than me who are not in this type of role.
Check out Lime Trading for options and 401k transfers... their API is fast and execution is reliable. I moved my portfolio there last year and the process was smooth. They have solid tools for exploring stock options too.
I had tried running a backtest, but then ChatGPT said it does not have historical option prices. It asked to connect to an API. I was using the paid version. How did this work for you?
u/Plastic-Edge-1654 How do you make sure that ChatGPT really gets the real-time option IV surfaces, Greek chains and so on for processing? Via API?
I’d need to automate the entire process, including having the automation include a step to call ChatGPT’s API and input my prompt. For example, I could program it so that if a trade hits a 33% profit threshold, the algorithm automatically closes the trade. After that, it would send a new prompt to ChatGPT via the API, asking for the next trade idea. Then the algorithm could parse ChatGPT’s response and use it to execute a new trade.
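A hedged sketch of that loop using the OpenAI Python SDK; the broker-side calls (`get_open_position`, `close_position`, `place_order`) are hypothetical placeholders, not any real broker API, and the parsing is deliberately simplified:

```python
# Close a position at a profit threshold, then ask the OpenAI API for the next
# idea and parse the structured reply.
import json
from openai import OpenAI

client = OpenAI()          # reads OPENAI_API_KEY from the environment
PROFIT_TARGET = 0.33

def run_cycle(broker):
    pos = broker.get_open_position()                 # hypothetical broker SDK call
    if pos and pos.unrealized_return >= PROFIT_TARGET:
        broker.close_position(pos)                   # hypothetical broker SDK call

        resp = client.chat.completions.create(
            model="gpt-4o",
            messages=[{
                "role": "user",
                "content": "Suggest one options trade as JSON with keys "
                           "symbol, strategy, strike, expiry, rationale.",
            }],
            response_format={"type": "json_object"},
        )
        idea = json.loads(resp.choices[0].message.content)
        broker.place_order(idea)                     # validation omitted for brevity
```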
This was the prompt. Must be using deep research.

CORE STRATEGY BLUEPRINT: QUANT BOT FOR OPTIONS TRADING

1. Define the Trading Universe
   • Focus: S&P 500, high-volume tech stocks, and sector ETFs
   • Options: High open interest, tight spreads, weekly/monthly expirations
2. Data Pipeline Setup
   • Real-Time & Historical Market Data: Prices, implied volatility, Greeks
   • Economic Indicators: Fed statements, unemployment, CPI, interest rate forecasts
   • Political Signals: Election cycles, sanctions, fiscal policy shifts
   • Weather/Natural Disaster Alerts: NOAA, global weather APIs, satellite feeds
3. AI Signal Generation Modules
   • Natural Language Processing (NLP): Scrape financial news, SEC filings, Reddit, X/Twitter, Fed transcripts; use sentiment analysis + event detection to score macro/micro triggers
   • Time Series Forecasting: Use LSTM, Transformer models, or Prophet to predict stock volatility and price direction
   • Event-Based Triggers: Geo-political tension spikes, hurricane paths near oil fields, unexpected earnings
4. Quant Models to Deploy
   • Delta-neutral strategies: Iron condors, butterflies
   • Volatility-based: Straddles/strangles pre-event, IV crush post-event
   • Directional: Covered calls, puts, debit spreads using ML-classified trend signals
5. Backtest Engine
   • Simulate trade strategies over 5–10 years
   • Adjust for: slippage, commissions, margin use, volatility skew, liquidity
6. Execution Layer
   • Broker API: Interactive Brokers or Tradier
   • Build custom scripts to auto-place trades when your bot’s edge is confirmed (> X signal score)
7. Risk Management System
   • Max drawdown control
   • Position sizing based on Kelly Criterion
   • Volatility-adjusted leverage
   • Daily VaR limits
   • Portfolio correlation monitoring
You need to add an API linked to live data
Looks manual. Afaik Robinhood doesn’t have an (official) API for regular trading, only for crypto.
Here is what I entered based off of a subthread above. Did I do it right, OP?

I'm going to use Robinhood as my brokerage platform and start with $500. I want to hit a minimum of 50% ROI per trade, and the end goal is to generate over $1,000,000 with your trading advice and research, and I am medium risky, but I want to ensure the goal is hit. Here's the prompt. After this research is done, generate a prompt aimed at achieving my results with the highest degree of accuracy possible.

CORE STRATEGY BLUEPRINT: QUANT BOT FOR OPTIONS TRADING

Define the Trading Universe
• Focus: S&P 500, high-volume tech stocks, and sector ETFs
• Options: High open interest, tight spreads, weekly/monthly expirations

Data Pipeline Setup
• Real-Time & Historical Market Data: Prices, implied volatility, Greeks
• Economic Indicators: Fed statements, unemployment, CPI, interest rate forecasts
• Political Signals: Election cycles, sanctions, fiscal policy shifts
• Weather/Natural Disaster Alerts: NOAA, global weather APIs, satellite feeds

AI Signal Generation Modules
• Natural Language Processing (NLP): Scrape financial news, SEC filings, Reddit, X/Twitter, Fed transcripts; use sentiment analysis + event detection to score macro/micro triggers
• Time Series Forecasting: Use LSTM, Transformer models, or Prophet to predict stock volatility and price direction
• Event-Based Triggers: Geo-political tension spikes, hurricane paths near oil fields, unexpected earnings

Quant Models to Deploy
• Delta-neutral strategies: Iron condors, butterflies
• Volatility-based: Straddles/strangles pre-event, IV crush post-event
• Directional: Covered calls, puts, debit spreads using ML-classified trend signals

Backtest Engine
• Simulate trade strategies over 5–10 years
• Adjust for: slippage, commissions, margin use, volatility skew, liquidity

Execution Layer
• Broker API: Interactive Brokers or Tradier
• Build custom scripts to auto-place trades when your bot’s edge is confirmed (> X signal score)

Risk Management System
• Max drawdown control
• Position sizing based on Kelly Criterion
• Volatility-adjusted leverage
• Daily VaR limits
• Portfolio correlation monitoring

- - - ChatGPT wrote - - -

Thanks for the detailed blueprint—this is a great foundation. To tailor research that gives you the highest possible chance of turning $500 into over $1,000,000 via options trading with a minimum 50% ROI per trade, could you please clarify the following:
1. Time Horizon: Over what period do you want to achieve the $1,000,000 goal (e.g., 1 year, 5 years)?
2. Trade Frequency: How often do you plan to trade (daily, weekly, opportunistically)?
3. Platform Constraints: Since you’re using Robinhood, are you okay with its limitations (e.g., no access to certain option spreads like iron condors)?
4. Bot Development: Are you building the quant bot yourself, or do you want research into third-party tools or platforms?
5. Technical Resources: Do you have programming capabilities or prefer a no-code/low-code solution?
6. Compliance & Automation: Do you want help navigating SEC compliance and Robinhood’s automation limitations?
Once I have these details, I’ll initiate targeted research to help you hit your milestone with the best available strategy, tech stack, and trade ideas.

- - - I replied - - -

1. Let's aim for between one to three years.
2. Opportunistically.
3. Yes.
4. I want you to research and build it.
5. No-code solution.
6. No.
Yeah, I work in big tech and use LLMs probably 2 dozen times a day in my work. I also spent many years working at Google. I'm primarily using ChatGPT and Gemini in my day-to-day. I don't think you understand how diversified "search revenue" is.
* I buy stuff frequently from clicking on YouTube ads
* There's revenue acquired from Google Maps through sponsored pins and API usage
* There are display ads on a lot of websites that I interact with that are powered by Google
* Google is still the primary search engine on my phone and browser, and if I need a quick answer to something, I get it served to me by Gemini Flash. There were definitely a few months when they first launched the feature and it was hot garbage, but it's pretty solid in at least 90% of my use cases.
* If I wanna buy a product, I still look it up on search and click through links. I also frequently check the shopping tab in search.
From that perspective then this company would be a partner that provides the clients. Of course the A.I. may end up writing that API by itself.
I think the Java API lawsuits by Oracle are the scummiest thing Oracle has ever done. Developers rely heavily on APIs, and they're shared freely with each other to allow people to interact with their software. Oracle was pushing a dangerous precedent to completely undo a lot of that by arguing that they could copyright their APIs, and were suing others like Google for absurd amounts of money for copyright infringement using their Java APIs. Basically every single major company in the tech industry filed a brief in favor of Google saying that Oracle was freaking nuts and a ruling in favor of Oracle would heavily damage the tech industry and make things much worse for just about everyone. I refuse to invest in Oracle to this day just for that whole mess. It raises so many red flags for me about the company that it's not worth my time digging into them further when there are better places to invest my money.
Hey everyone, thanks for the insights here 🙌 I’m backtesting 100 tickers over the past 4 years and want to closely simulate IBKR’s option chain data. Tagging u/ORATS_Matt for his input:
1. Does ORATS source bid/ask, IV, and Greeks directly from OPRA/exchange feeds—essentially the same as IBKR via TWS/API?
2. How closely do your historical EOD and intraday snapshots (SMV, IV, Greeks) align with IBKR output? Have you benchmarked drift, latency, or variance between them?
3. Any documented discrepancies by ticker, expiration cycle, or snapshot timing, like the 14-min pre-close vs IBKR's ~2-min snapshot?
4. Do you have accuracy metrics or validation tests comparing ORATS vs IBKR or OPRA raw data?
I'd appreciate the ORATS team's feedback and any community experience replicating IBKR-like data. Thanks in advance!
I don’t understand how they will keep people from just scraping the data for free. I understand that they can block an API from connecting and downloading mass data with minimal effort, but any coder with a little skill and time will just build a tool to extract what they need.
Thanks! I will look into these. Currently I am using IBKR API (so far only private use is planned) but I am always running into issues with missing data (Greeks, prices, …)
**Use** [**CityFALCON.ai**](http://CityFALCON.ai) for fundamentals, filings, insider trades, historical price/volume, news & sentiment—all integrated via API. **Add** [**DCSC.ai**](http://DCSC.ai) for sector insights: relevance scores, dynamic classification, and transaction analytics. Optionally pull in **free economic data** (e.g., FRED, etc.) and **options data** from other sources to supplement. Together, both platforms should wrap key FMP capabilities into one neat API-driven workflow—well suited for Google Sheets integration. Regarding pricing, it is pretty reasonable. You can contact them and give my reference.
Even if the US could produce enough oil to meet domestic demand, the US doesn't have enough refinery capacity for US light crude. US refineries have economic pressures to continue refining heavier foreign crudes (or are like the older PADD 5 refineries and were built to use heavy Californian/Alaskan crude anyway). Transitioning those refineries would take time and introduce inefficiencies. That's not to say there isn't any capacity for this (see: Exxon in Beaumont), but it's generally limited in scale and it's mostly located in the places that extract the light crude anyway, which generally rules out east/west coast refineries. But even if you *could* convert these other places to accept higher API feedstock, you're going to run into another unavoidable problem: the EIA is projecting peak domestic production in 2027. Without discovering new fields or a rapid spike in oil prices that redefines what reserves are economically viable, the production will begin sliding in <5 years. But this all brings us back to my original point: global commodities are global. The sort of people advocating we do things like remove Iran from the global oil market are the sort of people who would break their own fingers in a fight in the hopes it would distract their opponent.
Has anybody successfully used the IBKR API to trade options?
That is not my experience when dealing with code, converting domain-specific unstructured language to structured data via API, and other research via a paid chat subscription. Maybe you have not messed with the latest models, or you have some cross-session “memory” history on ChatGPT that makes you think it’s better. I use both.
On thinkorswim Level 2, I see many bid/ask prices. From the IBKR API, I see bid/ask prices from many exchanges. Does the TOS Level 2 include prices from all those exchanges? On TOS, if I want to sell quickly, I make my ask lower than the lowest ask. On IBKR, will I have to collect ALL exchanges' prices to find the lowest ask in order to sell quickly?
Data is currently coming from YF API. It would be nice if people join forces to scrape the data and push it to the database
From what I have determined, subscription-based AI LLMs seem to be the viable option. I have not purchased any subscriptions. I have considered Google Gemini and BloombergGPT. Other AI LLMs did not have data from SEC EDGAR; with those LLMs I had to cut and paste data from EDGAR into the chat, which rendered them useless for me. Currently, I am constructing my own LLM on a computer in my residence and purchased the hardware as soon as possible, before even higher tariffs are in effect. I will download selected data from EDGAR and likely purchase an API subscription from Bloomberg. I worked in IT for decades, and I am very familiar with building Intel servers. If I had not started this build, I would have purchased a BloombergGPT subscription and determined what I could do with it.
It's possible using automation (API-connected trading bot). It's not possible to trade like that manually. OP is acting like a market maker.
How is this possible? Are you using Python or something with your broker’s API to place these trades?
#API Huge🏁🛢️📈 Crude: -10.133M Cushing: -0.800M Gasoline: -0.202M Distillates: +0.318M #OOTT Tonight oil
What happened to subs going dark due to Reddit closing their API and charging for it?
So? Open AI is falling behind substantially. Anthropic has the best coding agent in Claude 4 Opus. Google's Gemini 2.5 Pro has 5x the context window as o3, OAI's flagship model. OAI's models cost a fortune through API, and even their deep research isn't superior to Google's.
It's definitely why he locked down the API
This is the real reason Reddit API costs were hiked
I have also seen PhDs hiring PhDs when a PhD was really not needed for the job (API design and QA)
It's not the number of contracts. It's the number of orders. You may be hitting their API limits. If you were directly using their API with an algo - the API would throttle you. And since you are likely over the CBOE 390 rule limit - while Schwab does support Rule 390 traders - not all brokers support Rule 390 traders. I know that Tasty doesn't support Rule 390 traders. You can try Ibkr - but you ought to speak with them first about your trading patterns and disclose that you are considered a CBOE Rule 390 professional trader.
You may be hitting the API limit. And since you are likely over the CBOE 390 rule limit - not all brokers support Rule 390 traders. You can try Ibkr - but you ought to speak with them first about your trading patterns.
Are you using Schwab API - there is an API limit per day. I think it's about 3000 to 4000 orders per day. If you are hitting that API limit - you are probably bound by CBOE Rule 390. Every broker is required to enforce Rule 390.
Short 90 shares on APP, love seeing it get pummeled, complete hype. META could end that company in a day if they wanted. API violations.
API from polygon to bring out open interest & gamma information
Exactly. I spent over 4 hours today analyzing this sub via API. What I've uncovered is the one thing big banks don't want you to know.. >!deez nuts !<
I can't comment about European stocks. But in the US - public company filings are public information. You can find the raw data in a structured format so you can manipulate it yourself, directly from the SEC filings. All US public companies are required to file electronically with the SEC. Those filings are available for public use through the EDGAR Public Dissemination Service. EDGAR or Electronic Data Gathering, Analysis, and Retrieval contains filing data from 1998 to today. You retrieve the filings through EDGAR - it's available in JSON and XML format. Various search options here - [https://www.sec.gov/edgar/quickedgar](https://www.sec.gov/edgar/quickedgar) Public human-search here - [https://www.sec.gov/edgar/search/#](https://www.sec.gov/edgar/search/#) API here if you want a programmatic method - [https://www.sec.gov/edgar/sec-api-documentation](https://www.sec.gov/edgar/sec-api-documentation)
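A short sketch of the programmatic route via EDGAR's public JSON endpoints mentioned above; the CIK shown (Apple, 0000320193) and the printed fields are just examples, and the SEC asks for a descriptive User-Agent header:

```python
# Pull structured filing data from SEC EDGAR's free JSON endpoints.
import requests

headers = {"User-Agent": "research-script example@example.com"}

# All recent filings metadata for one company
subs = requests.get(
    "https://data.sec.gov/submissions/CIK0000320193.json",
    headers=headers, timeout=30,
).json()
print(subs["name"], subs["filings"]["recent"]["form"][:5])

# Company facts (XBRL financial data) for the same company
facts = requests.get(
    "https://data.sec.gov/api/xbrl/companyfacts/CIK0000320193.json",
    headers=headers, timeout=30,
).json()
print(list(facts["facts"]["us-gaap"])[:5])
```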
I don't have a link right now, but I read an article that said NVDA Blackwell GPUs are still more energy efficient than Google's ASIC TPUs, which surprised me. It did not specify whether it was referring to training or inference. It is worth noting that Google Cloud has purchased a lot of GPUs from NVDA. However, Google has absolutely enormous compute and storage, and the demonstrated ability to compete with frontier models. I think GOOGL is a no-brainer part of any large portfolio. Fears about AI erosion of search haven't yet transpired, and I think search API calls by AIs to GOOGL could offset that, which would probably show up under cloud revenue. Search revenue will fluctuate, like it always has, but the idea that an expensive non-thinking LLM will give great working links that replace an efficient search engine is silly and unfounded. The government may break up Google, or break off Chrome. In such a case, the resultant parts might end up being worth more than the whole once on the market. With regard to OpenAI, we all know Sam Altman's grand ambitions to scale current compute by another couple orders of magnitude. We know his thinly concealed frustrations with Microsoft. Nadella bet big to create the world's largest supercomputer to create ChatGPT, but he is clearly not willing to sacrifice earnings at the altar of AGI, which is probably in line with the expectations of MSFT investors.
“Regulatory risk: If OpenAI and Google get too close, the antitrust agency could get involved.” It’s actually the opposite. OAi was locked into using MSFT Azure as a cloud provider for everything until 2030. MSFT also had a huge say in OAi governance. In 2023 antitrust regulators began investigating the relationship as an effective merger. Due to the investigations, MSFT started to allow OAi to use Oracle for cloud computing. Then, as part of the Stargate negotiations, MSFT and OAi entered a new agreement where MSFT allowed OAi to build its own capacity for research and training with NVDA and Oracle. That agreement opened the door for other cloud providers also. MSFT still has right of first refusal for all OAi cloud hosting, but if they cannot meet compute demands, OAi can go to a cloud rival like Google. However, MSFT still has exclusive rights to the OAi API until 2030 or until AGI is achieved (whichever happens first). So any OAi use of GCP will be for training and research.
Does anyone know if RH or Yahoo finance or something else has an API to pull pricing data?
I'm an AI sceptic, so take this as my singular opinion and with a healthy grain of salt. I don't think there is a real use case for AI aside from HS and freshman/sophomore college students cheating on essays. There are no real-world business applications for this stuff because it's so unreliable. This feels like a lot of venture capital money trying to generate buzz in an industry that no longer has the infinite-growth mindset. Just like NFTs and the metaverse, this is bullshit chasing 2005. Case in point... >Anthropic recently crossed $3 billion in annualized revenue on booming demand from code-gen startups using its models. No one, and I repeat no one, is using AI-generated code in their production code base. Every single one of those startups is just hawking a front-end that's making the same API calls to the LLMs that would get any developer shitcanned if they submitted the output in a pull request.
What also did it for me was them losing that lawsuit against Fortnite for bypassing the Apple Store subscription payment. Basically, they force apps to have to use their payment API and get like 30% of the cost. Spotify and others will follow, and a lot of that revenue is gone.
Cool, how did you do the simulation? Did you write a script in thinkorswim or use the API in Schwab?
#1 I haven’t gotten that far, as I was just planning this strategy out in theory. So far I’m planning to use Interactive Brokers with their Trader Workstation API. #2 Yes, I would defer personal taxes by keeping the profits within the corporation. Eventually, I would pay myself a salary or dividends to actually receive the money. #3 I don’t believe I’m getting a “free lunch”: all the interest earned off the principal is risked and can potentially be lost. Any profit is from risking the interest earned. The only protection is in the principal investment. This is just my logic as of now, so if you believe I’m wrong on any of these, please feel free to clarify. Thank you.
This reminds me of an LMS platform I was working for for about a year. They would go out and promote the sheer volume of API calls and exclaim proudly that it showed tremendous growth, only to find out the marketing team had ignored the fact that there was a difference between customer API calls and internal dev API calls and just lumped them all together. They rode that wave for over two years.
Great question. I source from an API. Levels are ranked with a score. First level is the strongest (most reactive, causing a rise in volatility...). These are institutional levels. I think tracking where things go over time and seeing how price reacts at certain kinds of levels will help you understand intuitively. I'll post daily. Let me know if you're interested in the newsletter.
Oh trust me, I know ;) And I know ALL about Reddit’s API, Sam Altman’s history with Reddit, that OpenAI and ChatGPT are using Reddit API to suck up our comments…so yeah, I know exactly what I’m doing here.
Haha, ah ok, this second reply is worse than the other one. It's like you don't understand what context is, much less attempt to consider anything in context with this post or what's happening in the current administration. Elon bought Twitter, for example; how can you compare that to some API that I have never heard of and honestly don't care about, and even if true it just pales so hard in comparison. See if you can reconsider the context and how equal or not equal they are; it might be a useful exercise for you.
With the rise of AI, what’s their long-term moat, or ability to grow into their PE? They are currently REALLY good at connecting disparate databases for their users and discovering patterns or other relations within the data. But they still require deployment engineers to work with users to map the databases and customize their solutions per customer. There are companies today that are developing AI tools that can create low/no-code API connectors between solutions; I would imagine that this, coupled with auto DB mapping tools plus next iterations of GPT/Claude/Gemini etc. to discover patterns or relationships, would erode Palantir’s competitive advantage.
They have had a Speaker of the House who is 100% guilty of insider trading for decades. They have a President who was completely incompetent mentally and who actively suppressed primaries, which is super undemocratic. They set up entire API calls at Twitter to get Twitter to censor anyone they wanted, and they fabricated an entire Russian conspiracy to explain how Trump got elected when it was largely because HRC was a terrible candidate with three decades of political baggage. Ketamine is hardly the worst thing to happen to the White House in the last 12 years.
IBKR has a websocket API that IIRC supports streaming options chains and quotes. I wrote a client for it not too long ago. Throw me a few stacks and I’ll make you a scanner if you want! But probably don’t expect to see many box arbs worth opening, tbh.
If it were me I’d just throw together a continuous scanner in C# with a simple UI, using some low-cost market data API, possibly free if it’s via your broker. Where do you get your data from? Is it just SPX you’re interested in? That makes it much simpler / faster than trying to scan multiple index options, future options, etc.
Use a real API trading platform like TradeStation
I'm intrigued. Their platform sounds niche but powerful. Anyone here actually used their service or API?
Maybe a slight lag between what I get on the ToS desktop in paper mode vs. what I get pulling "last price" via the Schwab API, but I don't think 15 mins. The intraday chart lines up time-wise also.
OpenAI is now required by court order to preserve all ChatGPT logs including "temporary chats" and API requests that would have been deleted
I have no positions in Perplexity. 2.5 Pro is freemium; you get a certain amount of searches and then have to start paying their AI API fee. Neither Google or OpenAI are going to kill them. Especially not OpenAI. That's even true according to the report you sent me: OpenAI's search models are dogshit.
Conceptually an interesting IPO but they’re heavily exposed to interest rate fluctuations. If rates get cut eventually then their income will drop SIGNIFICANTLY. On top of that they’ll IPO at a premium valuation. I’m not too bullish on their API/Web3 product offerings but if those gain traction it would offset an interest rate hit. At $87 that’s a hard fucking pass. This thing will drop for sure.
Key AI terms:
- AI - An Indian.
- API - A Person in India.
- LLM - Low-cost Labour in Mumbai.
- AGI - A Genius Indian.
- GPT - Gujarati Professional Typist.
Reddit has enormous potential for growth and innovation, but the challenge will be how profitable that growth actually is. The path to higher margins is not guaranteed, as monetization often comes with higher costs or user churn [1,2,3].
[1] Revenue ≠ Profit: Growth in revenue does not automatically mean profit will increase proportionally. Monetization can bring additional costs, such as infrastructure, personnel, API (AI APIs), community management, or new features. This could compress margins.
[2] Growth yes, monetization questionable: I am convinced by the growth potential, but not that Reddit can actually increase monetization as effectively as the analysis suggests. Monetizing without losing community is difficult, and the monetization model is still experimental in many areas (e.g., premium subreddits, data sales).
[3] Companies still don’t know how to generate consistent profits from premium advertising.
I do feel like all the companies hanging onto AI indirectly feel like a bubble. Like clearly a lot of tech companies that rebranded an API as AI or something. But google, Microsoft, OpenAI, etc. seem to fundamentally have unique tech. If AGI doesn't kill us all maybe some will be overbought. Look at the graph for cisco stock. It was a huge stock (still pretty big) but entered bubble territory for sure
Remember when all the subs went dark in protest of them closing their API?
Just hit the wire. They're suing Anthropic for copyright infringement. Pretty open and shut case if they have proof they used their API without paying. Shares went up on announcement. [https://www.cnbc.com/2025/06/04/reddit-anthropic-lawsuit-ai.html](https://www.cnbc.com/2025/06/04/reddit-anthropic-lawsuit-ai.html)
The Work Zone API looks intriguing.

Rekor Command® unifies disparate data sources into a single, AI-powered platform that enhances situational awareness, supports faster decision-making, and improves resource allocation. Replacing an array of diverse and fragmented tools with a single platform that delivers real-time insights and powerful decision support, enabling transportation agencies to proactively manage roads, allocate resources efficiently, and respond faster to incidents.

Under this expanded contract, TxDOT will also deploy several new features, including:

● Traffic Impact Module – A first-of-its-kind decision support tool that uses AI to assess and visualize how incidents affect traffic flow in real time and provide real-time insights to operators for proactive and strategic operations, faster response times, and improved traffic safety, ultimately enhancing overall network performance.

● Work Zone API – Enhances safety by delivering accurate, real-time insights into work zone activity, improving visibility for TxDOT operators and the public. The platform also integrates with the USDOT FHWA’s Work Zone Data Exchange (WZDx), enabling standardized, open data sharing across local and national systems to support broader safety and mobility goals. By making critical work zone information more accessible and actionable, the platform supports safer construction zones, enhances roadway travel time reliability, and strengthens agency coordination.

Rekor Command® will also continue to serve as the data backbone for TxDOT’s Construction Partnership Program (CPP), a foundational data layer for improving regional collaboration, public communication, and traffic management. With this statewide expansion, Rekor's role within the CPP will grow to support additional objectives, most notably by powering the CTX-GO mobile application. Built on real-time data streams from the Rekor platform, CTX-GO delivers timely, unified information to travelers, contractors, and public agencies, ensuring enhanced situational awareness and seamless coordination across Texas's growing infrastructure network.

https://www.sec.gov/Archives/edgar/data/1697851/000143774925019217/ex_826235.htm
My Python program accidentally made thousands of calls to the Yahoo Finance API in seconds and I think I'm IP banned now
All you do on company hardware can be monitored and fed to an AI model so it knows whom you communicate with, what exactly you do on a daily basis, how you do it, etc. Then, once it has learned enough and the API costs to run it and monitor the results are lower than your salary, you are gone, and with time it will be able to work 24/7 for less than they were paying you for 5x8.
Revelata isn't exactly what you want, but it's close and $25/month. Pulls KPIs and tables from filings and stitches them into time series. No API today, but an Excel Add-In option that can update things when new filings come out. Preview access to a few tickers [here](https://www.revelata.com/free?ticker=NVDA&formType=10-K). Free trial is full functionality, so maybe worth checking out how easy it is to adjust your workflow if the price is right.
I have not used it, and we did not have it on our list of options data providers: https://www.reddit.com/r/options/wiki/faq/pages/data_sources/ I have now added it. On the plus side, they are very transparent about what they offer, by showing a sample table, and they offer a block of data for free, albeit from the Stone Age of options trading, up to 2013. On the minus side, they charge an AWFUL lot of money for end-of-day closing quotes ONLY. Seems to me that the much cheaper polygon.io, which offers intra-day quotes for a lot less money (though throttled for API calls), is superior.
https://discountingcashflows.com/subscriptions/ I used the website a lot. You can test the API for free. Search a ticker, go to financials, select dev mode, and see the JSON. Then if you need a bunch of requests you can pay.