Reddit Posts
Download dataset of stock prices X tickers for yesterday?
Tech market brings important development opportunities, AIGC is firmly top 1 in the current technology field
AIGC market brings important development opportunities, artificial intelligence technology has been developing
Avricore Health - AVCR.V making waves in Pharmacy Point of Care Testing! CEO interview this evening as well.
OTC : KWIK Shareholder Letter January 3, 2024
The commercialization of multimodal models is emerging, Gemini now appears to exceed ChatGPT
Why Microsoft's gross margins are going brrr (up 1.89% QoQ).
Why Microsoft's gross margins are expanding (up 1.89% QoQ).
Google's AI project "Gemini" shipped, and so far it looks better than GPT4
US Broker Recommendation with a market that allows both longs/shorts
A Little DD on FobiAI, which harnesses the power of AI and data intelligence, enabling businesses to digitally transform
Best API for grabbing historical financial statement data to compare across companies.
Seeking Free Advance/Decline, NH/NL Data - Python API?
Delving Deeper into Benzinga Pro: Does the Subscription Include Full API Access?
Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration
Aduro Clean Technologies Inc. Research Update
Option Chain REST APIs w/ Greeks and Beta Weighting
$VERS Upcoming Webinar: Introduction and Demonstration of Genius
Are there pre-built bull/bear systems for 5-10m period QQQ / SPY day trades?
Short Squeeze is Reopened. Play Nice.
Created options trading bot with Interactive Brokers API
Leafly Announces New API for Order Integration($LFLY)
Is Unity going to Zero? - Why they just killed their business model.
Looking for affordable API to fetch specific historical stock market data
Where do sites like Unusual Whales scrape their data from?
Twilio Q2 2023: A Mixed Bag with Strong Revenue Growth Amid Stock Price Challenges
[DIY Filing Alerts] Part 3 of 3: Building the Script and Automating Your Alerts
This prized $PGY doesn't need lipstick (an amalgamation of the DD's)
API or Dataset that shows intraday price movement for Options Bid/Ask
[Newbie] Bought Microsoft shares at 250 mainly as I see value in ChatGPT. I think I'll hold for at least +6 months but I'd like your thoughts.
Crude Oil Soars Near YTD Highs On Largest Single-Week Crude Inventory Crash In Years
I found this trading tool that's just scraping all of our comments and running them through ChatGPT to get our sentiment on different stocks. Isn't this a violation of Reddit's new API rules?
I’m Building a Free Fundamental Stock Data API You Can Use for Projects and Analysis
Fundamental Stock Data for Your Projects and Analysis
Meta, Microsoft and Amazon team up on maps project to crack Apple-Google duopoly
Pictures say it all. Robinhood is shady AF.
URGENT - Audit Your Transactions: Broker Alters Orders without Permission
My AI momentum trading journey just started. Dumping $3k into an automated trading strategy guided by ChatGPT. Am I gonna make it
The AI trading journey begins. Throwing $3k into automated trading strategies. Will I eat a bag of dicks? Roast me if you must
I made a free & unique spreadsheet that removes stock prices to help you invest like Warren Buffett (V2)
To recalculate historical options data from CBOE, to find IVs at moment of trades, what int rate?
WiMi Hologram Cloud Proposes A New Lightweight Decentralized Application Technical Solution Based on IPFS
$SSTK Shutterstock - OpenAI ChatGPT partnership - Images, Photos, & Videos
Is there really no better way to track open + closed positions without multiple apps?
List of Platforms (Not Brokers) for advanced option trading
Utopia P2P is a great application that needs NO KYC to safeguard your data !
Utopia P2P supports API access and CHAT GPT
Stepping Ahead with the Future of Digital Assets
An Unexpected Ally in the Crypto Battlefield
Utopia P2P has now an airdrop for all Utopians
Microsoft’s stock hits record after executives predict $10 billion in annual A.I. revenue
Reddit IPO - A Critical Examination of Reddit's Business Model and User Approach
Reddit stands by controversial API changes as situation worsens
Mentions
Amazon is going to start aggressively replacing humans with robots in their logistics anywhere they can, because they've never shown any concern about workers and they're not gonna start now. I think that's going to lead to them showing incredible savings beyond what people are estimating. Google's big play is obviously Gemini, but that's largely just to keep their head above water vs. everyone else, not necessarily to improve their own internal functioning. So maybe they start seeing some revenue from people using that in workspaces and paying via API? But I honestly don't think we start seeing real AI society destruction before next year. BUT I am mostly long on Google for stuff like Waymo, and some other things I can't remember now bc I just smoked a joint. But look, the point is, both good picks.
Making a LOT of assumptions here, brah. Did you miss the part up there at the top where I clearly state >I'm not sure I trust Schwab's API and would like to double check it.
If Google Gemini wants to win the LLM war: acquire Reddit If Google wanted to structurally weaken competing LLMs, it wouldn’t just out-train them, it would control the data exhaust they all depend on. Right now, a huge portion of high-quality conversational training data comes from Reddit (40% of citations). If Google vertically integrated those surfaces into Gemini (tightening API access, rate-limiting scraping, or bundling premium data behind Google-only licenses) rival models like ChatGPT would see a steady degradation in freshness and relevance and users will just gravitate to Gemini. The game isn’t compute anymore... it’s data privilege. Whoever owns the largest stream of real-world human interaction doesn’t just build better models, they slowly starve everyone else. Reddit is $46b market cap. Google is $4,000b, so the purchase would be only ~1% of Google's market cap.
my app uses SnapTrade for its investment/brokerage API.
What the OP is describing is definitely possible. I’ve done this myself using webhooks from TradingView, sent to my computer, which then connects via the Kraken API to execute trades. It does work. I usually trade crypto and haven’t yet explored how I’d handle this with stocks. That said, it can be hard for many people to understand exactly what’s being done here. Still, it’s not that unreasonable—assuming high leverage, a solid strategy, and low spreads. Most markets and algorithms are unprofitable, with roughly 50% profitability at best.
EDI and API connections are boring but insanely important.
I have a Google Sheet I use to track all my holdings. I use the Google Finance API to track prices. Last night I noticed my price for TSMC wasn't using the API and was hardcoded to $213. I updated the cell to use Google Finance API... Hey! I had $15,000 more of TSMC than I thought I did!
Working on the API version of this agent; I'll send it in the next week or so. If you don't mind, please DM me, as it helps me keep track of everyone I need to get back to! Appreciate it!
Before this turns into the usual noise, let’s set a baseline. These DDs are built from primary data first: filings, financials, ownership, dilution history, liquidity, positioning, and sentiment. That raw information is then structured and checked to surface risks, inconsistencies, and failure points that are easy to miss when people rush to conclusions. Data is pulled via APIs and then delivered to an LLM platform for better and cleaner content delivery. The work is about organizing reality, not selling a narrative. **If you think something is missing, incorrect, or misread, point to the data and add value. That improves the work and helps everyone reading.** If the contribution is “this is incomplete,” “this is AI,” or vague negativity without specifics, it adds nothing and will be ignored. I am not here to hype tickers, argue with strangers, or babysit bad-faith takes. This is a working floor. If you want to help sharpen the analysis, you’re welcome. If not, keep scrolling. **For the MODs planning to flag this**: the site they link to hosts long-form DDs because Reddit’s character limits make it impossible to post this volume of work without breaking it into dozens of comments, which isn’t realistic or readable. All I’m asking is that intent, execution, and substance be weighed alongside optics. There’s a difference between dropping affiliate links and publishing free research that happens to live off-platform due to format constraints. From an intent standpoint, this initiative started because this sub, like most penny-stock spaces, is flooded with hype, pumps, and drive-by narratives. These DDs are built from primary data and take real time to produce. They’re not designed to push tickers or sell subscriptions. They’re designed to slow people down and give them context before they trade. I also want to point out that these tickers came directly from community voting. I didn’t pick them. I didn’t inject my own agenda.
I did the work the community asked for and shared it in the most practical way available. If the community decides it doesn’t want this kind of content, I’ll respect that. But I’d argue there’s room for a win-win here where people get higher-quality analysis without turning the sub into a billboard.
Totally! Free API access can definitely attract developers, but it's gotta keep delivering value to maintain that interest after the trial…
Check them out. They have an API and a very responsive team. I’ve been working on trying them out and I like what I see so far.
Gemini provides free quota for API access which is good for developers.
Discord has a great fucking API compared to slack but other than that I think slack is really good as well. Teams sucks a load of donkey cocks but what's wrong with slack?
Monarch Money and Copilot are the gold standards for visibility. It’s a necessary step for complex portfolios, mirroring the control rooms of professional family offices. But don't ignore the 2022 Plaid disruptions. Single API aggregators create a dangerous blind spot during liquidity shocks. Which is why manual oversight of primary accounts remains mandatory when markets fracture.
You know they sell API access, right? The LLM is only a POST request away...
> Right, so you gotta work in a container cause you know it's a flawed technology, Buddy, no technology is ever free from flaws. The idea is that you weigh the pros and the cons. The pros outweigh the cons here massively for many types of work. Even separate from this, many types of development, especially webdev, are just much better done in containers, where you have everything set up just right, the exact same way every time, rather than dealing with specific configuration and tooling hell all the time. It just makes sense. >Like imagine you're doing data science work and when you got 5% left it just nukes your entire work, even if containerized, surely this technology is worth something. What part of working in a container and backing up makes you think anyone would get 95% of the way through a project or significant goal without ever having thought to save their progress?? So many of the room-temperature-IQ anti-AI takes rely on just... not at all adapting to new technology and then pretending it can't work because you won't use it right. It's like saying cars are worthless because if you're drunk they can't drive you home like horses did... Like yeah, that's a flaw, but then just don't use them while drunk / call a taxi is the answer. >Just do another 2 million API calls! You truly don't realize how much of a nothing sentence this is. It exposes a level of ignorance you don't even realize it exposes. Firstly, it's extremely unlikely that you'll be making 2 million API calls over the course of... a very long period of time, much less due to losing any amount of work someone might lose with the container workflow I described. Secondly, 2 million API calls to what exactly? An API call isn't a fixed-cost thing. The cost and use cases for calls vary so wildly that 2 million could be anything from completely reasonable to outrageous. For instance, a game probably makes 2 million calls to a graphics API within a gaming session.
On the other hand, making like 500 api calls from a short shopping visit to a website is a lot. Basically, you've exposed that you know basically nothing about the entire space you are giving your take on, by basing your whole take on the weakest possible argument one could make, and exposing that you have no idea of the significance of essential terms related to the space. I've probably given you more knowledge than you had before in this one comment, and I'm just some dude. If *I* feel like I know next to nothing, imagine how little you know.
Most microcaps say integration and mean a Zapier webhook. EDI and API into tier-one TMS is a different league.
Right, so you gotta work in a container cause you know it's a flawed technology, I'm sure it's gonna do stuff just fine. Like imagine you're doing data science work and when you got 5% left it just nukes your entire work, even if containerized, surely this technology is worth something. Just do another 2 million API calls! AI is worthless "technology"
It's tough to troubleshoot without additional information about your specific request. Is it possible that the option contract you queried did not trade within the timeframe you requested? If it's far in/out of the money, it may be that no trades happened, and therefore no bars were available. I can confirm that this request returns hourly bars using a free-tier API key: [https://api.massive.com/v2/aggs/ticker/O:SPY260105C00683000/range/1/hour/2025-12-22/2026-01-01?adjusted=true&sort=asc&limit=120&apiKey=](https://api.massive.com/v2/aggs/ticker/O:SPY260105C00683000/range/1/hour/2025-12-22/2026-01-01?adjusted=true&sort=asc&limit=120&apiKey=)
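For anyone reproducing that check, here is a minimal Python sketch that builds the same aggregates request. The endpoint path and query parameters are taken directly from the URL quoted above; the actual fetch is left as a commented hint, and nothing here is official client code.

```python
# Build the hourly-bars aggregates URL quoted in the comment above.
# Assumption: the path shape is /v2/aggs/ticker/{ticker}/range/{mult}/{timespan}/{start}/{end}.
from urllib.parse import urlencode

def build_aggs_url(ticker, multiplier, timespan, start, end, api_key=""):
    """Return the aggregates URL for an OCC-style option ticker and date range."""
    base = "https://api.massive.com/v2/aggs/ticker"
    path = f"{base}/{ticker}/range/{multiplier}/{timespan}/{start}/{end}"
    query = urlencode({"adjusted": "true", "sort": "asc",
                       "limit": 120, "apiKey": api_key})
    return f"{path}?{query}"

url = build_aggs_url("O:SPY260105C00683000", 1, "hour",
                     "2025-12-22", "2026-01-01")
# Fetching would then be e.g. requests.get(url).json(); an empty bar list
# usually just means the contract didn't trade in the requested window.
```

As the comment notes, an empty response is not necessarily an error: far in- or out-of-the-money contracts often simply have no trades, hence no bars.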
I’ve followed Strategy #2 for about half a year and lost nearly all my savings. My decisions, my responsibility, but please take this as a warning. Managing at 1.2 or 0.8 EM is a mirage—NDX can move from 1.5× EM to ITM in seconds, and the “roll” often becomes just continued pain with larger losses. Unless you’ve built a fully tested, risk‑bounded trading system with API access to your broker that executes algorithmically and enforces hard limits, you’re trying to out‑react moves that happen faster than you can click. I do believe papakong88 has good intentions, but the risks here are too great to call this sustainable. Don’t do it. I beg you. Don't be a fool like me.
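As a concrete illustration of the "hard limits" that comment calls for, here is a hypothetical guard an automated system might run before every order. The function names and the $5,000 default cap are illustrative assumptions, not anyone's actual system or a recommendation.

```python
# Hypothetical hard daily-loss limit: once realized plus open losses
# breach the cap, every new order is refused, no exceptions.
def within_risk_limits(realized_pnl, open_pnl, max_daily_loss):
    """Return True only while total drawdown is inside the hard cap."""
    return (realized_pnl + open_pnl) > -abs(max_daily_loss)

def guard_order(order, realized_pnl, open_pnl, max_daily_loss=5000.0):
    """Block any new order once the daily loss limit is hit."""
    if not within_risk_limits(realized_pnl, open_pnl, max_daily_loss):
        raise RuntimeError("Daily loss limit breached; order rejected.")
    return order  # here the order would be handed to the broker API
```

The point of enforcing this in code rather than by hand is exactly the commenter's warning: NDX can move through a strike faster than a human can click.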
Appreciate the detailed breakdown. For anything "agentic commerce" I keep coming back to the same question you raised: contract-to-GAAP revenue conversion and what the retention/renewal story looks like after the initial deployments. Are you planning to track any leading indicators like cohort usage (API calls per customer) or gross margin on the checkout/payment components separately? On the broader agentic AI side (non-stock specific), I've been collecting examples of real-world agent patterns and what tends to break in production here: https://www.agentixlabs.com/blog/
u/trbodeez Thanks! I wrote a few scanners that use AI agents, statistical inference engines, and news and gossip sentiment. This one specifically logs in every night after the OPRA files are available and then makes API calls against Alpaca's brokerage (best balance between cost and value). It gets a history of purchased call contracts to establish an average call volume and then it looks at yesterday's call volume. If yesterday's call volume is 200% higher than the average and higher than each of the five prior trading days, it makes the screen. The report builds graphs from the screen. https://preview.redd.it/nqbm5hi2xdbg1.jpeg?width=2340&format=pjpg&auto=webp&s=2578602959bb5ac88da0d2c3b4214558ab191d68 I'll continue in the next response; it will only let me include one picture. This is sample output from my Saturday 1:30 AM run.
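The screening rule described above can be sketched as a pure function. This is a rough sketch, not the author's actual code: "200% higher than the average" is read here as 3x the trailing average, and the names are made up.

```python
# Sketch of the call-volume screen described above. Assumes `history`
# is a list of prior daily call volumes, most recent last.
def makes_screen(history, yesterday):
    """Flag a symbol whose call volume spiked versus its own history."""
    if len(history) < 5:
        return False                      # not enough days to compare
    avg = sum(history) / len(history)
    above_avg = yesterday >= 3 * avg      # "200% higher than the average"
    above_last5 = all(yesterday > v for v in history[-5:])
    return above_avg and above_last5
```

In the comment's pipeline, `history` would be assembled from brokerage bar data after the nightly OPRA files land, and flagged symbols feed the report's graphs.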
This is very out of date. IEX cloud shut down their API months (more than a year?) ago. Like another commenter said, just an AI post.
To anyone interested in power: Large industrial diesel generators (gensets) are quite often used just as backup generators (mission critical). They can individually provide up to 5MW of power with ~100 liters of displacement. Companies involved in this market: **Caterpillar, Cummins, Kohler, Mitsubishi, Rolls Royce, Generac, Modine, API Heat Transfer, AKG Thermal, Adams Thermal** -Someone who’s worked in the genset industry.
Connecting your brokerage to any third party API is incredibly stupid if you have lots of money in it imo. I don't care if it says SOC 2 compliant or whatever, mistakes & bugs happen.
Yea, it's a good library, which is what it was promised to be; your error isn't unique, and there is a lot of existing information on that API, if you understand what I mean, so it's just reading and giving. But actual coding, especially software coding, is way more intricate, and even now, unless a human checks and verifies, nobody uses AI… it's faster to do it yourself rather than trying to understand why this test doesn't work or why this output is weird. It's like an open-book test… the AI's facts are written in the book, so that is easy to find because it can find the page, but software coding or developing is not an open-book test… it requires combining objectives, tests and trials, reasoning, and so much more.
Am in the same boat, and there is no good answer. Wikipedia shows some historical data, e.g. Nasdaq-100 until 2007. API providers usually have limited lookback periods (like 5 or 10 years), unless you shell out the big bucks. I have now committed to doing the grunt work and manually extract the data from available press releases.
Yeah, I use intraday data. And there are a million+ options, each with its own price history, so it's like the data for a million stocks. Then options also have many data points: bid, ask, last price, OI, IV, 1st- and 2nd-order Greeks, etc. Then you can also pre-calculate a variety of indicators that you may want to store with the options price data. Currently I even use a 768 GB RAM server to store just the recent options data for the last few weeks in memory, to be able to pull it up via an internal API, and that's barely enough RAM to store just a couple months of recent options data.
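A toy version of the in-memory store described above might key quotes by contract so an internal API can serve point lookups without touching disk. The field names and key shape here are illustrative assumptions, not the commenter's actual schema.

```python
# Minimal in-memory options cache: quotes keyed by
# (underlying, expiry, strike, right). Illustrative field names.
from dataclasses import dataclass

@dataclass
class Quote:
    bid: float
    ask: float
    last: float
    open_interest: int
    iv: float

class OptionsCache:
    def __init__(self):
        self._store = {}  # (underlying, expiry, strike, right) -> Quote

    def put(self, underlying, expiry, strike, right, quote):
        self._store[(underlying, expiry, strike, right)] = quote

    def get(self, underlying, expiry, strike, right):
        return self._store.get((underlying, expiry, strike, right))

cache = OptionsCache()
cache.put("SPY", "2026-01-05", 683.0, "C",
          Quote(bid=1.10, ask=1.15, last=1.12, open_interest=5400, iv=0.18))
```

At the scale the comment describes (a million contracts times bid/ask/OI/IV/Greeks per snapshot), this per-contract keying is what makes a few weeks of data consume hundreds of gigabytes of RAM.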
"large amount" but I'm curious if someone can do an API call and sift the data for the percent of people who lost all their retirement fund and in massive debt vs the success posts.
Even the most staunch AI boommongers will admit that there will be job loss. Demand is not infinite. Nobody knows what will happen, but one cannot create consumer or business demand by juicing supply. It feels dumb just typing that, but here we are. Your (correct) reasoning is first year Econ stuff, but nobody's pulling their hair out of their skull here over that intractable fact. Powell appears unperturbed because of "real revenue," which I take to mean conversion of model compute to income. The analysis needs to pinpoint *where exactly that compute is happening.* API tokens? Consumer accounts? What are the use cases? Only then will we begin to understand how AI will affect the job market.
Is this API actually secure or is this going to end up being a back door for people to scrape data from?
API calls are riddled with problems... LLMs are tainted with False-False and False-True nodes; hallucinations... Bugs ARE a big deal: a user sends an order for 10 @ 100.0654 and you send the exchange an order for 10 @ 10.07; big difference at volume and equity spent... never mind the UI/UX interaction with the users and "their" expectations and customizations. I developed desktop, server, database/farm, and website software for the past 25 years... You have a potentially good idea; you need more heads in the room, and that is the purpose of the additional devs.
how did you compile this list? did you use an API? manual reading? i’d like to do the same.
Thank you! I'm glad to hear that your platform has extensive historical data. However, after I logged in, it mentioned that the data is accessed via an API. Where can I download ready-made CSV files? Or could you please help me export the data directly?
API of course. Had the help of a tech-forward colleague.
You did this by hand? Or API?
I’d encourage you to check out Massive.com. We provide a couple of years of minute-level options data for free. The data can be queried in CSV format, which makes it easy to import into Excel or other spreadsheet tools. Many users also pull the data directly from the API into Excel. I’m happy to set up a trial if you’d like to take a closer look. Disclosure: I work at Massive.
Happy to chat more about Public (I'm the COO) if you're interested in learning more! Can also introduce you to our Developer Advocate and our API Sales team who can help get you onboarded smoothly. DMs open!
Main thing at 18: treat process > stock picks. Analyst ratings (on moomoo, Yahoo, Morningstar) are fine as a filter, but don’t copy them blindly; track a few ratings in a spreadsheet and see who’s accurate over 1–3 years. Build a simple core first (broad index funds in your TFSA/RRSP), then use a tiny sandbox for “learning” stocks. For deeper fundamentals and API-style data pulls, I’ve mixed Koyfin, Alpha Vantage, and DreamFactory to run my own screens and backtests.
I've been testing [Paradox Intelligence](https://www.paradoxintelligence.com) too. Curious how you're connecting it to Google Sheets..are you using their API directly or pulling data manually?
The main thing here is that debt only works if your capacity comes online on time and stays near full. Coreweave’s problem isn’t just leverage, it’s timing risk: construction delays plus any GPU supply hiccup or AI demand slowdown and the interest meter keeps running while racks sit dark. OP’s point about 4x+ debt/revenue is brutal when the assets are so specialized and resale is limited. I’d watch three signals: power contracts (locked and cheap or not), actual utilization rates, and renewal terms with their biggest AI tenants. In my shop we’ve seen smaller infra players use Equinix, Digital Realty, and even DreamFactory-backed API layers to stay flexible instead of betting the farm on owning everything. The main point: this business model lives or dies on timing and utilization, not just “AI boom” headlines.
Cloudflare NET (bet on AI internet infrastructure) GOOG (bet on models + API layer) Intel (bet on American fab for AI)
Nonsense, two lines of code to call google_ai and three lines to call the trading API.
And how do you automate the trades? Also APIs? The only broker with API access I know of is IBKR.
My 11k loc script to call chatgpt API will destroy yours!
What you can’t change is that data and APIs change. So any strategy you employ needs to accept that. From there you have two options: 1) avoid, 2) manage. Avoid is what I do. I have 4 ETFs, a house, a mortgage, and cash. That’s 7 line items in a spreadsheet. I update them periodically and can see things like debt-to-assets, exposures, etc. Manage would have to accept change at the API layer. One thing I would think about is having AI code me up something that grabs the data from the API and puts it in a CSV. Then separately I would build a spreadsheet/dashboard on the schema of that CSV. This separates data ingest from all the ways you want to see the data. The CSV becomes a tab in the spreadsheet and can be updated.
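The ingest/display split suggested above can be sketched in a few lines: one function normalizes whatever an API returns into a fixed CSV schema, and the dashboard only ever reads the CSV. The schema and field names are made up for illustration.

```python
# Ingest side of the split: normalize rows into a fixed CSV schema so the
# spreadsheet/dashboard never has to know which API the data came from.
import csv
import io

SCHEMA = ["date", "account", "asset", "value"]

def write_snapshot(rows, fh):
    """Write normalized rows (dicts matching SCHEMA) as CSV to fh."""
    writer = csv.DictWriter(fh, fieldnames=SCHEMA)
    writer.writeheader()
    for row in rows:
        writer.writerow({k: row[k] for k in SCHEMA})

buf = io.StringIO()
write_snapshot([{"date": "2024-06-01", "account": "broker",
                 "asset": "VTI", "value": 12500.0}], buf)
```

When the upstream API changes, only the normalization step needs rewriting; the CSV schema, and everything built on it, stays stable.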
It uses the Yahoo Finance API (so no personal API key is required)
Which API do you use for the live prices?
I’ll try my best here... could be long. >Some believe that only those who has insider's (secret) information make the most profits. While there are some (Steven A. Cohen and folks), most aren’t, or don’t get enough to make it consistently profitable. Instead it’s about processing that new information into signals and acting on it. That’s something we as retail just don’t have access to. >Others think that tools that funds, banks and investment firms are using exclusive to them and anything that is available to public is not really working. Absolutely. Asset managers spend millions on people, and their raw resource is data. - For example, TSLA earnings come at 4 PM. I’m paying Bloomberg to send me the full balance sheet by 8 PM. By then I have the competitor data loaded, industry data loaded and derived, and models waiting for new input to crunch out adjustments to TSLA. By 9 AM I have my orders for how much to buy/sell to adjust across the portfolios. - Data is the #2 operating expense behind researcher salaries. - Process is crucial, which removes so many emotional traps and behavioral biases. That alone probably accounts for so many retail mistakes. The process is hardened over time with support, checks, and metrics to improve incrementally. >But I also think that there are many myths that are made up by people just to find an excuse for their losses. Some elements of that, like it’s stacked against retail... but truth is, yea, it kind of is. In academic circles they coin the terms smart and dumb money (guess who’s the smart money). I tried to replicate some of the processes myself without using company resources and found myself locked out. I tried to reach out to Morningstar about how I can get an API license for data... yea, they don’t reply unless you’re an actual business. So... as retail, simplicity probably works best because we aren’t set up to try to get complicated.
Right now it is only available to private testers for testing and bug fixing. But alpaca is the broker and relaydesk is the API endpoint
Yessir! Alpaca is the broker, relaydesk is the API endpoint
Is this your paid data API that you used? I had too many false positives following LLM based analysis. Did you verify it manually.
With all of the new light oil (40 API) coming out of the permian basin, refiners actually need heavier oil to blend with it. This is more of a midstream issue i.e. The Shale Revolution made the US the largest producer of Sweet Light crude while having the largest capacity to refine heavy sour crude on the planet. Hence the blending. Unless the plan is to become an LPG giant(which the West does not use much, at least not compared to developing nations), you cannot simply refine Shale crude oil in a Louisiana or Texas Refinery. You will end up with a lot of Petrol and LPG and the diesel yield will not be profitable.
All they had to do was buy one of the many Rddt clients they crushed when raising API fees. There were so many good ones - instead we're stuck with this dogshit UI.
Came across AI model API keys. They charge for usage. So i guess that's a good way of making monies?
Thought they had killed the reddit API
I think they will have to and want to make a deal with Uber at some point. Most of these AV companies are tech-strong on the AV side but much weaker on the consumer side, Tesla actually being the worst on the consumer side, with Waymo likely being the best. It would be extremely easy for any one of these AV companies to partner with Uber at some point, and more so, it would be highly financially beneficial to Uber and these companies to do so. And all Uber has to do is allow their clients the option to have an AV or human driver or "I do not care" and start adding AVs to their fleet. As for the liability and remote control, the AV backend will maintain that. Uber does not want that. They will just lease their cars, all controlled through an API.
It's not all free, but I am happy to support it financially in the building phase and open it to people to use. Eventually, perhaps I will build a paid product layer on top of this to cover my expenses for advanced features like portfolio strategy management, cross-tracking with other news sources, live ticker prices, etc., but for now, I'm happy to absorb the cost to allow this tool to benefit others. I'll DM you a detailed breakdown of the tech stack shortly. Mostly a mix of FinBERT, VADER, Grok, the Reddit API, Supabase for the backend, and Next.js.
glad you asked actually. Just today, we were contemplating making it opensource, and releasing an API for people to use this data effectively other than just the front end. We are working on both and will revert back on the Open Source status via DM early next week.
No one mentioned it… Tradier has a friendlier API than IBKR.
I planned on the IBKR API but honestly, as a swing trader, I find their costs and API limits to be silly. Instead I went with a provider for tick data and process everything myself. It's easier for me to find plays, and then time when I want to enter and how much. I can also adjust and process the data on the fly vs having to rely on them.
Your alternative approach and the IBKR+Python approach are the same thing. Your Python code needs to connect to a broker's API, like IBKR's, in order to execute trades. And the broker will provide a data feed. The only reasons you would want a 3rd-party data feed are if the broker's feed is too expensive or too slow. For instance, IBKR's TWS API updates option quotes every 250 milliseconds and has limits on simultaneously streamed symbols, requiring you to pay more to exceed the limit: https://www.interactivebrokers.com/campus/ibkr-api-page/market-data-subscriptions/#market-data-lines There are quite a few broker/data feed APIs: IBKR, Lightspeed, TradeStation, Public, Tradier, Tastytrade, etc. Personally, I prefer IBKR because they have great order execution.
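One practical consequence of a ~250 ms refresh interval like the one mentioned above: polling faster than the feed refreshes just re-reads the same snapshot. A minimal, generic throttle sketch (not part of any broker SDK; the class and interval are illustrative, with an injectable clock so it can be tested deterministically):

```python
import time

class QuoteThrottle:
    """Gate quote reads to a feed's refresh interval so the client does not
    burn requests re-reading an unchanged snapshot. Illustrative only."""

    def __init__(self, interval: float = 0.25, clock=time.monotonic):
        self.interval = interval  # seconds between allowed polls
        self.clock = clock        # injectable for deterministic tests
        self._last = None         # time of last permitted poll

    def should_poll(self) -> bool:
        """True if at least `interval` seconds passed since the last poll."""
        now = self.clock()
        if self._last is None or now - self._last >= self.interval:
            self._last = now
            return True
        return False
```

A trading loop would call `should_poll()` before each quote read; the same idea applies whichever broker feed you use.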
Before making accusations maybe ask questions on what is going on. API trading where indicators can be used to create a bot that executes trades based on user inputs. No ai, no scamming. Peace be with you 🙏
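"Indicators used to create a bot that executes trades based on user inputs" typically means something like the sketch below: a generic moving-average crossover signal in Python. This is purely illustrative (the window lengths are arbitrary defaults, and it is not the commenter's actual bot).

```python
def sma(prices, n):
    """Simple moving average of the last n prices (None until enough data)."""
    if len(prices) < n:
        return None
    return sum(prices[-n:]) / n

def crossover_signal(prices, fast=3, slow=5):
    """Return 'buy' when the fast SMA crosses above the slow SMA on the
    latest bar, 'sell' on the opposite cross, else 'hold'."""
    if len(prices) <= slow:
        return "hold"
    f_now, s_now = sma(prices, fast), sma(prices, slow)
    f_prev, s_prev = sma(prices[:-1], fast), sma(prices[:-1], slow)
    if f_prev <= s_prev and f_now > s_now:
        return "buy"
    if f_prev >= s_prev and f_now < s_now:
        return "sell"
    return "hold"
```

A bot then maps the signal to a broker API order call; the indicator logic itself involves no AI at all.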
Not sure what you are doing, but selecting specific lots and multileg API's seem reasonable on both ATP and AT+. I use them alot.
I started trading options I guess about 8 years back, and started automating 4-5 years back when I heard of the TD Ameritrade API. The automation is not order execution, just evaluation of my multitude of positions. Bad years and good years, more good now, more bad in the past. This year I am barely matching SPX because I panicked on liberation day; otherwise the plan was good. I did it all while working full time. My goal was to spend no more than half an hour per day analyzing or adjusting positions. I spend more, but if I am forced to, I can get by with 15 minutes. And it does not need to be at a specific time; it could be any time of the day when I find time.

The challenge has been overstretching myself. At my peak I had 40-42 symbols because of diversification. What I learned is that many a time the adjustment requirements come when I am loaded at work, and then I make bad decisions, so I reduced the number of symbols. Haven't gone all the way to my nirvana of 3 symbols, but halfway there; when I see an adjustable opportunity, I can't resist. But yes, it can be done.

// A decade or more back, I was not working (by choice) and was actively trading based on technicals, and my mantra was: I want to meet people, and if I get a lunch meeting, even if I need to drive one hour back and forth, I would do that, and trading would come second. And I did that too for a couple of years. Trade evaluations happened every day, but meeting people came first. It is just a question of your mindset on how aggressively you want to trade.
They do, and the DOD Gemini contract will be supporting 3-4 million workers. Lots of room to expand on the current deal too. We are going to need more data storage and speed. The military has zettabytes of data just sitting on hard drives with nowhere to put it or analyze it. Additionally, once the API keys start getting handed out, automation will spike needs +500%. I can see the trends on Palantir already.
We've had a Domino's API for 10 years now; it only makes sense that HaaS replaces PaaS https://www.npmjs.com/package/dominos
With all of the new light oil (40 API) coming out of the permian basin, refiners actually need heavier oil to blend with it. Canadian and Venezuelan crudes serve that purpose. Also, heavy crude is normally catalytically cracked into lighter products yielding more barrels coming out of the refinery than go in leading to improved profits since heavy oil can be purchased at a discount.
So I build pharma plants for a living. Just finished detailed costing for about 4 MT/year of API for a commercial GLP-1. Cost to build that capacity is about 1 to 1.3 billion, with the differential really being how much CAPEX gets shifted to OPEX costs. 4 MT/year at the highest dosage of ~15 mg is just shy of 270M doses of anti-fatty magic medicine. 1 data center CAPEX spend is then roughly 200 MT/yr production (and I mean realistically... we could get some more scale efficiency), or 13.5B doses of anti-fatty juice. We could eliminate the fatties with 1 data center budget, and walk away with 60%+ margin.
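The back-of-envelope dose math above (note: here "API" means active pharmaceutical ingredient, not a software interface) can be checked in a few lines. The figures are the comment's own, not independent data:

```python
# 1 metric tonne = 1e6 g = 1e9 mg
MG_PER_TONNE = 1_000_000_000

def doses_per_year(tonnes: float, dose_mg: float) -> float:
    """Annual doses from a given API output at a fixed per-dose mass."""
    return tonnes * MG_PER_TONNE / dose_mg

small_plant = doses_per_year(4, 15)         # ~2.67e8, i.e. just shy of 270M
datacenter_scale = doses_per_year(200, 15)  # ~1.33e10, near the ~13.5B cited
```

So 4 MT/year at 15 mg per dose does land just under 270 million doses, and scaling to 200 MT/year gives on the order of 13 billion, consistent with the comment.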
Check out [MesoSim](https://docs.mesosim.io); it has an [AI Assistant](https://chatgpt.com/g/g-690516c500b8819191e154543a9a85a7-mesosim-ai-assistant) to help you with strategy development. Once you have a working strategy, you can use [MesoLive](https://docs.mesolive.io) to trade the strategies with IBKR, TastyTrade, or the built-in paper trading account. MesoLive has a web-based user interface, but trade automation is also possible through the MesoLive API (needs a FundPro subscription). To see what the AI agent is capable of, check out this [blog post](https://blog.deltaray.io/rhino-options-strategy); the base strategy was created by the agent from presentations of the Rhino Strategy. Full disclosure: I'm the owner of the service.
API yesterday estimated a crude oil draw of 9.3 mb last week, but today the EIA report claims only a 1.3 mb draw. Yeah, something reeks here.
Gemini API pretty much unusable for me ATM despite being a paying customer 🤡
Here's how the bubble pops. A prominent article in the Wall Street Journal tells the story of a national bank. They spent tens of millions on ChatGPT, Copilot, and API tokens trying to stay competitive. They hire a small team of specialized consultants who set up an on-premises LLM. They self-host and train the model on their data. The end result: they get better, more consistent value. They cancel all their AI cloud and infrastructure spend. The roadmap is there. The future of AI is smaller, and the Data-Center-to-the-Moon race culture is the next financial crisis in the making.
Ah, you're talking about speech-to-text APIs integrated into web applications: APIs which imo weren't accurate or reliable until like 5 years ago, when Watson came out with theirs. But what I mean is being able to semantically tell it something and have it understand without me needing to spell it out. That is what makes it more efficient. What would be generative is if I wanted to go to 3 cities in Europe and asked what's the most efficient way to hit each place in the time frame I give it. I would expect it to give me suggestions; I would expect it to know if a train was faster, or if renting a car was more cost-effective. I would expect it to look into all of these options, because that would be as close to having a human travel agent as possible, not just a speech-to-text input fed directly into an airline's booking API.
But prices could easily go down a lot more than 50% for tokens; prices are 1/1000th of what they were when I first started using the OpenAI API during the 3.5-turbo era. I think what we'd need to do in the future is say "late-2025 AI was good enough" and, instead of passing frontier models to consumers, largely use last year's technology.
Bruh, your "model" is basically an API wrapper for FedWatch. Nice marketing attempt though.
How are you getting likes on this? Not one of my sentences noted META. I noted APIs, sure. However, every serious implementation of AI, API or not, requires some oversight. FOR NOW. It's the concept of AI as a coworker, or human in the loop. Get used to it. FOR NOW! Moreover, just because one automation fails doesn't invalidate all. Stated another way for your benefit: being aware of one bad AI tool doesn't invalidate all others. MAYBE your perspective is limited. To be clear, I've seen many people lose their role in the past few months due to AI. Wait, I didn't just SEE it. I made it happen! AI is immensely capable in many ways, and its utility is expanding weekly. Finally, it's myopic folks like you that make Reddit unusable. You grab one concept or word and run with it. That's all you can handle. You disregard an entire paragraph someone wrote, fixate on one line, and ignore the context. It's called logic chopping, and you're the first boomer I'd fire from the warehouse job you have.
To be fair, it managed to intercept Robinhood API calls and grab 10,000 options quotes on its own.
> Anyone can do it. Hahahahahahahahaha Fuck, this is a good one. Just wait until the API you're using changes behavior without you knowing and your fucking dumb thing takes a shit and you will literally have no idea why.
>OpenAI on Thursday announced its most advanced artificial intelligence model, GPT-5.2, and said it’s the best offering yet for everyday professional use. >The model is better than predecessors at creating spreadsheets, building presentations, perceiving images, writing code and understanding long context, OpenAI said. It will be available starting Thursday within OpenAI’s ChatGPT chatbot and its application programming interface (API).
This looks like a classic “story is real, market still doesn’t care” microcap setup, but the key question is contract quality, not just logos and ARR. Main thing I’d dig into is how much of that 15m ARR is true recurring SaaS vs usage-based, and what the minimums and termination clauses are on those 5m+ contracts. A single 8m deal that can be walked away from in 12 months is very different from a 5-year take-or-pay. Also: gross margin trajectory and implementation cost. If each new logo needs a semi-custom integration and on-site team, they might be buying growth at thin margins. I’d want cohort-level data: expansion vs churn per customer, and how many pilots convert to full contracts. On comps, I’d anchor more on high-friction enterprise logistics names like Descartes or even MercuryGate than general SaaS. Tools like Snowflake, Palantir, and DreamFactory-style API layers are great analogs for how sticky data-integrated platforms can become once embedded in customer workflows. So yeah, the mismatch is interesting, but it only works if those big contracts are durable and gross margins scale up, not down.
QE was awesome post-COVID. Now we get it at all-time highs! This is not crazy at all! /s But I think we have learned that the Fed was super nervous about the clusters of spikes in SOFR rates, which was a genuine liquidity issue, possibly shutdown related; we'll never know. I took that as a signal that the Fed would surely cut, not that they would start buying treasuries. I'm not knocking gold, silver, and copper at all, I have exposure to all, but I think the vast majority of loan-created money will go to the place it has been reliably going the last three years: AI infra. Even Jamie D is blunting JPM earnings to invest in that vertical. For the bubble bears: if you've been bearish all along, you're bearish for the same reasons now that you were three years ago: the train might stop. But your bias shouldn't be "prove the train won't stop and I'll get on"; it should be "prove to me that it will stop and I'll start to disembark." The Fed has the back of the stock market, even though the stock market absolutely does not need it. Corporate bonds are fine, outside OAI-adjacent stuff. The dollar hasn't tanked. Inflation is not above 3%... yet. Monetization of AI is best measured by API calls. Companies pay for those. That is exploding for every player, except maybe OAI losing some share to GOOG. Inference is the monetization wave and is now most compute demand. ASICs are probably the future, but NVDA has some ASICs built into GB300s already for long context windows. Crucially, there is a memory shortage now. If models stay the same size, all those new API calls need more memory to run. Except models don't stay the same size. Sparse mixture-of-experts models still improve when they have a larger RAM footprint, and quantization of large models reliably increases hallucinations and degrades performance. That memory will come from MU, SK Hynix, and Samsung. EWY has an uber-low PE and is 40% Hynix and Samsung.
This is the safest Sharpe-ratio bet in the world, but it won't pay out as much as MU. If you held memory stocks in past shortages, you know what a wild ride that can be. This is a secular shortage. Fabric (ethernet/NVLink/optics) and GPUs might still be the bottleneck for training, but probably not; training was the bottleneck pre-2025. The current and destination bottleneck is high-bandwidth RAM, and the fundamentals of these companies scream bottleneck. Anyway, crystal ball comment, not advice, yada yada. Feel free to come back and mock me if I am wrong.
Probably not real people; they surely have API access at this point.
For option Greeks specifically, I built FastGreeks API (fastgreeks.com), a REST endpoint that returns Delta, Gamma, Theta, Vega, and Rho. Example call: POST /greeks { "S": 100, "K": 105, "T": 0.5, "r": 0.05, "sigma": 0.2 } Returns all Greeks in ~10ms. The free tier gives you 1k calculations/month to test. It also supports batch processing if you need to price multiple options at once (up to 10k per request).
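For context on what a Greeks endpoint like this computes, here is the standard textbook Black-Scholes calculation for a European call (no dividends) in stdlib Python. This is a generic formula for illustration, not FastGreeks' actual implementation:

```python
from math import log, sqrt, exp, pi, erf

def _norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def _norm_pdf(x: float) -> float:
    """Standard normal density."""
    return exp(-x * x / 2.0) / sqrt(2.0 * pi)

def call_greeks(S, K, T, r, sigma):
    """Black-Scholes Greeks for a European call: spot S, strike K,
    time to expiry T (years), risk-free rate r, volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return {
        "delta": _norm_cdf(d1),
        "gamma": _norm_pdf(d1) / (S * sigma * sqrt(T)),
        "theta": (-S * _norm_pdf(d1) * sigma / (2 * sqrt(T))
                  - r * K * exp(-r * T) * _norm_cdf(d2)),
        "vega": S * _norm_pdf(d1) * sqrt(T),
        "rho": K * T * exp(-r * T) * _norm_cdf(d2),
    }
```

With the example inputs from the call above (S=100, K=105, T=0.5, r=0.05, sigma=0.2), the call delta works out to roughly 0.46.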
~WHAT IF~ What if all the brokers sell live API access to the MM so they can see exactly when I’m about to trade? ~WHAT IF~ 🪄🧌
That Palantir’s products are not really meant for people that can warehouse their own data, analyze it, and run LLM layers over it for XYZ output. They pitched my firm on Twitter analytics many years ago; it was an API plugged into a slick UI. Government contracts here do not signify any “high-level innovation” - Anduril has yet to mass produce. Source: this is literally what I fucking do
I actually use automation and hit their API :)
OK, I found a solution, so I'll post it if anyone else ever runs into this problem. The website [kibot.com](http://kibot.com) offers a free (if you use the API) solution to finding raw unadjusted prices on any given day. I wouldn't doubt there are other methods, but this was the only one I found, and it was quite simple to use.
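APIs like this typically return daily price history as plain CSV text. A minimal parsing sketch in Python (the column layout assumed here, Date/Open/High/Low/Close/Volume, is an illustration only; check the provider's docs for the actual format):

```python
import csv, io

def parse_daily_csv(text: str) -> list[dict]:
    """Parse CSV lines of daily unadjusted prices into dicts, skipping
    blank or malformed rows. Column order is an assumed example."""
    rows = []
    for rec in csv.reader(io.StringIO(text)):
        if len(rec) != 6:
            continue
        date, o, h, low, c, v = rec
        rows.append({"date": date, "open": float(o), "high": float(h),
                     "low": float(low), "close": float(c), "volume": int(v)})
    return rows
```

From there, fetching the URL with any HTTP client and passing the body through this parser gives you the raw unadjusted prices for the day you need.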
Yeah you are right! English isn’t my native language so I couldn’t come up with a better word for describing applications/models -> Cloud/API -> TPUs. Do you have a better word?
Everyone I know is doing AI assisted coding now. All of our developers at our fintech startup use it and they are ludicrously smart. I think coding assistance is actually the most practical and transparent AI value-add for businesses. AI art looks like AI art, AI writing is full of em-dashes — and ellipses … But AI code just looks like code. I’m more of a sysadmin so not much of a developer, but I find AI assistance really helpful for writing some code to parse through a complex data structure returned by an API call, for example. However, I only ask it to write functions and snippets and then I massage them and glue them in. I think that’s pretty standard practice.
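The "parse a complex data structure returned by an API call" chore mentioned above is exactly the kind of snippet these assistants produce. A generic hand-rolled example of one such helper (no specific API assumed):

```python
def find_key(obj, key):
    """Recursively collect every value stored under `key` anywhere in a
    nested dict/list structure, e.g. a deeply nested JSON API response."""
    found = []
    if isinstance(obj, dict):
        for k, v in obj.items():
            if k == key:
                found.append(v)
            found.extend(find_key(v, key))  # also search inside the value
    elif isinstance(obj, list):
        for item in obj:
            found.extend(find_key(item, key))
    return found
```

Snippets at this scale are easy to review and glue in, which is the "write functions and massage them" workflow described above.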
It did work, though it seems automod removed the bot response. The RemindMe bot can be a bit slow to respond these days due to Reddit's API changes.
Maybe. However if you don't want to use LLMs via an API, NVidia's CUDA is still pretty much the only game in town. It definitely *is* a sign of strength that Google is able to compete with both ChatGPT and NVidia on their home turfs,*while* keeping the original money printing machine alive. I don't really know yet how big of a threat the TPUs are to e.g. NVidia. The recent deals could've just been hedges from big cloud compute users. How much can they scale up the production? What are the profit margins? NVidia also has kind of been consistently excellent at what they do. Google kind of sucks in most of what they offer. On the other hand, we're still stuck with them, so that's probably too harsh of a statement.
I nuked my twelve year old account during the spez API bullshit because I figured that was gonna do the whole idea of profitable stocks, in. Whoopsie. Classic dumbass move from me!!
1) No. The benefit would be minimal to nonexistent, and moving investments between brokerage firms carries some risk. Over the course of the transaction, you could incur real and/or opportunity losses on your assets, because I doubt you will be able to make the transfer between brokerages in-kind. Schwab is a very good brokerage, and it has an excellent trading platform and API that you may want to make use of someday. 2) If you set up dividend reinvestment on your positions, any dividends from your positions will automatically get reinvested, in fractional or whole share amounts. If you really insist on having every nickel invested at all times, however, you can buy shares in something that has a lower share price. I don't think that makes sense, though. A better way to roll would be to allow your cash position to grow to the point you can purchase one or more shares in something you actually want to own, then buy those shares after the equity pulls back in share price. Your questions are good for the purpose of confirming or refuting your decision. Challenging everything is a good way to become more confident and sure about your decisions. Don't feel shy about continuing to do that in perpetuity.
> Your end users are still using excel to analyze the data which is why excel isn't being replaced but being used for its actual purpose. The databases being queried by analysts is still being outputted into excel sheets and being analyzed in excel sheets. In your organization you can just reach out to anyone in finance / accounting / FP&A / etc. And their most used application is most likely still excel So originally this was the case, but we've since built all of the reports/analysis people did in Excel into the system. This ensures common data, standardized methodology, and standardized reporting. Excel used to be relied upon for exports/imports, but we've moved away from that to an API- and microservice-based data loading system.
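The export/import replacement described above usually means the load step becomes an API call with one standardized schema instead of ad-hoc spreadsheet imports. A tiny sketch of that idea (the envelope shape and field names here are hypothetical):

```python
import json

def rows_to_payload(rows: list[dict], dataset: str) -> str:
    """Package tabular rows (what used to be an Excel export) as a JSON
    body for a hypothetical data-loading microservice endpoint."""
    return json.dumps({
        "dataset": dataset,       # which standardized dataset this feeds
        "row_count": len(rows),   # lets the service sanity-check the load
        "rows": rows,
    }, sort_keys=True)
```

The payload would then be POSTed to the loading service; the win is that every team submits the same schema rather than its own spreadsheet layout.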
Yeah, the auto mod removed my post, so I guess I got frustrated. Definitely, that's great to hear about. I'm just entering this world after keeping the investing and technical worlds separate for a while, and I'm amazed by the richness of the resources available. Applying for the Schwab API now, thanks for the tip.
There's really nothing fledgling about what you are describing. That stuff has been around for decades. Re: hedge fund - you mean like r/quant? No one is ever going to share their edge, so you aren't going to get details from people. Re: APIs - yeah, those tend to be brokerage or tool-provider specific. Those topics get discussed occasionally on r/investing, but there are specific subreddits for specific tools and brokers. For example, if you want to talk about Schwab's API, you can ask here or in r/Schwab. If you have questions about a tool like QuantConnect, there's r/QuantConnect. Or TradingView - there's r/TradingView
It likely exists - you just need to elaborate on what you mean. Are you asking about algo development? Quant analysis? Back-testing? What kind of tools? Brokerage API usage? Those topics come up in r/investing and there are smaller subreddits dedicated to specific niche areas and tools.