Agora Inc (API)

Mentions (24Hr): 2 (0.00% today)

Reddit Posts

r/wallstreetbets: Chat with Earnings Call?
r/investing: Download dataset of stock prices X tickers for yesterday?
r/investing: Sea Change: Value Investing
r/Wallstreetbetsnew: Tech market brings important development opportunities, AIGC is firmly top 1 in the current technology field
r/pennystocks: Tech market brings important development opportunities, AIGC is firmly top 1 in the current technology field
r/WallStreetbetsELITE: AIGC market brings important development opportunities, artificial intelligence technology has been developing
r/pennystocks: Avricore Health - AVCR.V making waves in Pharmacy Point of Care Testing! CEO interview this evening as well.
r/wallstreetbets: Sea Change: Value Investing
r/investing: API KEY and robinhood dividends
r/pennystocks: OTC : KWIK Shareholder Letter January 3, 2024
r/options: SPX 0DTE Strategy Built
r/Wallstreetbetsnew: The commercialization of multimodal models is emerging, Gemini now appears to exceed ChatGPT
r/pennystocks: The commercialization of multimodal models is emerging, Gemini now appears to exceed ChatGPT
r/options: Best API platform for End of day option pricing
r/WallStreetbetsELITE: Why Microsoft's gross margins are going brrr (up 1.89% QoQ).
r/wallstreetbets: Why Microsoft's gross margins are expanding (up 1.89% QoQ).
r/StockMarket: Why Microsoft's gross margins are expanding (up 1.89% QoQ).
r/stocks: Why Microsoft's margins are expanding.
r/options: Interactive brokers or Schwab
r/wallstreetbets: Reddit IPO
r/wallstreetbets: Google's AI project "Gemini" shipped, and so far it looks better than GPT4
r/stocks: US Broker Recommendation with a market that allows both longs/shorts
r/investing: API provider for premarket data
r/Wallstreetbetsnew: A Littel DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform
r/investing: Best API for grabbing historical financial statement data to compare across companies.
r/StockMarket: Seeking Free Advance/Decline, NH/NL Data - Python API?
r/pennystocks: A Littel DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform
r/wallstreetbetsOGs: A Littel DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform
r/WallStreetbetsELITE: A Littel DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform
r/Shortsqueeze: A Littel DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform
r/smallstreetbets: A Littel DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform
r/RobinHoodPennyStocks: A Littel DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform
r/stocks: Delving Deeper into Benzinga Pro: Does the Subscription Include Full API Access?
r/investing: Past and future list of investor (analyst) dates?
r/pennystocks: Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration
r/Wallstreetbetsnew: Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration
r/RobinHoodPennyStocks: Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration
r/pennystocks: Aduro Clean Technologies Inc. Research Update
r/WallStreetbetsELITE: Aduro Clean Technologies Inc. Research Update
r/options: Option Chain REST APIs w/ Greeks and Beta Weighting
r/investing: As an asset manager, why wouldn’t you use Verity?
r/wallstreetbets: Nasdaq $ZG (Zillow) EPS not accurate?
r/pennystocks: $VERS Upcoming Webinar: Introduction and Demonstration of Genius
r/StockMarket: Comps and Precedents: API Help
r/StockMarket: UsDebtClock.org is a fake website
r/wallstreetbets: Are there pre-built bull/bear systems for 5-10m period QQQ / SPY day trades?
r/Shortsqueeze: Short Squeeze is Reopened. Play Nice.
r/stocks: Your favourite place for stock data
r/options: Created options trading bot with Interactive Brokers API
r/investing: What is driving oil prices down this week?
r/weedstocks: Leafly Announces New API for Order Integration($LFLY)
r/stocks: Data mapping tickers to sector / industry?
r/Wallstreetbetsnew: Support In View For USOIL !
r/wallstreetbets: Is Unity going to Zero? - Why they just killed their business model.
r/options: Need Help Deciding About Limex API Trading Contest
r/investing: Looking for affordable API to fetch specific historical stock market data
r/options: Paper trading with API?
r/options: Where do sites like Unusual Whales scrape their data from?
r/stocks: Twilio Q2 2023: A Mixed Bag with Strong Revenue Growth Amid Stock Price Challenges
r/StockMarket: Reference for S&P500 Companies by Year?
r/SPACs: [DIY Filing Alerts] Part 3 of 3: Building the Script and Automating Your Alerts
r/stocks: Know The Company - Okta
r/SPACs: [DIY Filing Alerts] Part 2: Emailing Today's Filings
r/wallstreetbetsOGs: This prized $PGY doesn't need lipstick (an amalgamation of the DD's)
r/SPACs: [DIY Filing Alerts] Part 1: Working with the SEC API
r/options: API or Dataset that shows intraday price movement for Options Bid/Ask
r/wallstreetbets: [Newbie] Bought Microsoft shares at 250 mainly as see value in ChatGPT. I think I'll hold for at least +6 months but I'd like your thoughts.
r/stocks: Crude Oil Soars Near YTD Highs On Largest Single-Week Crude Inventory Crash In Years
r/stocks: Anyone else bullish about $GOOGL Web Integrity API?
r/investing: I found this trading tool thats just scraping all of our comments and running them through ChatGPT to get our sentiment on different stocks. Isnt this a violation of reddits new API rules?
r/options: where to fetch crypto option data
r/wallstreetbets: I’m Building a Free Fundamental Stock Data API You Can Use for Projects and Analysis
r/stocks: Fundamental Stock Data for Your Projects and Analysis
r/StockMarket: Fundamental Stock Data for Your Projects and Analysis
r/stocks: Meta, Microsoft and Amazon team up on maps project to crack Apple-Google duopoly
r/wallstreetbets: Pictures say it all. Robinhood is shady AF.
r/options: URGENT - Audit Your Transactions: Broker Alters Orders without Permission
r/StockMarket: My AI momentum trading journey just started. Dumping $3k into an automated trading strategy guided by ChatGPT. Am I gonna make it
r/StockMarket: I’m Building a Free API for Stock Fundamentals
r/wallstreetbets: The AI trading journey begins. Throwing $3k into automated trading strategies. Will I eat a bag of dicks? Roast me if you must
r/StockMarket: I made a free & unique spreadsheet that removes stock prices to help you invest like Warren Buffett (V2)
r/options: To recalculate historical options data from CBOE, to find IVs at moment of trades, what int rate?
r/pennystocks: WiMi Hologram Cloud Proposes A New Lightweight Decentralized Application Technical Solution Based on IPFS
r/wallstreetbets: $SSTK Shutterstock - OpenAI ChatGBT partnership - Images, Photos, & Videos
r/options: Is there really no better way to track open + closed positions without multiple apps?
r/options: List of Platforms (Not Brokers) for advanced option trading
r/investing: anyone using Alpaca for long term investing?
r/investing: Financial API grouped by industry
r/WallStreetbetsELITE: Utopia P2P is a great application that needs NO KYC to safeguard your data !
r/WallStreetbetsELITE: Utopia P2P supports API access and CHAT GPT
r/options: IV across exchanges
r/options: Historical Greeks?
r/wallstreetbets: Stepping Ahead with the Future of Digital Assets
r/wallstreetbets: An Unexpected Ally in the Crypto Battlefield
r/stocks: Where can I find financial reports archives?
r/WallStreetbetsELITE: Utopia P2P has now an airdrop for all Utopians
r/stocks: Microsoft’s stock hits record after executives predict $10 billion in annual A.I. revenue
r/wallstreetbets: Reddit IPO - A Critical Examination of Reddit's Business Model and User Approach
r/wallstreetbets: Reddit stands by controversial API changes as situation worsens

Mentions

Nice one buddy. I've built only a dashboard one and a trading app with the Tastytrade API to be able to day trade options faster, without having to select contracts and go through the confirmation screens before submitting. Well done exposing the URL to the internet and accessing it on your phone.

Mentions:#API

Okay, but that may be a hallucination, and the way you ask is also very important. It isn’t nefariously trying to mislead you, nor is it being told to do so to harm you. Get a spreadsheet with several examples of the problem solved and show that to it. Then tell it what problem you are trying to solve and ask it to make a program for you. If you feed it an API key from IBKR, that program can check for you every time you ask.

Mentions:#API#IBKR

If you’re looking for an API that actually calculates GEX levels for any stock and index, have a look here: https://flashalpha.com/docs/playground Docs: https://flashalpha.com/docs/lab-api-gex And this requires login, but it’s free. Works during market hours as it’s not cached: https://flashalpha.com/stock/tsla

Mentions:#API

Bloomberg's BQL is great but yeah, the cost is brutal for anyone not at a big firm. Options Metrics is worth checking out. They let you filter by the usual stuff like strikes, expirations, Greeks, IV, and you can save your favorite scans which is super handy when you're running the same strategies repeatedly for clients. The free tier gives you delayed data so it's fine for research and backtesting, but for live trading you'd need the Pro tier to get real-time numbers. Is that what you're after or are you wanting an API or something?

Mentions:#API

/u/therpgrad hooked me up with some GPUs, and I bought a fancy rig to go under them, so there's no API costs or anything. will just be keeping that rig alive and managing the vector db longterm.

Mentions:#API

I use Unusual Whales for all this data. I actually get it through their API but I know they have a UI with custom filters. You can get institutional flow there as well.

Mentions:#API

Unusual Whales is the best at this imo. You can set up watchlists and monitor for whatever unusual activity you'd like. I like their API a lot too. I automated alerts and send them right to my phone. Next step I want to train my agent to analyze these signals and tell me what to do.

Mentions:#API

Yeah no doubt, I probably could have saved myself some grief early on if I'd gotten more hands on with the research. I think Schwab is lacking documentation on their websocket/streaming data though. And it took some experimenting to figure out what we could get away with in terms of string/url lengths. But the API stuff seems mostly well documented in Schwab's dev portal

Mentions:#API

I've had good luck just asking Claude code to do the research itself. It was able to figure out how to get everything working no problem, just need to specify to ignore any source that refers to TD Ameritrade's API.

Mentions:#API

As noted separately - register as a Schwab developer and renew the token weekly. Note - Claude doesn’t know API details from Schwab. With your developer account you will need to copy/paste the details for the api endpoints of interest so Claude isn’t guessing and you’re getting frustrated. :) It will work quickly and consistently this way.

Mentions:#API

Great service - you need an account with holdings (no min), register as a Schwab developer, approve a token use, provide Claude with API details, create scripts to run daily or twice daily. You can use polygon/massive for other info (news eg) and free yahoo for sectors since some miss from Schwab. This is direct Schwab developer- separate from ThinkorSwim although you can also track with scripting there but less control than api directly.

Mentions:#API

Unusual Whales has all 13F filings for tracking these guys. I think they added it to the API as well. I'm going to pull it into my model with their MCP server.

Mentions:#API

I have no programming experience so I asked claude code to explain the API a little bit. If you want to know something specific let me know and I'll pass it along. This stuff is very easy to vibe code for personal use.

> Schwab's Trader API (formerly TD Ameritrade's API) gives you programmatic access to your brokerage account — positions, balances, market data, option chains, and order placement. Authentication uses OAuth 2.0, which is the biggest hurdle to getting started. You register an app on Schwab's developer portal to get a client ID and secret, then go through a browser-based login flow where the user authorizes your app. This gives you an access token (valid ~30 minutes) and a refresh token (valid 7 days). The refresh token is the lifeline — if it expires because your app was offline for a week, the user has to manually re-authenticate through the browser. You'll want to store these tokens somewhere persistent (we use PostgreSQL) and build automatic refresh logic so your access token stays valid during market hours.

> The data available is comprehensive. Account endpoints give you positions, balances, and order history. The quotes endpoint lets you batch up to ~245 symbols per request for real-time quotes (price, bid/ask, volume, etc.). The option chain endpoint is particularly powerful — you pass a ticker and get back every available strike and expiration with greeks, bid/ask, open interest, and volume. Strike keys come back as decimal strings like "150.0" rather than integers, which is a quirk you need to handle. For order placement, you can submit, replace, and cancel equity and options orders with full control over order type, duration, price, and quantity. There's also a WebSocket streaming service (ACCT_ACTIVITY) that pushes real-time account events — order fills, cancellations, and status changes — with execution details like the exchange venue, route, and commission that you won't get from the REST API alone.

> The main restrictions are rate limiting and token management. You're capped at 120 API requests per minute, which matters when you're iterating across many tickers (fetching option chains, quotes, etc.) — we use 200ms delays between calls to stay under the limit. The quotes endpoint has a URL length limit that caps you at roughly 245 symbols per batch. The streaming WebSocket requires a separate authentication flow using "user preferences" to get the socket URL and credentials, and it needs periodic reconnection since Schwab rotates the session. The 7-day refresh token expiry means you can't just deploy and forget — if your server goes down for a week, someone has to log in again manually.

> For infrastructure, at minimum you need a server that can run persistently during market hours (not just serverless functions) if you want streaming data or scheduled jobs. We use a Node.js process on Railway for the WebSocket stream service and cron jobs, with Next.js on Vercel for the web UI and REST API routes. A database is essential for storing OAuth tokens, order records, and any historical data you want to track — we use Neon PostgreSQL with Prisma. If you're just doing basic things like pulling quotes or account data, you could get away with a simpler setup, but once you start placing orders or monitoring fills in real-time, you need the persistent server, reliable token storage, and a database to track state. The streaming service in particular can't run in a serverless environment since it maintains a long-lived WebSocket connection.

Mentions:#API
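The token lifecycle and rate limit described in that comment can be sketched in a few lines of Python. This is a minimal illustration of the timing logic only, not Schwab's actual client: the class names and the early-refresh margin are invented for the sketch, and the HTTP calls (token exchange, quote requests) are deliberately left out.

```python
import time


class TokenManager:
    """Tracks access-token expiry and decides when to refresh.

    The refresh itself (an HTTP POST to the token endpoint) is not shown;
    see the broker's developer portal for the real endpoint and payload.
    """

    def __init__(self, access_token, expires_in, refresh_margin=300):
        self.access_token = access_token
        # Refresh a few minutes early so no request rides an expired token.
        self.expires_at = time.time() + expires_in
        self.refresh_margin = refresh_margin

    def needs_refresh(self, now=None):
        now = time.time() if now is None else now
        return now >= self.expires_at - self.refresh_margin


class RateLimiter:
    """Spaces calls to stay under a per-minute request cap.

    200 ms between calls (the delay the comment mentions) keeps you well
    under 120 requests/minute.
    """

    def __init__(self, min_interval=0.2):
        self.min_interval = min_interval
        self.last_call = None

    def wait_time(self, now):
        """Seconds to sleep before the next call is allowed (pure, testable)."""
        if self.last_call is None:
            return 0.0
        return max(0.0, self.last_call + self.min_interval - now)

    def record(self, now):
        self.last_call = now
```

The point of the early-refresh margin is that a request issued just before expiry never rides a dead token; the limiter simply enforces the fixed spacing between calls.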

Polygon is Massive now, just one service. Some limited use is free, but you'd want to use your broker's API to get the best results. The Yahoo Finance API is also quite useful. If your broker has a good API you can vibe code quite a ways with that and yfinance.

Mentions:#API

DLocal operates a one-stop payments platform enabling companies like Amazon, Microsoft, and Spotify to process transactions across more than 40 emerging markets. Its strength lies in simplifying complex local payment networks through a single API.

Mentions:#API

API Error: 500 {"type":"error","error":{"type":"api_error","message":"Internal server error"},"request_id":"req_011CYV8nyJ6D8LcyoKbGNSTv"}

Mentions:#API

Agree. This is the top comment of the post: > If you listened closely Powell said that the economy would be in a really good spot if Trump didn’t do tariffs and going to war with Iran. I can tell you in financial services, it's an absolute bloodbath. Full stop. Massive outsourcing/offshoring for 5 years. Non-stop re-orgs. The tech is borderline failing at this point. Leaderships only move is offshore and let go of workers, but they have to turnover after 4 years because you can literally see in the general ledger they're destroying the business line. It's obvious the offshore workers have zero background in any of this and significant proportion have fake resumes. All the work is through contractual agreements with IT Consultancies in India, so the bureaucracy is insane. For example, offshore workers aren't allowed access to certain databases. But, there's a mandate to offshore tons of the data analyst/reporting work. The question is: how will they build the reports if they can't access the databases? Well, we build a new staging area. Then build the ETL pipeline. Then load into a new database they can access. Then they query from that database to return CSVs in their local environment. Then run the python scripts. We basically spent 2 years building out the jankiest data pipelines in history for them to get all the reporting wrong. This is the "automation" the MBAs have been preaching about. I'm convinced it's some sort of corporate raid. The only thing holding the company together are COBOL mainframes from the 1980s, which a rogue API from the offshore dev teams recently took down for half a day...

Mentions:#API

WSB take: if AI agents really replace apps, then the money is in whoever owns distribution (OS layer) and whoever owns the "agent to API" marketplace. Everyone else becomes a commoditized backend. Non-meme take: reliability and permissions are going to decide who wins. An agent that can do 80% of tasks but sometimes clicks the wrong thing is dead on arrival. I've been following some practical agent reliability/evals stuff here: https://www.agentixlabs.com/blog/

Mentions:#OS#API

You could try pulling delayed market data via Google Finance or an API like Alpha Vantage, then update Sheets automatically.

Mentions:#API
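As a concrete sketch of the Alpha Vantage route mentioned above: GLOBAL_QUOTE is Alpha Vantage's free quote function (delayed/end-of-day data on the free tier). The Sheets half (Apps Script or the Sheets API) is out of scope here, and you need your own API key; the `"demo"` key below is Alpha Vantage's public sample key and only works for a few symbols.

```python
import json
import urllib.request
from urllib.parse import urlencode

BASE = "https://www.alphavantage.co/query"


def quote_url(symbol, api_key):
    """Build a GLOBAL_QUOTE request URL for one symbol."""
    return BASE + "?" + urlencode({
        "function": "GLOBAL_QUOTE",
        "symbol": symbol,
        "apikey": api_key,
    })


def fetch_quote(symbol, api_key):
    """Fetch and parse the quote JSON (network call; needs a real key)."""
    with urllib.request.urlopen(quote_url(symbol, api_key)) as resp:
        return json.load(resp)
```

From there, a scheduled script can write the parsed fields into a sheet row on whatever cadence the free-tier rate limits allow.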

Interesting project! Some might prefer raw sentiment or positioning data over just news, could complement your API.

Mentions:#API

Honestly I think a lot of people are still underestimating how much infra side of finance is changing, not just stocks. Been following a few industry reports lately (especially from FinanceFeeds) and a big theme is how brokers are quietly moving towards multi-asset + API-driven platforms. Retail still sees “stocks vs crypto”, but backend is becoming way more unified. If that plays out, companies enabling infra (liquidity, APIs, compliance automation) might outperform typical retail-facing apps. Curious if anyone else is tracking this angle or still focused only on equities?

Mentions:#API

Value play. This is a 1 year+ hold at least. The biggest potential, as stated by the CEO on the latest conference call, is the onshoring of API manufacturing into the US, which the CEO said is run and approved at the White House level...

Mentions:#API

I run a somewhat similar correlation strat for NQ. The issue usually isn't the code, it's the cost of the data feeds. Getting reliable, granular options flow and real-time Greeks via API is expensive af.

Mentions:#API

Still waiting on the guy to have a conversation with this brokerage integrated API LLM agent and accidentally blow up his account due to an AI hallucination. 🤡

Mentions:#API

Exactly, just take a look at the pricing of bare-bones H100s and go all the way up to serverless/model API providers; it literally goes from 1.5/hr on vast.ai to 10/hr for HF/AWS serverless.

Mentions:#API#HF

Now the platforms are not only rule-based. They have MCP+LLM, which can easily do those event-driven analyses. Nice ones need descriptions, though. The API you are talking about is more of an MCP layer.

Mentions:#API

Public Brokerage has a decent API, and pretty good rebate

Mentions:#API

When people compare CRM - or other CRM/ERP products and companies within this field - to ServiceNow and praise ServiceNow: have these people worked with ServiceNow before, or do they mostly view ServiceNow as this magnificent workflow automator? ERPs and CRMs appear slow because they are handling immense amounts of data, corresponding to the sometimes data-heavy loads and requests users make. I have worked with ServiceNow before. All I see is a glorified ticketing system that could just as easily be threatened by competitors. Their API structure is decent, but not more than that. Are people willing to bet on a general plug-and-play API request software company that doesn’t have any specific moat, at least as far as I can tell? Sorry, I just don’t buy it. My humble opinion; I really don’t get the craze about ServiceNow, so please enlighten me.

Mentions:#CRM#API

>I lose money through API keys trying to vibecode a trading bot somehow this puts everything else you said to shame lmao

Mentions:#API

I lose money in the market. I lose money through API keys trying to vibecode a trading bot. I lose money shorting oil. I lose money on Polymarket bets.

Mentions:#API

For options specifically, there’s no clean copy-trading solution like forex/futures. Most trade copiers are built for MT4/MT5 or futures platforms, not options chains with strikes/expiries. So in practice, options traders either use broker-level multi-account tools (limited) or manually execute. Even prop setups like FundingRock don’t support options, so you won’t get copier-style scaling there either. If you need true mirroring, you’re basically looking at custom API builds, not plug-and-play tools.

Mentions:#MT#API

Memes aside, they can easily achieve profitability by 2030. Ads are a potential $250+ Billion a year in annual revenue that they are only just starting to explore, and their enterprise API usage is growing rapidly.

Mentions:#API

To grow, they still need new customers. The switching cost for Salesforce goes both ways. It's expensive to switch from, but you could also argue that it's even more expensive to switch to. Source: I just led a full setup and rollout of Sales Cloud a few years ago. It cost about 5x what was originally estimated by our Salesforce account manager once you factor in consultants and the new team members that are necessary for a Salesforce product to work properly and integrate into our existing ERP software. Leadership was blindsided and the rollout scope became extremely narrow in order to get a working product to users. As soon as the contract is up, we're out and will never be coming back. I will say, their APIs are a little crazy the way they're structured (so many separate APIs), but the one thing they do really well is roll out changes in a way that makes it easy to maintain existing integrations. My ERP vendor maintains 1 live API version, so when they change something it instantly breaks existing connections. At least SF rolls them out in multiple versions and deprecates old versions with plenty of warning, so you have time to fix them before they break.

Mentions:#API#SF

Thank you. Would this be the right approach?

1. Using Barchart OnDemand (15-min delayed / paid real-time). Barchart Premier gives you access to the site's tools, but the OnDemand API is technically a separate service (though they often have free tiers for small volumes).
• Endpoint: for options, you typically use getFuturesOptions or a custom getData query.
• The script: Google Sheets cannot natively scrape Barchart's web interface easily because the site uses scripts to load data (which breaks IMPORTXML). You must use Apps Script.
Basic setup:
1. In Google Sheets, go to Extensions > Apps Script.
2. Use the UrlFetchApp.fetch command to call the Barchart API URL.
3. Parse the JSON and write it to your cells.

2. The Tradier API method (real-time). If you have a Tradier brokerage account, you can get unlimited real-time data for free. This is often the preferred "clean" method for Sheets power users.
• Endpoint: https://api.tradier.com/v1/markets/options/chains
• Request logic: you send a GET request with your API key in the header, specify the symbol and expiration, and the API returns a clean JSON of the entire chain.

Mentions:#API
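To make the Tradier half of that approach concrete, here is a hedged sketch that only builds the request. The endpoint path and header names follow Tradier's public docs as quoted in the comment, but verify them against the current documentation before relying on this; no HTTP call is made.

```python
from urllib.parse import urlencode

TRADIER_BASE = "https://api.tradier.com/v1"


def chain_request(symbol, expiration, token):
    """Build the URL and headers for Tradier's option-chains endpoint.

    Returns (url, headers), ready to hand to any HTTP client.
    """
    params = urlencode({"symbol": symbol, "expiration": expiration})
    url = f"{TRADIER_BASE}/markets/options/chains?{params}"
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",  # ask for JSON rather than XML
    }
    return url, headers
```

In Apps Script the equivalent would be a `UrlFetchApp.fetch(url, {headers: ...})` call with the same URL and headers.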

>They are losing $2B PER MONTH and their losses are increasing each month

Cash burn is not concerning when you consider how fast they are growing and the value of the technologies they are building. They are raising hundreds of billions of dollars; they can afford to burn cash.

>while their models are starting to get beaten by both anthropic and google, with zero moat.

GPT 5.4 is still by far the best model on the market for most use cases; it's not even close. The big thing is that Google and Anthropic are heavily focused on overfitting to benchmarks (especially Google), whereas OpenAI focuses on real-world performance. OpenAI has a strong moat both from brand recognition and from having superior technology.

>They also saw a 5% drop in users this month.

Simply not true. The only evidence I can find justifying this claim is a website claiming 1.5 million people signed a petition saying they will leave ChatGPT. But this is only 0.2% of users, and that petition does not verify:
- How many of those people actually used ChatGPT, or were paying customers
- Whether they actually followed through with their pledge (people rarely follow through with boycott threats)
- If the signatures are duplicates/bots.

OpenAI's own numbers suggest users are growing 10% monthly, not declining. Lastly, IMO consumer usage isn't the main opportunity for OpenAI long term. It's their agentic products and API.

Mentions:#API

For real-time options data into Google Sheets, the cleanest free method is the **Tradier API**. Free tier gives you delayed data, paid tier is real-time. You call the `/options/chains` endpoint, parse the JSON, and pull it into Sheets via `IMPORTDATA` or Apps Script. For 15-minute delayed data, **Barchart's ondemand API** has a free tier that works for small volumes.

Mentions:#API

It uses the GDELT API (https://www.gdeltproject.org/) and runs 2 parallel themed queries against GDELT on each context refresh, focused on events most relevant to financial markets. Things like trade disputes, energy policy, regional conflicts, and central bank actions.

Mentions:#API
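For readers curious what such a themed GDELT query looks like, here is a sketch against the GDELT DOC 2.0 article-search API. The parameter names follow GDELT's public docs, and the two query strings are purely illustrative, not the ones this tool actually runs.

```python
from urllib.parse import urlencode

GDELT_DOC = "https://api.gdeltproject.org/api/v2/doc/doc"


def themed_query_url(query, max_records=25):
    """Build a GDELT DOC 2.0 article-list query URL (no network call here)."""
    return GDELT_DOC + "?" + urlencode({
        "query": query,
        "mode": "artlist",        # return a list of matching articles
        "maxrecords": max_records,
        "format": "json",
    })


# Two parallel themed queries, e.g. trade policy and central banks:
urls = [
    themed_query_url('"trade dispute" OR tariffs'),
    themed_query_url('"central bank" OR "interest rate decision"'),
]
```

Each URL can then be fetched concurrently on every context refresh and the article lists merged.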


You can do this with a small Python script + Google Sheets API. A simple approach is to use **Yahoo Finance options data (free, ~15 min delayed)** and push the results into Sheets. Libraries like yfinance let you pull the full option chain, including bid/ask and IV.

Mentions:#API
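A minimal sketch of that yfinance approach: the pure filtering step is shown as a testable function, while the yfinance calls themselves (which hit the network) are left as comments. The `spot=500.0` value is a placeholder, and writing to Sheets is out of scope here.

```python
import pandas as pd
# pip install yfinance  (data is free but roughly 15 min delayed)
# import yfinance as yf


def near_the_money(chain: pd.DataFrame, spot: float, pct: float = 0.05) -> pd.DataFrame:
    """Keep option rows whose strike is within +/- pct of spot."""
    lo, hi = spot * (1 - pct), spot * (1 + pct)
    return chain[(chain["strike"] >= lo) & (chain["strike"] <= hi)]


# Typical yfinance usage (network calls, so commented out):
# t = yf.Ticker("SPY")
# expiry = t.options[0]                       # nearest expiration date
# calls = t.option_chain(expiry).calls        # bid, ask, impliedVolatility, ...
# print(near_the_money(calls, spot=500.0))    # spot is a placeholder here
```

The filtered frame can then be pushed to a sheet with whatever client you prefer (gspread, the Sheets REST API, or a CSV import).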

Hey OP - this is pretty cool. Wondering if it's compatible with the IBKR API?

Mentions:#IBKR#API

Can't argue with that- we don't offer API access at this time. Will have to resolve concerns about exposing our IP

Mentions:#API#IP

This is very helpful; many thanks! I am waiting on a reply from volsignals about API access. ODs data subscriptions might be the most effective long term use of my money for my purposes. I can easily see how i can use their feed to manage my trades based on my current strategy automation.

Mentions:#API

Nah, it's lobbying I'm sure. He seems to be building up the infrastructure for age verification in the US that requires an API; I'm sure he will charge people for it.

Mentions:#API

Today everyone is confused about the AI bubble ("will it pop," "when will it pop") because the market is correcting itself right now. Four years ago, back when everything was new and ChatGPT 3 had just been released, new startups were opening every single day that were essentially just a ChatGPT API key with some marketing around a niche problem ChatGPT could already handle on its own. (That also holds true today; I am only talking about small startups. There were also some unicorns among them that genuinely solved real problems.) The problem now is that as the industry grows, with more and more hype every day, the competition grows too, it gets harder to compete, and even the better tools are being replaced. For example, Blackbox AI was once best for coding, but now people use Claude Code; even ChatGPT has been replaced by Claude for coding. The question is not whether there is a bubble or when it will pop; the main thing we all need to focus on is what is going to change once everything settles down. What new opportunities will it create? There is a storm right now where even people with no background in computers or software are creating startups, and the funny thing is that they still manage to get investors, because investors want to win; they care about winning. It's not that the investors are stupid; the technology is so new that it is hard to pick a winner, so anyone with a decent pitch gets capital, even if the startup shuts down after 6 months and the investors lose all their money. The investor mindset is: fund the crowd, and if even 1 or 2 become winners, it's a gold mine.

Mentions:#API

Wish Apple would dump that shitty Yahoo Finance stocks API

Mentions:#API

Custom software I built. API to Tradier

Mentions:#API

The problem is, like the original internet boom it is full of garbage that is going to fail. Basically anyone that has built a business around calling the ChatGPT API to do something that isn't that valuable and doesn't pay for the tokens it uses is liable to fail.

Mentions:#API

They have revenue, use cases, customers, and are getting into the military. Now, OpenAI might be overleveraged, but Anthropic is already turning a profit on API calls and is projected to become profitable in a couple of years. Way different situation than pets.com getting a $300 million valuation by selling litter online at a loss.

Mentions:#API

US to release emergency reserves, starting with almost 200M barrels immediately; more to follow. Pipeline flows out of the Bakken and Permian basins picked up overnight, according to flow data on the backend API.

Mentions:#API

The worst part is it's like 19 API gravity; it's not even that aromatic/heavy. So you don't get volume swell and good diesel yield; it just cracks to light ends in your coker. We valued it at ANS-20.

Mentions:#API

I want to make sure this works and catches edge cases before creating API automation. Not gonna waste my time if it sucks

Mentions:#API

I reverse-engineered the Robinhood API last year and have been collecting options and futures data for many months now. I have something like 80,000 "snapshots" so far with full Greeks data, volume, bid, ask, etc. The idea is to use LLM power to analyze them and find "true market edge". So far, out of 180,000 strategies tested, not a single one actually showed it. Many show positive P&L but fail in reality. The market CANNOT be predicted; never fall for any bot trading or anything like that, it all fails the "true market edge" test. And this is why I just follow the trend and don't try to predict anything anymore.
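One way to make an "edge test" like the one described concrete is a bootstrap check: resample the per-trade returns and see how often a resample's mean does no better than zero. This is a minimal sketch under my own assumptions (the commenter's actual test isn't shown); the sample data is made up.

```python
import random
import statistics

def edge_test(trade_returns, n_boot=2000, seed=42):
    """Bootstrap the mean trade return and report the fraction of
    resamples whose mean is <= 0. A strategy with real edge should
    produce a very small fraction; a coin-flip strategy hovers ~0.5."""
    rng = random.Random(seed)
    n = len(trade_returns)
    worse = 0
    for _ in range(n_boot):
        sample = [rng.choice(trade_returns) for _ in range(n)]
        if statistics.mean(sample) <= 0:
            worse += 1
    return worse / n_boot

# A noisy strategy: some winners, some losers, no demonstrable edge.
noisy = [0.01, -0.012, 0.008, -0.009, 0.011, -0.01, 0.009, -0.008]
print(edge_test(noisy))  # large fraction -> indistinguishable from luck
```

A strategy that "shows positive P&L but fails in reality" typically looks fine on raw totals yet produces a large fraction here, which is the point of the test.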

Mentions:#API

Not running it locally; it's deployed on Vercel with PostgreSQL on Azure, so the scanner just runs as an API call. The pipeline also cuts aggressively before the time-consuming calls. It starts with [N] stocks, filters down to 18 finalists, then runs the heavy enrichment only on those. Never blasting all [N] through the full thing.
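The staged-filter idea can be sketched like this; every name and filter below is a hypothetical stand-in, since the comment doesn't show the real scanner's criteria:

```python
def run_scanner(tickers, cheap_filter, enrich, top_n=18):
    """Staged pipeline: apply the cheap pre-screen to every ticker
    first, then run the expensive enrichment only on the survivors,
    capped at top_n finalists."""
    finalists = [t for t in tickers if cheap_filter(t)][:top_n]
    return {t: enrich(t) for t in finalists}

# Hypothetical stand-ins for illustration only.
universe = [f"TICK{i}" for i in range(500)]
cheap = lambda t: int(t[4:]) % 25 == 0   # fast pre-screen
heavy = lambda t: {"score": len(t)}      # stands in for slow API calls
results = run_scanner(universe, cheap, heavy)
print(len(results))  # at most 18 tickers reach the heavy stage
```

The cost saving comes entirely from ordering: the expensive `enrich` runs at most `top_n` times regardless of the universe size.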

Mentions:#API

For market data: Finnhub Premium (news, analyst ratings, insider activity, earnings history, peer groups), TastyTrade API (live options chains, Greeks, IV rank, liquidity ratings — free if you have a TastyTrade account), FRED for macro data (free, Federal Reserve), SEC EDGAR for filings and business descriptions (free), and xAI/Grok for social sentiment.

Mentions:#API

Disagree; they pull in news from all sources. There's no chance you'll miss a single piece of news on a stock, and if it gets cluttered you can filter by news type to find what you need. Honestly, most brokerages don't even have a news page as good as IBKR's; I doubt you'll be able to do it better. It comes with a ton of API integrations and algorithmic filtering. Portfolio tracking works fine; there are some quirks with it, like deposits counting as profit, but every broker app I've used does that. You absolutely want a professional broker UI to be complex; that's the whole point. If you want something simple, go to Robinhood. And their web interface really isn't that complex in the first place.

Mentions:#IBKR#API

I'll just leave this here: The American Petroleum Institute (API) releases its Weekly Statistical Bulletin, which includes data on U.S. crude oil inventories, every Tuesday afternoon. This report provides insights into the weekly changes in crude oil supply and can influence market prices.

Mentions:#API

Datavault AI Inc. (NASDAQ: DVLT) is a small-cap tech company (market cap around $395–$430 million as of early March 2026).
* **Institutional buying surge:** Major increases reported March 3, 2026, e.g. Vanguard (\~2,900% increase to 11.8M shares), State Street (\~2,800% to 10M), BlackRock (\~3,000% to 4.1M). This fueled optimism.
* **Strategic investments/acquisitions:** $150M from Scilex Holding, API Media acquisition.
* **Aggressive guidance & token plays:** 2025 revenue guidance updated to $38–$40M (up \~30%), 2026 target $200M+ (even $2–3B long-term potential via nationwide nodes). Token/coin distributions (e.g. Josh Gibson Coin dividend 1:1 planned for April 2026, meme coins like Dream Bowl Draft) created buzz and short-term pumps.
* **Weak fundamentals:** Ongoing losses, negative EPS, modest current revenue vs. lofty targets. The market is skeptical of execution on the $200M+ 2026 guidance.
* **Broader small-cap headwinds:** Rate sensitivity, macro rotation away from speculative tech/micro-caps, and general caution in the AI/Web3 space amid hype fatigue.
A classic pump-and-dump type of company that goes up and down on AI/Web3 news; waiting for a better earnings report.

Mentions:#DVLT#API

Then how do you bring it all back together? And is that in Claude Code, via the API, or a regular project? Do you just ask it to run subagents and it does, or is there some way to induce it that I don't know about? Does it work in normal chat mode on a subscription, or do I need a pay-per-token API setup for that?

Mentions:#API
r/stocksSee Comment

I mean, I agree that the threat side is scaling rapidly and AI is changing a lot. I'm not sure that's a reason to be apprehensive about the sector, though. Vulnerabilities and exploits are already outpacing manual patching, which is exactly why enterprises are spending more on consolidated security platforms with AI detection. Bots currently account for over half of web traffic, which is a tailwind for firms focused on identity, bot management, and API security imo. I am thinking the winners will be the cybersecurity companies with customer data, distribution, and the unit economics to turn their offerings into durable high-margin revenue (ex: CRWD, NET).

Mentions:#API#CRWD#NET

They are the API keys to my heart.

Mentions:#API

then you could plug one in and make api calls. try it. calls on API

Mentions:#API

backtesting is on the roadmap. TastyTrade has a full backtesting API with 10+ years of real historical options data. The problem is their OAuth tokens work on all their standard endpoints but get rejected by the backtester endpoint. it uses a different auth token. I’ve reached out to their API partner team to get proper access. Once that’s sorted, the plan is to wire it directly into the trade card

Mentions:#API

In my experience, LLMs struggle with the precision required for option pricing, so relying on them for raw calculations usually leads to hallucinations. I've tried using Bloomberg Terminal, ThinkOrSwim, and Interactive Brokers' API for this, but they lack the integrated reasoning layer you're looking for. I've been using https://trade-matrix.com to handle the heavy lifting. It pulls data from multiple sources, cleans it, scores it, and displays a stock score with confidence to build conviction. Not sure if it handles complex multi-leg hedging strategies perfectly, but it might save you from building a custom pipeline.

Mentions:#API

What you’re describing is definitely a tricky space most general purpose LLMs don’t have live market access out of the box, so combining reasoning with current prices usually requires a bridge to a market data API. In practice, the approaches that work best are either LLMs augmented with real time feeds through something like an Azure or AWS pipeline, or tools built into broker platforms that expose analytics and option chains for programmatic access. In my work at Lifewood Data Technology, the key is structuring the data so the model can reason over positions and prices together; without that, strategy level insights tend to be high level and not portfolio specific.

Mentions:#API

I disagree. First, I don't plan on publishing it; second, I really did learn a lot. For just a $200 subscription I built a product for myself that fits my specific needs. The thing is, it works, and it's tailored exactly to what I need. In fact, this is the second app I've written that we actively use; both were made with the disclaimer that they're not meant for long-term support. But when you need a quick solution or a stopgap, not having to worry about issuing POs or anything else makes it a lot easier. I still think AI is overhyped and doesn't justify the real costs, because if I had to pay the unsubsidized rate I would not use it at all; it probably would have been close to $2,000–$3,000 worth of API calls. However, if you set everything up correctly, you can literally have agents do almost all the work for you in parallel, and you're just capped by waiting between sessions once you're past your usage limit.

Mentions:#API

If your broker has an API, it would be simple to vibe-code an app that has your positions and whatever market data you want. I use Schwab and the yfinance API to get Yahoo Finance data. Hook up an Anthropic API key and you'd be able to ask whatever questions you want about your positions and the market. The most basic models should handle the questions you mentioned, so it probably wouldn't cost much per response.
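A minimal sketch of the glue layer, assuming you already have positions and quotes from your broker or yfinance. The Anthropic call is shown only as a commented-out outline, and the model name there is a placeholder, not a real identifier:

```python
def build_prompt(positions, quotes, question):
    """Assemble portfolio context for an LLM. `positions` maps ticker
    to share count; `quotes` maps ticker to last price (e.g. pulled
    from a broker API or yfinance)."""
    lines = [f"{t}: {qty} shares @ ${quotes[t]:.2f}"
             for t, qty in positions.items()]
    return "My positions:\n" + "\n".join(lines) + f"\n\nQuestion: {question}"

prompt = build_prompt({"AAPL": 10, "MSFT": 5},
                      {"AAPL": 230.10, "MSFT": 415.55},
                      "Which position has the most concentration risk?")

# The prompt would then go to a cheap model, roughly (untested sketch):
# client = anthropic.Anthropic()
# reply = client.messages.create(model="<cheap-model-id>", max_tokens=300,
#                                messages=[{"role": "user", "content": prompt}])
print(prompt.splitlines()[1])  # -> AAPL: 10 shares @ $230.10
```

Keeping the prompt assembly separate from the API call makes the cheap part testable without spending any tokens.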

Mentions:#API

The 2.01 + 5.01 + 5.02 combo is exactly what flagged the Ventyx/Eli Lilly filing I mentioned - saw that combination and knew immediately it was a completed deal, not just an announcement. Good confirmation that it's not just my read on it. The 8.01 point is interesting, I've been filtering those out as noise but you're right that some companies use it for things that don't fit the standard items. Will start paying more attention to context there. And thanks for the EDGAR full-text search tip - I've been polling the atom feed which works but has a slight lag. Will look into the API approach.

Mentions:#API

Item 2.01 combined with 5.01 and 5.02 in the same filing is a really reliable signal. That combo almost always means the deal closed, not just announced. Also worth watching 8.01 for unclassified material events, some companies use it for things that don't fit cleanly elsewhere but still move the stock. For anyone building a system to track these, the EDGAR full-text search API lets you filter by form type and date in real time without scraping the HTML. Much faster than polling the main page.
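A sketch of building such a filtered query. The `efts.sec.gov` path is my understanding of the JSON endpoint behind EDGAR full-text search, and the parameter names are assumptions; verify both against SEC's documentation before depending on them:

```python
from urllib.parse import urlencode

# Assumed EDGAR full-text search backend; confirm against SEC docs.
BASE = "https://efts.sec.gov/LATEST/search-index"

def edgar_fts_url(phrase, form="8-K", start=None, end=None):
    """Build a full-text search URL filtered by form type and,
    optionally, a custom date range (YYYY-MM-DD strings)."""
    params = {"q": f'"{phrase}"', "forms": form}
    if start and end:
        params.update({"dateRange": "custom", "startdt": start, "enddt": end})
    return BASE + "?" + urlencode(params)

url = edgar_fts_url("completion of acquisition",
                    start="2026-03-01", end="2026-03-07")
print(url)
```

The JSON response is far easier to poll than scraping the filing-index HTML, which is the speed advantage the comment describes.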

Mentions:#API

If you do any programming I built a real time PR news API. It delivers stock news to your app in less than 1 second. 1-week free to try and verify speeds. [https://rtpr.io](https://rtpr.io/)

Mentions:#PR#API

Again, you're missing the point. Claude can just build the tools Palantir provides; a future Claude will one-shot it. The issue with their commercial growth is that it hasn't changed their revenue split between government and commercial; they've been stuck at a 50/50 split a bit too long. Sure, total revenue has seen significant growth, but that's also the nature of a contract-based business: when the product hits the market, you get significant growth until you've saturated it or competition picks up. They also face competition from Microsoft Fabric, Oracle-based solutions, and fully custom solutions built in-house or by consultancies. Since DBMS work is also mostly custom, or specific to whatever other vendor you use, companies are much more likely to pick something from a company already established in their pipeline, like Microsoft. >That junk (Claude, Gemini, copilot, grok, GPT etc) works wonders for simple searching but it’s just that. Well, actually it's really good for coding, and the direction it's heading seems to be agents with MCP and access to a plethora of open-source tools, which somewhat bypasses the need for a dedicated DBMS; you just need API access to your different databases.

Mentions:#API

I think he's asking for AI API integration.

Mentions:#API

We piloted Copilot last year with some training and a demo repo updated to use the tools. I did this entirely in VS Code with basic extensions, which was nice. It took me a few months to adopt Copilot into my workflows for troubleshooting and scripting. I built a 'workspace' with custom/generic instructions, a project-manifest-type doc to start with or update from chat, and a VS Code task that takes a project name and creates a 'workspace'. I've used this to accelerate many things:

1. Pull in PDF docs on a legacy API, process them to markdown, and provide call/URL info for devs
2. Convert a series of help docs from that legacy system into a consistent, Copilot/machine-readable wiki
3. Identify then remediate common errors, including staging a markdown wiki that is easily exported to our team wiki
4. Create readily deployable tasks/scripts without needing to stage all-new instructions/repo/etc.
5. Identify and complete a workflow for updating an old but small internal app past 6+ years of CVEs
6. Overhaul and remaster a myriad of AD scripts and tools into cohesive, better-documented module(s)
7. Manage years of bookmarks and prep an easy-to-import package for a new hire with team links

The actual devs here are doing a lot with Copilot tools like agents and squads or profiles. For example, one dev can do the following without leaving the VS Studio CLI, and in some cases it's one command/chat:

> Commit > story update > create PR linked to story > prep deployment and change needs, e.g. YAML pipeline + approval requests

LLMs and gen AI are great tools, definitely more useful than blockchain was when it hit the market. But the hype is similar, and the tech bros don't seem to get that unless they are selling a package around an LLM that solves for any lack of mature business process, dev guidelines, etc.

-----

I'd compare it to setting a kid loose in your kitchen to bake cookies. All the ingredients are there, but they need a recipe and oversight. If the recipe says 'Add 1 cup butter', is that salted or unsalted? Straight from the freezer, room temp, or melted? Do they start mixing in a large bowl, or put the liquids in a small bowl to add to the dry ingredients in a large bowl later? Everything you know, you need to translate into the recipe/instructions. Maybe the kid asks great questions; maybe it just moves forward on whatever assumptions it can make. Check the oven temp. Make sure they set a timer. You'll spend a lot less time personally to get a cookie, but your ~~code~~ er, recipe reviews need to be robust. Plus, when your other kid (LLM model) goes to make a batch, you need to repeat your recipe update and oversight checks.

The good news is a mature process (recipe) can withstand current model effectiveness. But you're also writing a detailed manual on how to replace yourself and handing it to a rapidly developing technology with minimal guardrails. If your org is decent about that, you can offload a ton of work. If their goal is to directly replace you, tada, you wrote the book yourself.

Mentions:#VS#API#PR

As someone who is working with TTDs API, I would short it. It's so much worse than DV360 or even Xandr. Also, to me, the UI seems super difficult to work with

Mentions:#API#DV

Pulling directly from Schwabs API. If I decide to turn this into a saas I will migrate to databento.

Mentions:#API

People keep complaining about annualized rates, yet every month OpenAI earns more revenue than the last. OpenAI's core business is not cyclical like restaurants are. They charge subscriptions, and API usage is constant.

Mentions:#API

They wouldn't need much compute power at all if they were just calling the API, you could do that on an old laptop. The people buying 4 mac minis are doing it to run local models.

Mentions:#API

I was under the impression most people buying these Mac minis have been doing so to run openclaw while still leveraging OpenAI/Anthropic subscriptions or API keys, not to run local models.

Mentions:#API

# Product description: DearmasTrader
* **What it does:** An automated algorithmic trading platform that uses massive data processing to execute statistically validated strategies. The system connects to exchanges via API and operates 100% autonomously.
* **Target audience:** Traders, cryptocurrency investors, and developers looking for real returns without depending on manual signals or rigid commercial bots.
* **Key benefits:**
  * **30-day free trial:** Full access to validate the market with no initial risk.
  * **Guaranteed discipline:** Removes the human emotional factor from execution.
  * **High-powered infrastructure:** Runs on its own rig ("the beast") and servers in Canada for superior data processing.
* **Use cases:** Investors who want to automate their capital (like the real case of $1,000 currently trading) and users looking to optimize their strategies through data combinatorics.
* **Unique selling points:**
  * **Combinatorics:** Unlike other bots, DearmasTrader tests thousands of parameter combinations in milliseconds over 1-minute candles to find the statistically most robust configuration.
  * **Tech stack:** Built with **C#/.NET** on the backend and **React** on the frontend, guaranteeing robustness and speed.
* **Alternatives/Competitors:** Differs from **3Commas** in its greater flexibility and customization, and from **ProfitTrailer** in being far more approachable to configure and use daily.

Mentions:#API#NET

The OP asked why the price is objectively underpriced. What you're saying is likely what many believe and how they look at AMD, yet it's a fundamental misunderstanding of AMD. First, if you want to characterize AMD's AI initiative as copying Nvidia, you're only focused on the razor-thin veneer: there is at least a $1T TAM to be addressed, and Nvidia will naturally attract competition into that space. But in no way is AMD just copying Nvidia's efforts. Don't even try to call ROCm a copy of CUDA; beyond the public API, nothing about it is a copy. The hardware is architecturally extremely different, and in fact more advanced and capable. We continue to see model performance excel with optimizations on MI300X GPUs, outperforming B200 chips. What AMD has been doing is taking a far more deliberate approach of working completely open source and industry-friendly. The end game is to have options that work broadly across different hardware system topologies and vendors and meet a much broader array of solution needs. This expanded scope took longer to bring to market initially, while Nvidia found one shortcut after another to nudge its overall architectural design concept (monolithic-based) forward and capitalize on a short-term first-to-market monopoly advantage. But that advantage is running out of time. AMD is on the precipice of providing full rack-scale systems via Helios that will quickly grab significant market share from Nvidia, well before Nvidia can secure enough of a foothold to ensure lasting dominance the way Intel once had. I believe AMD should match Nvidia's DC market share well before 2030; 2028 with the MI500 may be where they land even before AMD pulls ahead. Why will AMD pull ahead, you ask? AI is not just a GPU game; it's a full heterogeneous architecture. Even Jensen is saying this as he tries to convince you their ARM-based CPU chips are going to carry them.
But those chips are trash compared to EPYCs, chips that Intel cannot touch, yet Jensen wants you to believe they can tweak off-the-shelf designs from ARM enough to handle the deterministic needs of agentic MoE-type workloads better than the monster CPUs AMD keeps improving. It's really admirable how well he sells that line, but it will only buy him so much time at the top. AMD is a company that keeps its head down and works the plan, and it's not a new, borrowed, or rushed plan. This is the heterogeneous roadmap Mark Papermaster was talking about over 10 years ago, slowly and significantly made real, step by step, win by win.

No the price is mainly limited by competition. The investment required to run and manage Kimi 2.5-like models (which are not as good as SOTA closed models) locally at scale makes LLM API services worth it even if they’re several times more expensive. Profit margins on API-based inference are already very good (~70% for some labs) and are expected to become even better. Labs have the freedom to either increase or decrease cost depending on competition.

Mentions:#API

There are plenty of good reasons to take advantage of that feature. People can be really petty in internet arguments and scour your history to try to get personal or find sensitive topics to throw barbs at you for having a difference of opinion or demonstrating expertise/knowledge that makes them feel insecure. On previous accounts, I used to participate in subs related to my locale and had someone use that and comments I made about my career elsewhere to get uncomfortably close to doxxing me. After that I made a script to use the API to scrub my history once a week or so, but that of course removes my contributions to conversations that others might find helpful if they come along later - especially relevant in tech subs. Hiding the public browsing of my history was a welcome solution, you don't have to be a fraud to appreciate better privacy protection from weirdos.
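A sketch of the weekly scrub script described above, assuming PRAW. The age check is factored out so it can be tested without touching Reddit, and the 7-day cutoff is arbitrary; the PRAW loop itself is shown commented out since it needs configured credentials:

```python
import time

def older_than(created_utc, days, now=None):
    """True if an item's creation timestamp is more than `days` old."""
    now = time.time() if now is None else now
    return (now - created_utc) > days * 86400

# With PRAW (untested sketch; assumes a configured `reddit` instance):
# for c in reddit.user.me().comments.new(limit=None):
#     if older_than(c.created_utc, days=7):
#         c.delete()

print(older_than(0, days=7, now=10 * 86400))  # -> True
```

Running something like this on a schedule (cron, etc.) gives the once-a-week scrub the comment mentions.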

Mentions:#API

The model shifts from per-seat human licenses to per-call API pricing.

Mentions:#API

I have actually built an app that connects to different AI models to pull data regarding market sentiment, price certainty, take profit, and everything. Works very well because I use the direct API address and can generate very long requests with it.

Mentions:#API

Yup. Another thing they don't say is that most of that distillation happened through companies reselling the Anthropic API and subsidizing the resale in order to train on those sessions. Same thing Copilot and others do; perfectly legal.

Mentions:#API

Can you give me an example of what "grunt API enterprise tasks" you would do with it?

Mentions:#API

100 percent different use case, though. Claude Opus is 100 percent unnecessary for most agentic grunt API enterprise tasks.

Mentions:#API

Curious about your local set up - how much slower is it than API calls, for example, on smaller chat questions or coding exercises with lower token usage? Apologies if this is vague or not specific, just trying to gauge how slow / fast it is vs an LLM provider API

Mentions:#API

What API are you using for this? Any chance youd share the code?

Mentions:#API

The underrated impact is local Llama serving. Qwen 3.5 Coder Next replaced $200 a night of API-call spend for me. For enrichment and other non-instant results, these models, when distilled, can provide amazing results for the one-time purchase of fairly inexpensive equipment at any decent-size enterprise.
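The economics here reduce to a simple break-even calculation; all figures below are hypothetical except the $200/night from the comment:

```python
def breakeven_nights(hardware_cost, nightly_api_spend, power_per_night=0.0):
    """Nights until a one-time hardware purchase pays for itself
    versus recurring API spend, net of the rig's own running cost."""
    saved_per_night = nightly_api_spend - power_per_night
    return hardware_cost / saved_per_night

# e.g. a hypothetical $6,000 workstation vs $200/night in API calls:
print(breakeven_nights(6000, 200))  # -> 30.0
```

At those (assumed) numbers the hardware pays for itself in a month, which is why the one-time-purchase argument holds for steady batch workloads.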

Mentions:#API

Who’s gonna vibe develop an entire system, frontend, backend, API, federation, storage, security, product design, graphic design, then QA it, debug it, and deploy it just to learn a language?!?

Mentions:#API

US strikes on Iran navy escalate oil flow fears, tanking risk assets overnight. Headlines overpromise though - Strait chokepoints rarely fully close without tanker reroutes balancing supply. Pre-trade check: Scan Brent spreads and API data before oil longs - filters real shocks from gap-fill reversals. No navy dustup skips recession demand crush.

Mentions:#API

Thanks for sharing your data sources! QuickFS went down recently. It was my go to source for an Excel API.

Mentions:#API

Thanks. Sucks it doesn’t have an API but it is a lot cheaper than FMP

Mentions:#API

The software is licensed on a per-user basis, so the market is anticipating a decline in revenue. That's a short-sighted take by the market, as tech companies can easily shift licensing away from per-user pricing to per-API-call, per-agent, etc.

Mentions:#API

1) Tech job market: first, the tech job market has been through many cycles before; this one can be attributed to AI, but it also may not be. There was massive over-hiring during COVID, so it would be natural for all of that to unwind now, making it temporarily hard to find a job. 2) I do believe very entry-level roles are being disrupted by AI. Both can be true. 3) I didn't backtrack anything; the 10x loss is public, and you can compare it yourself. Check the limits of the paid plan, then look at the API cost-equivalent (which likely also carries a loss for them), and you'll immediately see it doesn't make sense, and that the factor is about 10x (you'd pay roughly 10x more for those tokens if you used the API instead). There are many analyses of this, too.
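The ~10x figure described above is just a ratio you can compute once you estimate token volume. A minimal sketch; every number below is a hypothetical illustration, not the commenter's data:

```python
def subsidy_factor(tokens_used, api_price_per_mtok, subscription_price):
    """Ratio of the API-equivalent cost of the tokens consumed to
    what the subscriber actually pays. A factor of ~10 means those
    tokens would cost ~10x bought through the API."""
    api_cost = tokens_used / 1_000_000 * api_price_per_mtok
    return api_cost / subscription_price

# e.g. 100M tokens/month at a $15-per-1M-token API rate on a $150 plan:
print(subsidy_factor(100_000_000, 15, 150))  # -> 10.0
```

The comparison only bounds the subsidy, of course, since the provider's marginal cost per token is lower than its API price.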

Mentions:#API

I will leave this here on the API appetite in a GPU-starved country: [https://www.bbc.com/future/article/20251223-why-indian-cinema-is-awash-with-ai](https://www.bbc.com/future/article/20251223-why-indian-cinema-is-awash-with-ai) Do not underestimate Block's announcement; that management seems measured and did not seem worried about margins etc., although they are a public firm. They seem to have realized it is a point of no return for AI. This will trigger other boards to get their CEOs to act, and I would claim in the near term, not the long term (US boards and Wall Street do not have patience). Personal anecdote: two weeks ago I was sitting in a Big 4 consulting firm's presentation laying out the need for a 90-developer team (for a board-level mandate at a very large bank). I could not believe they were serious. \> Overall I think rise in productivity will more than cancel out AI job loss for the foreseeable future. However, in the long term I agree, that we will have a societal problem, I just can't tell when This is my thesis as well, but a lot more pessimistic given Wall Street and corporate expectations. BTW, I did fall for the self-driving hype. I am following what will happen in Australia (not the US, obviously) as they do not have domestic firms to protect and have all the incentives for imports to come there with advanced tech.

Mentions:#API

I just started writing my own screener with ChatGPT's help and the Alpaca API. It was supposed to send me alerts in my Discord chat, but nothing so far. I've been writing it for the past 3 days and got it running before premarket today, but no alerts yet, so it might need tweaking. I'm also at work currently, so I don't know what my laptop is doing.

Mentions:#API