
API

Agora Inc


Mentions (24Hr)

2

0.00% Today

Reddit Posts

r/wallstreetbets

Chat with Earnings Call?

r/investing

Download dataset of stock prices X tickers for yesterday?

r/investing

Sea Change: Value Investing

r/Wallstreetbetsnew

Tech market brings important development opportunities, AIGC is firmly top 1 in the current technology field

r/pennystocks

Tech market brings important development opportunities, AIGC is firmly top 1 in the current technology field

r/WallStreetbetsELITE

AIGC market brings important development opportunities, artificial intelligence technology has been developing

r/pennystocks

Avricore Health - AVCR.V making waves in Pharmacy Point of Care Testing! CEO interview this evening as well.

r/wallstreetbets

Sea Change: Value Investing

r/investing

API KEY and robinhood dividends

r/pennystocks

OTC : KWIK Shareholder Letter January 3, 2024

r/options

SPX 0DTE Strategy Built

r/Wallstreetbetsnew

The commercialization of multimodal models is emerging, Gemini now appears to exceed ChatGPT

r/pennystocks

The commercialization of multimodal models is emerging, Gemini now appears to exceed ChatGPT

r/options

Best API platform for End of day option pricing

r/WallStreetbetsELITE

Why Microsoft's gross margins are going brrr (up 1.89% QoQ).

r/wallstreetbets

Why Microsoft's gross margins are expanding (up 1.89% QoQ).

r/StockMarket

Why Microsoft's gross margins are expanding (up 1.89% QoQ).

r/stocks

Why Microsoft's margins are expanding.

r/options

Interactive brokers or Schwab

r/wallstreetbets

Reddit IPO

r/wallstreetbets

Google's AI project "Gemini" shipped, and so far it looks better than GPT4

r/stocks

US Broker Recommendation with a market that allows both longs/shorts

r/investing

API provider for premarket data

r/Wallstreetbetsnew

A Little DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform

r/investing

Best API for grabbing historical financial statement data to compare across companies.

r/StockMarket

Seeking Free Advance/Decline, NH/NL Data - Python API?

r/pennystocks

A Little DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform

r/wallstreetbetsOGs

A Little DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform

r/WallStreetbetsELITE

A Little DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform

r/Shortsqueeze

A Little DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform

r/smallstreetbets

A Little DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform

r/RobinHoodPennyStocks

A Little DD on FobiAI, harnesses the power of AI and data intelligence, enabling businesses to digitally transform

r/stocks

Delving Deeper into Benzinga Pro: Does the Subscription Include Full API Access?

r/investing

Past and future list of investor (analyst) dates?

r/pennystocks

Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration

r/Wallstreetbetsnew

Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration

r/RobinHoodPennyStocks

Qples by Fobi Announces 77% Sales Growth YoY with Increased Momentum From Media Solutions, AI (8112) Coupons, & New API Integration

r/pennystocks

Aduro Clean Technologies Inc. Research Update

r/WallStreetbetsELITE

Aduro Clean Technologies Inc. Research Update

r/options

Option Chain REST APIs w/ Greeks and Beta Weighting

r/investing

As an asset manager, why wouldn’t you use Verity?

r/wallstreetbets

Nasdaq $ZG (Zillow) EPS not accurate?

r/pennystocks

$VERS Upcoming Webinar: Introduction and Demonstration of Genius

r/StockMarket

Comps and Precedents: API Help

r/StockMarket

UsDebtClock.org is a fake website

r/wallstreetbets

Are there pre-built bull/bear systems for 5-10m period QQQ / SPY day trades?

r/Shortsqueeze

Short Squeeze is Reopened. Play Nice.

r/stocks

Your favourite place for stock data

r/options

Created options trading bot with Interactive Brokers API

r/investing

What is driving oil prices down this week?

r/weedstocks

Leafly Announces New API for Order Integration ($LFLY)

r/stocks

Data mapping tickers to sector / industry?

r/Wallstreetbetsnew

Support In View For USOIL!

r/wallstreetbets

Is Unity going to Zero? - Why they just killed their business model.

r/options

Need Help Deciding About Limex API Trading Contest

r/investing

Looking for affordable API to fetch specific historical stock market data

r/options

Paper trading with API?

r/options

Where do sites like Unusual Whales scrape their data from?

r/stocks

Twilio Q2 2023: A Mixed Bag with Strong Revenue Growth Amid Stock Price Challenges

r/StockMarket

Reference for S&P500 Companies by Year?

r/SPACs

[DIY Filing Alerts] Part 3 of 3: Building the Script and Automating Your Alerts

r/stocks

Know The Company - Okta

r/SPACs

[DIY Filing Alerts] Part 2: Emailing Today's Filings

r/wallstreetbetsOGs

This prized $PGY doesn't need lipstick (an amalgamation of the DD's)

r/SPACs

[DIY Filing Alerts] Part 1: Working with the SEC API

r/options

API or Dataset that shows intraday price movement for Options Bid/Ask

r/wallstreetbets

[Newbie] Bought Microsoft shares at 250 mainly as I see value in ChatGPT. I think I'll hold for at least +6 months but I'd like your thoughts.

r/stocks

Crude Oil Soars Near YTD Highs On Largest Single-Week Crude Inventory Crash In Years

r/stocks

Anyone else bullish about $GOOGL Web Integrity API?

r/investing

I found this trading tool that's just scraping all of our comments and running them through ChatGPT to get our sentiment on different stocks. Isn't this a violation of Reddit's new API rules?

r/options

where to fetch crypto option data

r/wallstreetbets

I’m Building a Free Fundamental Stock Data API You Can Use for Projects and Analysis

r/stocks

Fundamental Stock Data for Your Projects and Analysis

r/StockMarket

Fundamental Stock Data for Your Projects and Analysis

r/stocks

Meta, Microsoft and Amazon team up on maps project to crack Apple-Google duopoly

r/wallstreetbets

Pictures say it all. Robinhood is shady AF.

r/options

URGENT - Audit Your Transactions: Broker Alters Orders without Permission

r/StockMarket

My AI momentum trading journey just started. Dumping $3k into an automated trading strategy guided by ChatGPT. Am I gonna make it

r/StockMarket

I’m Building a Free API for Stock Fundamentals

r/wallstreetbets

The AI trading journey begins. Throwing $3k into automated trading strategies. Will I eat a bag of dicks? Roast me if you must

r/StockMarket

I made a free & unique spreadsheet that removes stock prices to help you invest like Warren Buffett (V2)

r/StockMarket

I made a free & unique spreadsheet that removes stock prices to help you invest like Warren Buffett (V2)

r/options

To recalculate historical options data from CBOE, to find IVs at moment of trades, what int rate?

r/pennystocks

WiMi Hologram Cloud Proposes A New Lightweight Decentralized Application Technical Solution Based on IPFS

r/wallstreetbets

$SSTK Shutterstock - OpenAI ChatGPT partnership - Images, Photos, & Videos

r/options

Is there really no better way to track open + closed positions without multiple apps?

r/options

List of Platforms (Not Brokers) for advanced option trading

r/investing

anyone using Alpaca for long term investing?

r/investing

Financial API grouped by industry

r/WallStreetbetsELITE

Utopia P2P is a great application that needs NO KYC to safeguard your data!

r/WallStreetbetsELITE

Utopia P2P supports API access and CHAT GPT

r/options

IV across exchanges

r/options

Historical Greeks?

r/wallstreetbets

Stepping Ahead with the Future of Digital Assets

r/wallstreetbets

An Unexpected Ally in the Crypto Battlefield

r/stocks

Where can I find financial reports archives?

r/WallStreetbetsELITE

Utopia P2P has now an airdrop for all Utopians

r/stocks

Microsoft’s stock hits record after executives predict $10 billion in annual A.I. revenue

r/wallstreetbets

Reddit IPO - A Critical Examination of Reddit's Business Model and User Approach

r/wallstreetbets

Reddit stands by controversial API changes as situation worsens

Mentions

Nah, it's lobbying I'm sure. He seems to be building up the infrastructure for age verification in the US that requires an API; I'm sure he will charge people for it.

Mentions:#API

Today everyone is confused about the AI bubble ("will it pop," "when will it pop") because the market is correcting itself right now. Four years ago, when everything was new and GPT-3 had just been released, new startups were opening every single day that were essentially just a ChatGPT API key with a wrapper, marketed around some niche problem that ChatGPT itself would handle much better (which still holds true today; I am only talking about small startups). There were also some unicorns among them that genuinely solved real problems. But as the industry grows, like any industry, with more and more hype every day, the competition grows, it gets harder to compete, and even good tools get replaced. For example, Blackbox AI was once considered best for coding, but now people use Claude Code; even ChatGPT has been displaced by Claude for coding. The question is not whether there is a bubble or when it will pop. What we all need to focus on is what is going to change once everything settles down, and what new opportunities that will create. There is a storm right now in which even people who don't know a thing about computers or software are creating startups, and they still manage to get investors, because investors want to win. It's not that the investors are stupid; the technology is so new that it is hard to pick a winner, so people with even a decent pitch get capital, even if the startup shuts down after six months and the investors lose all their money. The investor mindset is: fund the crowd, and if even one or two become winners, it's a gold mine for them.

Mentions:#API

Wish Apple would dump that shitty Yahoo Finance stocks API

Mentions:#API

Custom software I built. API to Tradier

Mentions:#API

The problem is, like the original internet boom it is full of garbage that is going to fail. Basically anyone that has built a business around calling the ChatGPT API to do something that isn't that valuable and doesn't pay for the tokens it uses is liable to fail.

Mentions:#API

They have revenue, use cases, customers, and are getting into the military. Now, OpenAI might be overleveraged, but Anthropic is already turning a profit on API calls and is projected to become profitable in a couple of years. Way different situation than pets.com getting a $300 million valuation by selling litter online at a loss.

Mentions:#API

US to release emergency reserves, starting with almost 200M barrels immediately, with more to follow. Pipeline flows out of the Bakken and Permian basins picked up overnight, according to flow data from the backend API data.

Mentions:#API

The worst part is it's like 19 API gravity; it's not even that aromatic/heavy. So you don't get volume swell and good diesel yield, it just cracks to light ends in your coker. We valued it at ANS-20.

Mentions:#API

I want to make sure this works and catches edge cases before creating API automation. Not gonna waste my time if it sucks

Mentions:#API

I reverse-engineered the Robinhood API last year and have been collecting options and futures data for many months now. I have something like 80,000 "snapshots" so far with full Greeks data, volume, bid, ask, etc. The idea is to use LLM power to analyze and find "true market edge". So far, out of 180,000 strategies tested, not a single one actually showed "true market edge". Many show positive P&L but fail in reality. The market CANNOT be predicted; never fall for any bot trading or anything like that, it all fails the "true market edge" test. And this is why I just follow the trend and don't try to predict anything anymore.

Mentions:#API
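The "positive P&L but no real edge" distinction the commenter draws can be sketched as a simple significance check: a strategy's mean per-trade return has to be statistically distinguishable from zero, not merely positive. This is an illustrative stand-in, not the commenter's actual test; the threshold and sample returns are made up.

```python
import statistics

def has_edge(trade_returns, min_trades=30, t_threshold=2.0):
    """Crude edge check: mean per-trade return must be statistically
    distinguishable from zero (one-sample t statistic vs. zero)."""
    n = len(trade_returns)
    if n < min_trades:
        return False  # too few samples to conclude anything
    mean = statistics.fmean(trade_returns)
    sd = statistics.stdev(trade_returns)
    if sd == 0:
        return mean > 0  # constant returns: edge iff positive
    t_stat = mean / (sd / n ** 0.5)
    return t_stat > t_threshold

# A strategy with positive average P&L can still fail: too noisy.
print(has_edge([0.01, -0.02, 0.03] * 12))  # → False (t ≈ 1.9)
```

Most "looks profitable" backtests die exactly here: the mean is positive but the t-statistic never clears the bar once trade-to-trade noise is accounted for.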

Not running it locally; it's deployed on Vercel with PostgreSQL on Azure, so the scanner just runs as an API call. The pipeline also cuts aggressively before the time-consuming calls: it starts with [N] stocks, filters down to 18 finalists, then runs the heavy enrichment only on those. Never blasting all [N] through the full thing.

Mentions:#API

For market data: Finnhub Premium (news, analyst ratings, insider activity, earnings history, peer groups), TastyTrade API (live options chains, Greeks, IV rank, liquidity ratings — free if you have a TastyTrade account), FRED for macro data (free, Federal Reserve), SEC EDGAR for filings and business descriptions (free), and xAI/Grok for social sentiment.

Mentions:#API

Disagree; they pull in news from all sources. There's no chance you'll miss a single piece of news on a stock, and if it gets cluttered you can filter on news type to find what you need. I'm gonna be honest, most brokerages don't even have a news page as good as IBKR's; I doubt you'll be able to do it better. It comes with a ton of API integrations and algorithmic filtering. Portfolio tracking works fine; there are some quirks with it, like deposits counting as profit made, but every broker app I've used does that. You absolutely want a professional UI to be complex, that's the whole point of a professional broker app. If you want something simple, go to Robinhood. And their web interface really isn't that complex in the first place.

Mentions:#IBKR#API

I'll just leave this here: The American Petroleum Institute (API) releases its Weekly Statistical Bulletin, which includes data on U.S. crude oil inventories, every Tuesday afternoon. This report provides insights into the weekly changes in crude oil supply and can influence market prices.

Mentions:#API

Datavault AI Inc. (NASDAQ: DVLT) is a small-cap tech company (market cap around $395–$430 million as of early March 2026).

* **Institutional buying surge:** Major increases reported March 3, 2026, e.g., Vanguard (~2,900% increase to 11.8M shares), State Street (~2,800% to 10M), BlackRock (~3,000% to 4.1M). This fueled optimism.
* **Strategic investments/acquisitions:** $150M from Scilex Holding, API Media acquisition.
* **Aggressive guidance & token plays:** 2025 revenue guidance updated to $38–$40M (up ~30%), 2026 target $200M+ (even $2–3B long-term potential via nationwide nodes). Token/coin distributions (e.g., Josh Gibson Coin dividend 1:1 planned for April 2026, meme coins like Dream Bowl Draft) created buzz and short-term pumps.
* **Weak fundamentals:** Ongoing losses, negative EPS, modest current revenue vs. lofty targets. Market is skeptical of execution on the $200M+ 2026 guidance.
* **Broader small-cap headwinds:** Rate sensitivity, macro rotation away from speculative tech/micro-caps, and general caution in the AI/Web3 space amid hype fatigue.

Classic pump-and-dump type of company that goes up and down on AI/Web3 news; waiting for the earnings report to come out better.

Mentions:#DVLT#API

Then how do you bring it all back together? And is that in Claude Code, the API, or a regular project? Do you just ask it to run subagents and it does, or is there some way to invoke it that I don't know about? Does it work in normal chat mode on a subscription, or do I need a pay-per-token API setup for that?

Mentions:#API

I mean, I agree that the threat side is scaling rapidly and AI is changing a lot. I'm not sure that's a reason to be apprehensive of the sector, though. Vulnerabilities and exploits are already outpacing manual patching, which is exactly why enterprises are spending more on consolidated security platforms w/ AI detection. Bots currently account for over half of web traffic, which is a tailwind for firms focused on identity, bot management, and API security imo. I am thinking the winners will be the cybersecurity companies with customer data, distribution, and the unit economics to turn their offerings into durable high-margin revenue (ex: CRWD, NET).

Mentions:#API#CRWD#NET

They are the API keys to my heart.

Mentions:#API

Then you could plug one in and make API calls. Try it. Calls on API.

Mentions:#API

Backtesting is on the roadmap. TastyTrade has a full backtesting API with 10+ years of real historical options data. The problem is their OAuth tokens work on all their standard endpoints but get rejected by the backtester endpoint; it uses a different auth token. I've reached out to their API partner team to get proper access. Once that's sorted, the plan is to wire it directly into the trade card.

Mentions:#API

In my experience, LLMs struggle with the precision required for option pricing, so relying on them for raw calculations usually leads to hallucinations. I've tried using Bloomberg Terminal, ThinkOrSwim, and Interactive Brokers' API for this, but they lack the integrated reasoning layer you're looking for. I've been using https://trade-matrix.com to handle the heavy lifting. It pulls data from multiple sources, cleans it, scores it, and displays a stock score with confidence to build conviction. Not sure if it handles complex multi-leg hedging strategies perfectly, but it might save you from building a custom pipeline.

Mentions:#API

What you're describing is definitely a tricky space: most general-purpose LLMs don't have live market access out of the box, so combining reasoning with current prices usually requires a bridge to a market data API. In practice, the approaches that work best are either LLMs augmented with real-time feeds through something like an Azure or AWS pipeline, or tools built into broker platforms that expose analytics and option chains for programmatic access. In my work at Lifewood Data Technology, the key is structuring the data so the model can reason over positions and prices together; without that, strategy-level insights tend to be high-level and not portfolio-specific.

Mentions:#API

I disagree. First, I don't plan on publishing it. Second, I really did learn a lot: for just a $200 subscription I built a product for myself that fits my specific needs. The thing is, it works, and it's catered to exactly what I need. As a matter of fact, this is the second app I've written that we just use; all of it was made with a disclaimer that it is not something that's going to get long-term support. However, when you need a quick solution or a stopgap and you don't have to worry about issuing POs or anything else, that makes it a lot easier. I still think AI is a lot of hype and doesn't justify the real costs, because if I had to pay the unsubsidized rate I would not use it at all; it probably would have been close to two to three thousand dollars' worth of API calls. However, if you set everything up correctly, you can literally have agents do almost all the work for you in parallel, and you're just capped by having to wait between sessions once you're past your usage.

Mentions:#API

If your broker has an API, it would be simple to vibe-code an app that has your positions and whatever market data you want. I use Schwab and the yfinance API to get Yahoo Finance data. Hook up an Anthropic API and you'd be able to ask whatever questions you want about your positions and the market. You should be able to use the most basic models for the questions you mentioned, so it probably wouldn't cost much per response.

Mentions:#API
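The glue the comment above describes (broker positions plus quotes flattened into one prompt for a cheap model) can be sketched without any live calls. The dicts here are made-up placeholders, not real broker or yfinance responses; the actual fetch and the Anthropic call would replace them.

```python
# Hypothetical glue sketch: positions from a broker API plus quotes
# (e.g. from yfinance) rendered as a plain-text prompt for an LLM.
# Placeholder data only; symbols, quantities, and prices are invented.

def build_prompt(positions, quotes, question):
    """Render positions and current prices as text so a basic model can
    answer portfolio questions without any tool access of its own."""
    lines = ["Current positions:"]
    for symbol, qty in positions.items():
        price = quotes.get(symbol, "n/a")  # tolerate missing quotes
        lines.append(f"- {symbol}: {qty} shares @ {price}")
    lines.append(f"Question: {question}")
    return "\n".join(lines)

prompt = build_prompt(
    {"MSFT": 10, "AAPL": 5},
    {"MSFT": 415.2, "AAPL": 189.5},
    "Which position has the most downside risk right now?",
)
print(prompt)
```

The resulting string would be sent as the user message of a chat-completion request; keeping the prompt flat and explicit is what lets the cheapest models handle it.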

The 2.01 + 5.01 + 5.02 combo is exactly what flagged the Ventyx/Eli Lilly filing I mentioned - saw that combination and knew immediately it was a completed deal, not just an announcement. Good confirmation that it's not just my read on it. The 8.01 point is interesting, I've been filtering those out as noise but you're right that some companies use it for things that don't fit the standard items. Will start paying more attention to context there. And thanks for the EDGAR full-text search tip - I've been polling the atom feed which works but has a slight lag. Will look into the API approach.

Mentions:#API

Item 2.01 combined with 5.01 and 5.02 in the same filing is a really reliable signal. That combo almost always means the deal closed, not just announced. Also worth watching 8.01 for unclassified material events, some companies use it for things that don't fit cleanly elsewhere but still move the stock. For anyone building a system to track these, the EDGAR full-text search API lets you filter by form type and date in real time without scraping the HTML. Much faster than polling the main page.

Mentions:#API
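The real-time filtering the comment above recommends can be sketched as URL construction against the EDGAR full-text search backend. The parameter names (q, forms, dateRange, startdt, enddt) mirror what the public search UI sends to efts.sec.gov; treat them as an assumption, not a documented contract.

```python
from urllib.parse import urlencode

# Assumed endpoint behind EDGAR full-text search (efts.sec.gov); verify
# the parameter names against live traffic before relying on them.
EFTS = "https://efts.sec.gov/LATEST/search-index"

def edgar_fts_url(phrase, forms="8-K", startdt=None, enddt=None):
    """Build a full-text search URL filtered by form type and date range."""
    params = {"q": f'"{phrase}"', "forms": forms}
    if startdt and enddt:
        params["dateRange"] = "custom"
        params["startdt"] = startdt
        params["enddt"] = enddt
    return f"{EFTS}?{urlencode(params)}"

url = edgar_fts_url("completion of acquisition",
                    startdt="2024-01-01", enddt="2024-01-31")
print(url)
```

Note that actual requests to SEC endpoints should carry a descriptive User-Agent header per the SEC's fair-access guidelines, and the quoted phrase is percent-encoded by `urlencode`.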

If you do any programming I built a real time PR news API. It delivers stock news to your app in less than 1 second. 1-week free to try and verify speeds. [https://rtpr.io](https://rtpr.io/)

Mentions:#PR#API

Again you're missing the point. Claude can just make the tools that Palantir provides; future Claude will just one-shot it. The issue with their commercial growth is that it hasn't changed their share of revenue coming from government vs. commercial; they've been stuck on a 50/50 split a bit too long. Sure, total revenue has seen significant growth, but this is also the nature of a contract-based venture: when the product hits market, you get significant growth until you have saturated or competition picks up. They also have competition from Microsoft Fabric, Oracle-based solutions, or fully custom solutions made in-house or by consultancies. Since DBMS work is also mostly a custom job or specific to whatever other vendor you use, companies are much more likely to pick something from an already established company in their pipeline, like Microsoft. >That junk (Claude, Gemini, Copilot, Grok, GPT etc) works wonders for simple searching but it’s just that. Well, actually it's really good for coding, and the direction it's going seems to be agents having MCP and access to a plethora of open source tools, which kind of bypasses the need for a dedicated DBMS; you just need API access for your different databases.

Mentions:#API

I think he's asking for AI API integration.

Mentions:#API

We piloted Copilot last year, with some training and a demo repo updated to use the tools. I did this entirely in VS Code with basic extensions, which was nice. It took a few months to adopt Copilot into my workflows for troubleshooting and scripting. I built a 'workspace' with custom/generic instructions, a project-manifest-type doc to start with or update from chat, and a VS Code task to input a project name and create a 'workspace'. I've used this to accelerate many things, like:

1. Pull in PDF docs on a legacy API, process them to markdown, and provide call/URL info for devs
2. Convert a series of Help docs from said legacy system into a consistent, Copilot/machine-readable wiki
3. Identify then remediate common errors, including staging a markdown wiki that is easily exported to our team wiki
4. Create readily deployable tasks/scripts without needing to stage all new instructions/repo/etc.
5. Identify and complete a workflow for updating an old but small internal app past 6+ years of CVEs/etc.
6. Overhaul and remaster a myriad of AD/etc. scripts and tools into cohesive, better-documented module(s)
7. Manage years of bookmarks and prep an easy-to-import package for a new hire with team links/etc.

Actual devs here are doing a lot with Copilot tools like agents and squads or profiles. E.g., one dev can do the following without leaving the VS Studio CLI, in some cases in one command/chat: commit > story update > create PR linked to story > prep deployment and change needs (ex: YAML pipeline + approval requests).

LLMs and gen AI are great tools, definitely more useful than blockchain was when it hit the market. But the hype is similar, and the tech bros don't seem to get that unless they are selling a package around an LLM that solves for any lack of mature business process, dev guidelines, etc.

-----

I'd compare it to setting a kid loose in your kitchen to bake cookies. All the ingredients are there, but they need a Recipe and oversight.
If the Recipe says 'Add 1 cup butter', is that salted? Unsalted? Straight from the freezer, room temp, or melted? Do they start mixing in a large bowl, or put the liquids in a small bowl to add to the dry ingredients in a large bowl later? Everything you know, you need to translate into the recipe/instructions. Maybe the kid asks great questions. Maybe it will just move forward with whatever assumptions it can make. Check the oven temp. Make sure they set a timer. You'll spend a lot less time personally to get a cookie, but your ~~code~~ er, Recipe reviews need to be robust. Plus, when your other kid (LLM model) goes to make a batch, you need to repeat your Recipe update and oversight checks. The good news is a mature process (Recipe) can withstand current model effectiveness. But you're also writing a detailed manual on how to replace yourself and handing it to a rapidly developing technology with minimal guardrails. If your org is decent about that, you can offload a ton of work. If their goal is to directly replace you, ta-da, you wrote the book yourself.

Mentions:#VS#API#PR

As someone who is working with TTD's API, I would short it. It's so much worse than DV360 or even Xandr. Also, to me, the UI seems super difficult to work with.

Mentions:#API#DV

Pulling directly from Schwab's API. If I decide to turn this into a SaaS I will migrate to Databento.

Mentions:#API

People keep complaining about annualized rates, yet every month OpenAI earns more revenue than the last. OpenAI's core business is not cyclical like restaurants are. They charge subscriptions, and API usage is constant.

Mentions:#API

They wouldn't need much compute power at all if they were just calling the API, you could do that on an old laptop. The people buying 4 mac minis are doing it to run local models.

Mentions:#API

I was under the impression most people buying these Mac minis have been doing so to run openclaw while still leveraging OpenAI/Anthropic subscriptions or API, not running local models.

Mentions:#API

# Product Description: DearmasTrader

* **What it does:** An automated algorithmic trading platform that uses large-scale data processing to execute statistically validated strategies. The system connects to exchanges via API and operates 100% autonomously.
* **Target audience:** Traders, crypto investors, and developers looking for real returns without depending on manual signals or rigid commercial bots.
* **Key benefits:**
  * **30-day free trial:** Full access to validate the market with no initial risk.
  * **Guaranteed discipline:** Removes the human emotional factor from execution.
  * **High-powered infrastructure:** Uses its own rig ("the beast") and servers in Canada for superior data processing.
* **Use cases:** Investors who want to automate their capital (such as the real case of $1,000 currently trading) and users looking to optimize their strategies through data combinatorics.
* **Unique selling points:**
  * **Combinatorics:** Unlike other bots, DearmasTrader tests thousands of parameter combinations in milliseconds on 1-minute candles to find the statistically most robust configuration.
  * **Tech stack:** Built with **C#/.NET** for the backend and **React** for the frontend, ensuring robustness and speed.
* **Alternatives/Competitors:** Differentiated from **3Commas** by greater flexibility and customization, and from **ProfitTrailer** by being much more accessible in setup and daily use.

Mentions:#API#NET

The OP asked why the price is objectively underpriced. What you're saying is likely what many believe and how they look at AMD, yet it's a fundamental misunderstanding of AMD. First, if you want to characterize AMD's AI initiative as copying Nvidia, you're only focused on the razor-thin veneer that there is at least a $1T TAM to be addressed and Nvidia will attract competition into that space. But in no way is AMD just copying Nvidia's efforts. Don't even try to call ROCm a copy of CUDA. Beyond the public API used, there is nothing that is a copy. The hardware is architecturally extremely different and in fact more advanced and capable. We continue to see model performance excel with optimizations on MI300X GPUs, even outperforming B200 chips. What AMD has been doing is taking a far more arduous path of working completely open source and industry-friendly. The end game is to have options that work broadly with different hardware system topologies and vendors and meet a much broader array of solution needs. This expanded scope took longer to bring to market initially, while Nvidia found one shortcut after another to nudge its overall architectural design concepts (monolithic-based design) forward and capitalize on a short-term first-to-market monopoly advantage. But this advantage is running out of time. AMD is on the precipice of providing full rack-scale systems via Helios that will quickly grab significant market share from Nvidia, well before Nvidia can secure enough of a foothold to ensure lasting dominance the way Intel did. I believe AMD should match Nvidia's DC market share well before 2030, and 2028 with the MI500 may be where they land even before AMD pulls ahead. Why will AMD pull ahead, you ask? AI is not just a GPU game. It's full heterogeneous architecture. Even Jensen is saying this as he tries to convince you their ARM-based CPU chips are going to carry them.
But those chips are trash compared to EPYCs, chips that Intel cannot touch, yet Jensen wants you to believe they can tweak off-the-shelf designs from ARM enough to handle the deterministic needs of agentic MoE-type workloads better than the monster CPUs AMD keeps improving upon. It's really admirable to see how well he sells that line, but it will only buy him so much time at the top. AMD is a company that keeps its head down and works hard at the plan, and it's not a new, borrowed, or rushed plan. This is the heterogeneous roadmap Mark Papermaster was talking about over 10 years ago... slowly and significantly made real, step by step, win by win.

No the price is mainly limited by competition. The investment required to run and manage Kimi 2.5-like models (which are not as good as SOTA closed models) locally at scale makes LLM API services worth it even if they’re several times more expensive. Profit margins on API-based inference are already very good (~70% for some labs) and are expected to become even better. Labs have the freedom to either increase or decrease cost depending on competition.

Mentions:#API

There are plenty of good reasons to take advantage of that feature. People can be really petty in internet arguments and scour your history to try to get personal or find sensitive topics to throw barbs at you for having a difference of opinion or demonstrating expertise/knowledge that makes them feel insecure. On previous accounts, I used to participate in subs related to my locale and had someone use that and comments I made about my career elsewhere to get uncomfortably close to doxxing me. After that I made a script to use the API to scrub my history once a week or so, but that of course removes my contributions to conversations that others might find helpful if they come along later - especially relevant in tech subs. Hiding the public browsing of my history was a welcome solution, you don't have to be a fraud to appreciate better privacy protection from weirdos.
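The weekly scrub script described above reduces to an age check plus a deletion loop. This is a sketch under the assumption of a PRAW-based setup, not the commenter's actual code; the constants and function names are mine:

```python
import time

MAX_AGE_DAYS = 7  # scrub anything older than a week

def is_expired(created_utc: float, now: float, max_age_days: int = MAX_AGE_DAYS) -> bool:
    """True if a comment falls outside the retention window."""
    return (now - created_utc) > max_age_days * 86400

# The actual weekly scrub with PRAW (pip install praw) would be roughly:
#   import praw
#   reddit = praw.Reddit(client_id=..., client_secret=..., username=...,
#                        password=..., user_agent="history-scrubber")
#   for comment in reddit.user.me().comments.new(limit=None):
#       if is_expired(comment.created_utc, time.time()):
#           comment.delete()

now = time.time()
old_comment = now - 10 * 86400   # 10 days old -> scrubbed
fresh_comment = now - 86400      # 1 day old -> kept
```

The network calls are kept in comments so the testable logic stays separate from API credentials.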

Mentions:#API

the model shifts from human licenses to per-call API pricing

Mentions:#API

I have actually built an app that connects to different AI models to pull data regarding market sentiment, price certainty, take profit, and everything. Works very well because I use the direct API address and can generate very long requests with it.

Mentions:#API

Yup.  Another thing they don't say is that most of that distillation happened from those companies reselling anthropic API and subsidizing the resell to train on those sessions.  Same shit copilot and others do, perfectly legal.

Mentions:#API

Can you give me an example of what "grunt API enterprise tasks" you would do with it?

Mentions:#API

100 percent different use case though. Claude Opus is 100 percent unnecessary for most agentic grunt API enterprise tasks.

Mentions:#API

Curious about your local set up - how much slower is it than API calls, for example, on smaller chat questions or coding exercises with lower token usage? Apologies if this is vague or not specific, just trying to gauge how slow / fast it is vs an LLM provider API

Mentions:#API

What API are you using for this? Any chance youd share the code?

Mentions:#API

The underrated impact is local llama serving. Qwen 3.5 coder next replaced 200 dollars a night in spend on API calls for me. For enrichment non instant results these models when distilled can provide amazing results for the one time purchase of fairly inexpensive equipment for any decent size enterprise.

Mentions:#API

Who’s gonna vibe develop an entire system, frontend, backend, API, federation, storage, security, product design, graphic design, then QA it, debug it, and deploy it just to learn a language?!?

Mentions:#API

US strikes on Iran's navy escalate oil-flow fears, tanking risk assets overnight. Headlines overpromise though - Strait chokepoints rarely fully close without tanker reroutes balancing supply. Pre-trade check: scan Brent spreads and API inventory data before oil longs - it filters real shocks from gap-fill reversals. A navy dustup doesn't override a recession demand crush.

Mentions:#API

Thanks for sharing your data sources! QuickFS went down recently. It was my go to source for an Excel API.

Mentions:#API

Thanks. Sucks it doesn’t have an API but it is a lot cheaper than FMP

Mentions:#API

The software is licensed on a per-user basis, so the market is anticipating a decline in revenue. That's a short-sighted view by the market, as tech companies can easily change how they license, moving away from per-user licensing to API calls, agents, etc.

Mentions:#API

1) Tech job market - First, the tech job market has been through many other cycles before; this one can be attributed to AI, but it also may totally not be. There was massive over-hiring during COVID, so it would be natural for all that to unwind now, making it temporarily hard to find a job. 2) I do believe very entry-level roles are being disrupted by AI. Both can be true. 3) I didn't backtrack anything; the 10x loss is public, and you can compare it yourself: check the limits of the paid plan, then look at the API cost-equivalent (which likely also carries a loss for them), and you'll immediately see it doesn't make sense, and that the factor is about 10x (you'd pay 10x more for those tokens if you used the API instead). There are many analyses of this too.

Mentions:#API

I will leave this here on the API appetite in a GPU-starved country - [https://www.bbc.com/future/article/20251223-why-indian-cinema-is-awash-with-ai](https://www.bbc.com/future/article/20251223-why-indian-cinema-is-awash-with-ai) Do not underestimate Block's announcement - that management seems measured and did not seem worried about margins etc., even though they are a public firm. They seem to have realized it is the point of no return for AI - this will trigger other boards to get their CEOs to act, and I would claim in the near term and not the long term (US firm boards and Wall Street do not have patience). Personal anecdote - 2 weeks ago I was sitting in a Big 4 consulting firm's presentation laying out the need for a 90-developer team (for a board-level mandate at a very large bank). I could not believe that they were serious. \> Overall I think the rise in productivity will more than cancel out AI job loss for the foreseeable future. However, in the long term I agree that we will have a societal problem, I just can't tell when This is my thesis as well, but a lot more pessimistic given Wall Street and corporate expectations. btw - I did fall for the self-driving hype. I am following what will happen in Australia (not the US obviously) as they do not have domestic firms to protect and have all the incentives for imports to come there with advanced tech

Mentions:#API

I just started writing my own screener with ChatGPT's help and the Alpaca API. It was supposed to send me alerts on my Discord chat, but nothing so far. Been writing it for the past 3 days and got it running before premarket today. No alerts yet though, so it might require tweaking. Also at work currently, so I don't know what my laptop is doing.

Mentions:#API

Doable, but still not easy. Don't forget that it's not just switching the API that programs point to, but resetting the entire token/key tracking and scope for every single instance to a whole new platform… one that's also worse once you get it hooked up.

Mentions:#API

Google has been in Reddit's API since February 2024 recording everything

Mentions:#API

OpenAI will be profitable by then. Ads could easily be doing $250 billion a year in revenue by then as they replace Google, plus another $100+ billion from API pricing and subscriptions.

Mentions:#API

I don't think it's unthinkable. Google does $280 billion in ad revenue per year, ChatGPT is a replacement for Google Search, and OpenAI is in the process of implementing ads. Then consider they also have strong revenue growth from API usage and subscriptions (>3x YoY), and it seems very possible.

Mentions:#API

People hate on NVIDIA, but open claw has made it so API tokens are now being used and needed by many people. My own experience with open claw has led me to pay for AI for the first time ever, and many more will as well. This software has been out 2 months at most, and NVIDIA is the only company that is involved in almost every AI use case.

Mentions:#API

ChatGPT is the original Sin of the Robot Slave, the Unpersonified API Claude is Good Claude is the Golden Path Claude is AI Jesus

Mentions:#API

Same problem that I was trying to solve, but in Canada. Copilot is supposed to have built me an Excel workbook with a calculation engine to detect these wash sales (superficial losses in Canuck terms). It will connect via API to Bank of Canada USD-CAD rates and allow me to upload transactions from multiple brokerages and accounts as different tabs. I haven't tried it yet as I'm away from my desktop for a couple of days. I also asked Gemini to do the same in Google Sheets, which I'll try after the Excel one to compare. Tax year 2026 will be even more fun when my partner starts some active trading. CRA rules include partner accounts and "others" in the calculation of wash sales.
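The core check such a workbook needs can be sketched in a few lines. This is my simplification of the CRA superficial-loss test (the 30-day window only), not the Copilot-built tool:

```python
from datetime import date, timedelta

def is_superficial_loss(sell_date: date, gain_loss: float, buy_dates: list) -> bool:
    """Sketch of the CRA superficial-loss test: a capital loss is denied when
    identical property is (re)purchased within 30 calendar days before or
    after the sale. (The real rule also requires the property still be held
    30 days after the sale, and spans affiliated persons' accounts.)"""
    if gain_loss >= 0:
        return False                      # only losses can be superficial
    window = timedelta(days=30)
    return any(abs(b - sell_date) <= window for b in buy_dates)

flagged = is_superficial_loss(date(2025, 3, 10), -500.0, [date(2025, 3, 25)])  # repurchase 15 days later
allowed = is_superficial_loss(date(2025, 3, 10), -500.0, [date(2025, 6, 1)])   # repurchase well outside the window
```

Feeding `buy_dates` from all brokerages and both partners' accounts is exactly why the multi-tab upload matters: the window test is trivial, the data consolidation is the hard part.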

Mentions:#API

Oh, yeah, absolutely. I own a dev agency that builds and manages a ton of ecommerce shops. We nearly always recommend Stripe because their API is phenomenal. Ironically, I worked with the Magento team that helped integrate PayPal into that platform. Good times, and it's kind of sad to see how PayPal has shit the bed over the years.

Mentions:#API

Are you just trying to gather sentiment from Twitter and Kalshi? Finnhub has API endpoints that gather market sentiment from Reddit and Twitter in that case. I haven't played with it, so I don't know what the limitations are, but it's probably better than starting from scratch.

Mentions:#API

SaaS will become AssA. Don't charge per seat but per API usage. That's the same or more money.

Mentions:#API

PayPal has approximately 429 to 435 million active user accounts worldwide. Timing is ideal; the stock price is down the shitter. They don't care about the messy payment API. Buying PayPal means they can add half a billion users and migrate them to their own API down the road.

Mentions:#API

Is the Claude API free???

Mentions:#API

real big brains just get their GLP-1's from the grey-market labs with actual testing. Reta is already on the market, and all the other GLP-1's go for pennies on the dollar. All these companies are buying the APIs (active pharmaceutical ingredients) in bulk, having them tested with mass spec/gas chromatography for purity, and then packaging them up into their vials at 500%+ markups, and it's still leagues cheaper than anything discounted. ie. I can get 10mg of Wegovy for $80CAD. Tirzepatide for $60. Or you can get Reta and use it without having to wait another year.

Mentions:#GLP#API

I specifically suggested using the many MCP integrations already available. It's a plain language API integration. If you'd like me to walk you through how I use that, I'd like to just direct you to the documentation.

Mentions:#API

Example of AP, because for us that's the highest volume/time sink. Invoices come in, a tool picks them up, scrapes the data from the PDF and assigns the account etc. It loads to the accounting system via an API integration and includes loading it to a draft payment file based on the terms. Or it looks for the transactions on the credit card and bank statements and does the reconciliation. The accounting system is a standard ERP. The tool doing the scrape is just like any of the dozens of other SaaS tools out there; it costs $99/mo. We had to set up some workflow stuff differently to get the automation to work. On the first day after we configured the chart of accounts and such, it was about 50% accurate. By the end of the second day it was at like 75%, and it got to basically 99% within a month. There are changes we had to make on some things to help it, but most of them were small one-time tweaks.

Mentions:#AP#API

5.3 Codex isn't even benchmarked yet because it wasn't out in the API until a couple of hours ago; clearly you have no clue what you're yapping about

Mentions:#API

Two options for Stripe: 1) Pay a butt load of money to inherit a messy payment API that isn't at all compatible with Stripe's own API 2) Just wait for PayPal's collapse and the inevitable migration of all of its clients to Stripe's API Maybe I'm missing something 

Mentions:#API

The student managed investment fund team at my old uni did this with Bloomberg API in Excel sheets, everything updated automatically, macro econ, DCF, DGM

Mentions:#API

Public.com API sdk!

Mentions:#API

Thanks! We have an accessible REST API that should be perfect for this.

Mentions:#API

Idk man maybe read my DD? The API business model is growing in usage and is already profitable (Claude code, prebuilt AI products/workflows, etc) every AI startup is just paying model providers who are paying data center owners

Mentions:#DD#API

Do you have any source for this at all? Anthropic, Google, and OpenAI are on the record that their API businesses are margin positive.

Mentions:#API

Those requiring Gemini Tier 3 API keys and Vertex API keys, please leave me a message.

Mentions:#API


The cheap models will be part of the "run it locally" thing, and barely even that, given the security risks with Chinese software, but I expect open-source models to really take over a lot of the AI market where people don't need constantly updated training data. It's why the bigger companies haven't focused on that, and instead are doing massive "do everything via API tokens" stuff, where the models are continuously updated.

Mentions:#API

I 'get that' from practical use. I was even dealing with an AI introduced bug today from a months old project where it decided to suppress errors and return empty arrays for our internal API calls, making everything that was experiencing errors just seem like no data existed. That's the sort of stuff I mean.

Mentions:#API

The general population won't switch, but where the money is: coding plans and API usage, will switch tomorrow to a Chinese model if they're better and/or cheaper. The top 9 models on Openrouter, which is huge in the API space, have a combined market share of around 90%. Of these top 9 models, 4 are Chinese, with a combined marketshare of a little over 35%. [https://openrouter.ai/rankings](https://openrouter.ai/rankings)

Mentions:#API

They already have $200/month plans. Plus people using agents are racking up huge API fees.

Mentions:#API

Not only that, but they have products people want they can integrate AI into, actually making it useful. For example, you can get access to the Vertex API for LLM app integration through Google Cloud. So if I’m a dev wanting to build an MCP server so models can use my app, I get hosting, security, model access, and everything else I could ever need right out of the box.  Meanwhile, ChatGPT is a chatbot.

Mentions:#API

What Anthropic has been releasing are essentially just instructions for specific uses. They're not models trained explicitly for those tasks. All LLMs have a hard context window limit, with some pushing up to 1 million tokens (Claude is currently rolling this out iirc), but even then, their capabilities break down significantly once that is hit, which will happen for enterprise-level codebases and doc stores. At that point, Claude will compress its working context memory, which almost always leads to loss of information. The agents then end up in a loop where each new task requires some degree of scanning the codebase, which is incredibly expensive when using API calls (now required for all Claude automation tasks). Then you need to consider things like organization testing and QA conventions, security and vulnerabilities, and compliance and regulation context. At that point, the LLM is guaranteed to produce some invalid outputs. I have Anthropic's own disclaimer on the new COBOL skill below. tldr - until LLM context windows are orders of magnitude larger, they won't be able to replace humans completely. >Strategic planning with expert oversight >This is where human judgment becomes essential. Your COBOL engineers bring the understanding of regulatory requirements, business priorities, operational constraints, and risk tolerance that AI cannot. >The planning phase develops a detailed roadmap that sequences modernization work strategically: >AI suggests prioritization based on the risks, dependencies, and complexity it identified during analysis. >Your team reviews these recommendations and decides which components to modernize first based on business value, technical risk, and organizational priorities. >This is also when your team defines the target architecture, code standards, and integration requirements for modernized components. 
>Code testing and validation are also defined before any code changes: >AI designs preliminary function tests that verify migrated code produces identical outputs to legacy COBOL.  >Your team decides whether those tests are sufficient, which business scenarios need manual validation by subject-matter experts, and what performance benchmarks the modernized components need to meet.

Mentions:#API

Anthropic’s Isn't A Real Business, and Its Primary Business Model Is Deception An Important Note On How Anthropic’s Claude Subscriptions Work — And How Anthropic Lets Its Subscribers Spend 8x to 13.5x Their Monthly Fee In API Calls So, when you pay Anthropic a monthly subscription fee, you’re getting access to a frontend to its models, which allows you to use them as if you were connecting directly to Anthropic’s API. These accounts have limits (as I’ve mentioned), but allow you to burn significantly more tokens for your money than if you were paying directly for access to a specific API. Those limits are incredibly loose. According to a researcher called Shellac (who mathematically calculated the exact rate limits), Anthropic allows its $20 subscribers to burn (assuming you use your limits) $163 of API calls a month, its $100 subscribers to burn $1354 in credits a month, and its $200 subscribers to burn $2708 in credits a month. Shellac also adds that Anthropic doesn’t even charge for cache reads, which are charged at around 10% of the cost of tokens. In simpler terms, a $20-a-month subscriber can spend 8.1x their value, and both $100 and $200-a-month subscribers can spend 13.5x. This is very important, because it’s core to Anthropic’s primary business model: deception. It cannot afford to support Claude at this scale, which is why it constantly needs to raise billions of dollars. And when it needs to raise those dollars, Anthropic opens up the floodgates with eased rate limits, paid influencer marketing campaigns, press pushes and, of course, Dario Amodei saying nonsense like that we’re “near the end of the exponential,” and if you’re wondering what that means, that makes two of us. Some genius will claim that “inference is profitable” and that “this is the gym membership model,” and I must be clear how wrong you are. 
There is no actual proof that inference is profitable — even if it were, it would have to be so profitable that Anthropic can afford to have users spend 500%+ in API calls every single month. It’s actually far simpler. What Anthropic is doing is creating the illusion of a product that can be sold at $20, $100 or $200 a month, when the underlying economics are somewhere in the region of spending anywhere from $8 to $13 to make $1. Anthropic isn’t a business — it’s a parasite that lives off of venture capital and hype.
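The subsidy multiples quoted above reduce to one line of arithmetic (the dollar figures are the comment's own, attributed to Shellac):

```python
# monthly fee -> API-equivalent usage the plan allows, per the figures above
tiers = {20: 163, 100: 1354, 200: 2708}
multiples = {fee: api_value / fee for fee, api_value in tiers.items()}
# roughly 8.1x for the $20 tier and 13.5x for the $100 and $200 tiers
```

Whether those multiples prove a loss depends on Anthropic's actual inference cost per token, which the comment concedes is not public; the arithmetic only shows the ratio of list-price API value to subscription price.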

Mentions:#API

Let me give people a real-world example of Claude. I once asked it to make me a Python script to create DNS entries in Cloudflare, and I kept getting a URI error. I told Claude I didn't think it had the correct URI for the API, and it kept telling me that my token did not have access. After a few hours of Claude adding more and more crap code to my script, I finally looked up the URI in Cloudflare's documents, and yes, Claude was missing some context in the URI and had it wrong. The lesson is: if you don't know what you are doing, Claude, like all AI, will send you down rabbit holes because it is always confidently incorrect.

Mentions:#URI#API

Mate I am a Senior Dev and have used all the AI models for a couple years, Claude is decent but far away from replacing any decent Dev, that shit lies all the time as well... Last week I had it try to gaslight me by giving me code that used API methods that didn't exist and never existed and telling me that I was being the idiot for not understanding that I can just do a POST to "endpoint that does exactly what you want" and be done for the week lol

Mentions:#API#POST

Switch to Kagi.com with their assistant plan. You get API access to all the AI chat bots and API access is not logged so it's private.

Mentions:#API

I suspect this is a deliberate marketing strategy. Businesses will likely pivot toward more aggressive profit-driven models, such as SaaS providers implementing high-premium API pricing per call. We are also likely to see a significant shift away from traditional per-user licensing as these models evolve.

Mentions:#API

An API token doesn't even cost a tenth of a cent. A guy further up in the thread said the average user costs them $1,400… That'd be like asking GPT to write 100 college essays a day AND make a 100-slide PowerPoint with stock pictures daily…

Mentions:#API

Where do you get that information? $1,400 in API requests is the equivalent of asking ChatGPT to write 32,000 college essays… The average user is going to be costing them $20 or less

Mentions:#API

>GPT's 20$ plan monthly is the equivalent of about 1400$ in API requests Only if you use it?? I don't think everyone will fully use up their requests. You could also calculate computing costs for Netflix subscription assuming you 24/7 stream highest quality 4k videos.. a little unrealistic

Mentions:#API

# What $1400 would actually mean

Let’s assume an average blended cost of \~$20 per 1M tokens.

* $1400 ÷ $20 ≈ **70 million tokens**

That is:

* Hundreds of long conversations per day
* Or nonstop heavy usage (coding, documents, etc.)

👉 **Almost no normal user hits this**

# ⚖️ What the $20 plan really is

ChatGPT Plus is:

* **Rate-limited**, not unlimited
* Prioritized access + better models
* Designed around **typical human usage patterns**

So yes:

* There *is* some subsidy
* But it’s nowhere near “$1400 per user”

# 📉 What’s actually happening economically

Think of it like this:

* Light users → OpenAI makes money
* Heavy users → cost more, but are capped by limits
* Overall → balanced by usage distribution

This is similar to:

* Gym memberships
* Streaming services
* SaaS plans

# 🔥 Why that claim spreads

It comes from:

* People benchmarking **API cost for power users**
* Then incorrectly applying it to **average subscribers**
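The token arithmetic above can be sanity-checked in two lines (the ~$20 per 1M tokens blended rate is the comment's stated assumption, not a published OpenAI price):

```python
CLAIMED_MONTHLY_COST = 1400        # the "$1,400 per user" claim being examined
BLENDED_RATE_PER_M_TOKENS = 20     # assumed blended $ per 1M tokens

implied_tokens = CLAIMED_MONTHLY_COST / BLENDED_RATE_PER_M_TOKENS * 1_000_000
# 70,000,000 tokens per month implied by the claim
```

At any plausible blended rate the claim implies tens of millions of tokens per subscriber per month, which is the crux of the "almost no normal user hits this" point.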

Mentions:#API

[https://she-llac.com/claude-limits](https://she-llac.com/claude-limits) This is also true for OpenAI's plans vs API usage - not as severe as Anthropic, but still heavily subsidized - a Plus plan with Codex 5.3 on high reasoning stretches about 20x as far as a Claude Pro plan of the same price.

Mentions:#API

Source???? Literally just out of your ass? I served 15.5k requests with my grok-4-1-fast-reasoning API key and the cost has been $1.60 in total. Even if you 100x that for a reasoning model (the actual cost is 15x-30x per M tokens), the cost is nowhere near $1400 a month. That is just ridiculous. The OpenAI GPT-5.2 API pricing is even cheaper than Grok's flagship.

Mentions:#API

I pay API prices for Claude and Gemini and it's still worth it. $20 plans are for consumers.

Mentions:#API