Nexthop AI Raises $500M: Why AI Tool Costs Keep Falling
A half-billion dollars for the wires behind AI
Nexthop AI just closed a $500 million Series B at a $4.2 billion valuation. The round was led by Lightspeed Venture Partners with Andreessen Horowitz, Altimeter, and Kleiner Perkins joining in. The funding was oversubscribed — more investors wanted in than there was room for.
If you run a restaurant in Charleston or a contracting business in Beckley, a networking startup’s fundraise probably doesn’t sound like your concern. But what Nexthop builds — the switches and hardware that connect servers inside AI data centers — is a key bottleneck in how fast and how cheaply AI tools reach your business.
What Nexthop AI actually does
Nexthop builds networking switches purpose-built for AI data centers. Think of them as the highway system inside a data center: they move data between the thousands of GPUs that train and run AI models. Without fast, reliable networking, even the most powerful chips sit idle waiting for data.
The company was founded by Anshul Sadana, who spent 17 years at Arista Networks, ultimately as COO, and helped scale that company’s sales from zero to over $5 billion. He holds patents in Ethernet latency and dynamic service insertion — technologies central to how quickly data moves between AI chips.
Alongside the funding, Nexthop unveiled three new networking switches built on open-source network operating systems such as SONiC and FBOSS. These compete directly with equipment from Cisco, Arista, and Hewlett Packard Enterprise.
Key facts
- $500M Series B at $4.2 billion valuation, led by Lightspeed Venture Partners
- Founded by Anshul Sadana, former Arista Networks COO (17 years, $0 to $5B in sales)
- Three new switches launched alongside funding, built on open-source networking stacks
- Offices in Santa Clara, Seattle, Vancouver, Dublin, and Bengaluru
- Market context: The AI networking market is estimated at $19.93 billion in 2026, projected to reach $213 billion by 2034
Why this matters for small business AI costs
The AI infrastructure buildout is staggering. Tech giants — Alphabet, Amazon, Meta, and Microsoft — are expected to spend roughly $650 billion on AI data centers in 2026. We covered Google’s $185 billion slice of that spending when it was announced in February.
All that money is building capacity. More capacity means more competition among AI service providers. More competition means lower prices for the tools you actually use — the chatbot that answers your phone, the scheduling system that books your appointments, the content tool that writes your marketing emails.
The numbers back this up. Token costs for large language models have dropped by a factor of roughly 280 over the past two years — from about $60 per million output tokens for GPT-4 at launch to around 20 cents for comparable output today. LLM inference costs have been declining roughly 10x per year, and Nvidia’s upcoming Vera Rubin platform promises another 10x efficiency gain on top of that.
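To make that decline concrete, here is a back-of-envelope sketch using only the figures cited above. The monthly usage number is a made-up example, not data from any provider:

```python
# Illustrative token-cost math using the figures cited in the article.
# The $60/M launch price and 280x decline come from the text above;
# the monthly usage is a hypothetical small-business workload.

GPT4_LAUNCH_PRICE = 60.00   # USD per million output tokens, GPT-4 at launch
DECLINE_FACTOR = 280        # two-year cost decline cited above

today_price = GPT4_LAUNCH_PRICE / DECLINE_FACTOR  # ~ $0.21 per million

def monthly_cost(tokens_per_month: int, price_per_million: float) -> float:
    """USD cost of generating `tokens_per_month` output tokens."""
    return tokens_per_month / 1_000_000 * price_per_million

# A business generating ~5M output tokens a month
# (chat replies, marketing copy, summaries):
usage = 5_000_000
print(f"At launch pricing:  ${monthly_cost(usage, GPT4_LAUNCH_PRICE):.2f}/mo")
print(f"At today's pricing: ${monthly_cost(usage, today_price):.2f}/mo")
```

Run it and the same workload drops from about $300 a month to about a dollar — which is why AI features that were unaffordable two years ago now show up in entry-level software tiers.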
Nexthop’s contribution is specific but critical. Networking has been a bottleneck. You can pack a data center with the fastest GPUs on the planet, but if the switches connecting them can’t keep up, the whole system slows down. Faster networking means faster model training, faster inference, and ultimately cheaper AI services.
The honest picture: per-unit costs fall, but watch the fine print
Here is where it gets more nuanced. The cost of each AI query is plummeting. But the broader technology cost picture for small businesses is more complicated than headlines suggest.
Memory prices have surged 50-55% quarter-over-quarter as manufacturers prioritize high-margin AI server components. Cloud providers have announced price hikes of 5-15%, with some premium tiers jumping 30-40%. The same AI infrastructure boom driving Foxconn’s record revenue is also creating supply chain pressure that raises costs for everyday business hardware.
The net effect for a small business owner: your AI-powered scheduling tool is getting cheaper, but your laptop, your cloud hosting, and your general IT costs may be going up. The savings are real but uneven.
What you should do
Immediate actions
- Audit your AI tool costs now. If you locked in pricing 12 months ago, check what your providers charge today — many AI tools have quietly dropped prices or added free tiers.
- Watch for bundled AI features. Microsoft, Google, and other platforms are embedding AI into tools you already pay for. You may already have access to AI capabilities you haven’t turned on.
- Budget for hardware cost increases. If you are planning equipment purchases in 2026, factor in 10-15% higher costs for servers, storage, and networking gear.
Watch for
- More infrastructure deals like Nexthop’s. Each one adds capacity and competition to the AI supply chain. As data center infrastructure expands — including right here in Appalachia — the long-term trend points toward cheaper and more accessible AI tools.
- Open-source networking adoption. Nexthop’s bet on SONiC and FBOSS could drive down enterprise networking costs broadly, not just for AI workloads.
The bottom line
A $500 million networking deal is not the kind of news that changes your Tuesday. But it is another signal that the AI infrastructure buildout is broadening — moving beyond GPUs into the full stack of hardware needed to run AI at scale. Every layer that gets cheaper and faster eventually shows up as a lower price tag or a better feature in the tools small businesses depend on.
The trend line is clear even if the path is bumpy. AI tools are getting more capable and less expensive on a per-use basis. The businesses that benefit most will be the ones paying attention to what is available today, not waiting for a perfect moment that never arrives.
If you are exploring how AI infrastructure and tools fit your business, see how we help small businesses get started.