You’re also interfacing with folks in a niche service industry here. If you’re a sales rep, you’re definitely being screened on whether you represent how people want to be perceived when using the product.
I'm just waiting to see how they monetize their userbase. Last time I checked they made about $1 per user per year (~20M active users / ~$20M in revenue). It's predominantly a whale market where traffic is driven by power users. Compare that to mobile gacha games like Genshin Impact, which make ~$10 per user a MONTH (~5M active users / ~$50M in monthly revenue)...
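Back of the envelope, using roughly those figures (a quick Python sketch; all numbers are the approximate ones above, not audited data):

    # Rough per-user revenue comparison; figures are the approximate
    # numbers quoted above, not audited data.
    users_a, annual_rev_a = 20e6, 20e6       # ~20M users, ~$20M/year
    users_b, monthly_rev_b = 5e6, 50e6       # ~5M users, ~$50M/month

    arpu_a = annual_rev_a / users_a          # ~$1 per user per year
    arpu_b = monthly_rev_b * 12 / users_b    # ~$120 per user per year

    print(f"gap: {arpu_b / arpu_a:.0f}x")    # ~120x more revenue per user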
I wouldn't wait. FPGAs weren't designed to serve this model architecture. Yes, they are very power efficient, but the layout/place-and-route overhead, the memory requirements (very few FPGAs on the market have HBM), slower clock speeds, and an all-around unpleasant developer experience make them a hard sell.
As someone who's worked at Xilinx before and after the merger, it's a surprise they were even able to sell it for that much. Altera has been uncompetitive with Xilinx on performance and with Lattice on low-end/low-power offerings for at least the last two generations.
I'm concerned about the future of FPGAs and wonder who will lead the way in fixing the abhorrent toolchains these FPGA companies force upon developers.
Yeah, I personally wondered whether AMD was just copying Intel, because apparently every CPU manufacturer also needs to manufacture FPGAs, or whether they actually had a long-term strategy in which cooperation between the FPGA and CPU departments is essential.
I think Xilinx did a fine job with their AI Engines, and AMD decided to integrate a machine-learning-focused variant into their laptop chips as a result. The design of the Intel NPU is nowhere near as good as AMD's. I have to say that AMD is not a software company, though: while the hardware is interesting, their software support is nonexistent.
Also, if you're worried about FPGAs, that doesn't really make much sense, since Efinix is killing it.
I briefly hoped that, like the integration of GPUs, there would be a broader integration of programmable logic into general-purpose CPUs, with AMD integrating Xilinx fabric and Intel integrating Altera fabric. But I could never imagine a real use case, and apparently there wasn't a marketable enough one either. Something like high-level synthesis maturing into a CUDA-like ecosystem always seemed like it would make a neat development environment for certain optimizations.
Agree on both.
As things like the PIO blocks on the RP line of micros get more common, micros will have I/O that can match FPGAs. For the low end, micros are generally good enough, or they gain NPU compute cores. It's the I/O that differentiates FPGAs.
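For anyone who hasn't played with PIO: here's a minimal MicroPython sketch for an RP2040 (pin 25 and the 2 kHz state-machine clock are arbitrary choices) that toggles a pin from a PIO state machine with cycle-accurate timing, completely independent of the CPU. That deterministic I/O is the FPGA-ish part:

    # MicroPython on an RP2040 (e.g. Raspberry Pi Pico).
    import rp2
    from machine import Pin

    @rp2.asm_pio(set_init=rp2.PIO.OUT_LOW)
    def square_wave():
        # One state-machine cycle per instruction; [15] adds 15 extra
        # delay cycles, so the pin toggles at a fixed, jitter-free rate
        # no matter what the CPU is doing.
        set(pins, 1) [15]
        set(pins, 0) [15]

    sm = rp2.StateMachine(0, square_wave, freq=2000, set_base=Pin(25))
    sm.active(1)   # ~62.5 Hz square wave on GPIO 25, zero CPU load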
There is literally no market for FPGAs as coprocessors/accelerators and there never was (that was some kind of pipe/hype dream before GPGPU took off). Where there is a market for them (prototyping ASICs, automotive, whatever, network switches, etc.) there is no replacement, but there is also no growth.
Depends entirely on how you define "growth". If you take AI and LLMs as your baseline of growth, then yeah, sure. But what else is growing?
FPGAs are getting cheaper with each generation, expanding into low-cost, high-volume markets that were unthinkable for an FPGA 10 years ago. Lattice has an FPGA family specifically targeted at smartphones, and I've been consulting for a high-end audio company that wanted to do some DSP, where a cheap FPGA was the best option on the market for the particular implementation they wanted.
It's not sexy growth, but it's growth. Otherwise, we wouldn't have had the explosion of low-end FPGA companies in the last few years.
Look up the SigmaStudio DSPs: DSP is insanely cheap to do, and there is absolutely no need for an FPGA. What that guy was doing was either nonsense or it was 1995. Either way those are irrelevant points, or rather you've provided examples that show FPGAs are an irrelevant, no-growth market.
(How many audio devices were using TMS320 DSPs before and after the iPod was a thing...)
My point is that FPGAs have become very cheap, competing with microcontrollers. I would agree that high-end audio manufacturers are about as rational as their customers.
If FPGAs are not a growing market, how come we have gone from 2 companies (we'll ignore niche space stuff) to ~10 in the last 20 years? Not many IC fields see growth in the number of manufacturers instead of consolidation...
I hear this a lot, but in my experience this isn't true at all.
A Versal AI Edge FPGA has a theoretical performance of 0.7 TFLOPS from the DSPs alone while consuming less power than a Raspberry Pi 5, and that's ignoring the AI Engines, which are exactly the ASICs you're talking about. They're more power efficient than GPUs because they don't need to pretend to run multiple threads, each with its own register file, or hide memory latency by swapping warps. Their 2D NoC plus cascaded connections gives them really high internal memory bandwidth between the tiles at low power.
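For the 0.7 TFLOPS figure, the usual back-of-envelope is DSP count x ops per cycle x clock. The numbers below are illustrative round assumptions, not datasheet values; they just show the kind of math behind such a claim:

    # Illustrative only: DSP count and clock are assumed round numbers,
    # not taken from a Versal AI Edge datasheet.
    dsp_slices  = 500        # assumed number of DSP slices
    ops_per_cyc = 2          # one fused multiply-add = 2 FLOPs
    clock_hz    = 700e6      # assumed DSP clock

    peak = dsp_slices * ops_per_cyc * clock_hz
    print(f"{peak / 1e12:.2f} TFLOPS")   # 0.70 TFLOPS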
What they are missing is processing-in-memory, specifically LPDDR-PIM for GEMV acceleration. The memory controllers simply can't deliver memory bandwidth that is competitive with what Nvidia offers, and I'm talking about boards like the Jetson Orin here.
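To put rough numbers on that bandwidth gap: peak theoretical DRAM bandwidth is just bus width x transfer rate. The configurations below are ballpark assumptions (a Jetson AGX Orin-class 256-bit LPDDR5 interface vs. an assumed 64-bit LPDDR4-4266 interface on an FPGA board), not exact specs:

    # Peak theoretical DRAM bandwidth in GB/s.
    def bw_gb_s(bus_bits, mega_transfers_s):
        return bus_bits / 8 * mega_transfers_s * 1e6 / 1e9

    print(bw_gb_s(256, 6400))   # ~204.8 GB/s, Jetson AGX Orin class
    print(bw_gb_s(64, 4266))    # ~34.1 GB/s, assumed 64-bit LPDDR4-4266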
FPGAs are neither here nor there and will always be niche. If you need the same thing many times, you make dedicated silicon. If you need many different things available all at once, you use a normal CPU. Only when ASICs are too expensive and CPUs are too slow can the FPGA shine.
They're competitive on perf/watt because they're designed to do one thing. But they're much more expensive than an ASIC which, if also designed to do one thing, would be better than the FPGA.
So Intel found optimists who think they can make Altera more competitive? It's a success. Success with Intel products would be better, and excellence at M&A is hard to convert into excellence at chipmaking, but it's better than nothing.