I've been using an embedded system based on a combination of FPGA and DSP for about ten years now. If I could go back in time and start over, I would ditch the FPGA. The reason I say that is because of the time costs associated with realizing any benefit of an FPGA design.
1) FPGA based design is "future proof", we can use the same hardware to interface with new sensors in the future!
Reality: we could spin a new rev of the circuit board faster than we can develop and debug the new interface in the FPGA.
2) everything can be done in parallel!
Reality: it's faster (development time) to put a soft core CPU on the FPGA and write linear C code to get the job done. When that solution runs fast enough, why spend more time optimizing?
Bad FPGA developers are hard to find, good FPGA developers are nearly impossible to find.
FPGAs are super cool. For certain niche applications they can't be beat. But most of the time, the development cost is just not worth it.
Sounds like the program I used to work on, though we were doing FPGA cards feeding into a small MPI cluster. The only real reason I can think of to use FPGAs in production is that you need ASIC-like functionality, but the total build out isn't worth the cost of taping out an ASIC.
On my first program the hardware had several FPGAs and CPLDs for that reason. That design was finalized in 1994, though. Today those chips could be replaced by microcontrollers. As microcontrollers and soft-core processors get more capable, the applications for FPGAs will decrease.
Any time you are dealing with massive amounts of data in a fairly restricted environment (something out in the field) FPGAs are about your only solution. Image processing is really one of the largest and best uses of FPGAs that I have seen.
I think they are very overblown for many things, though. I'm sure that eventually, when semiconductor scaling slows down (the end of Moore's Law), FPGAs will start becoming worthwhile for most things simply due to their efficiency. We are still a long way from that, though!
Do you have any examples of FPGAs used with image processing? I've done a little, and I've seen some academic examples. But I haven't seen anything in production.
The acquisition systems in almost all medical imaging, along with their noise reduction and such, are built with FPGAs. An MRI machine costs several million dollars or more, and GE sells a handful each year, so spending $10,000 on the FPGA it's built around makes more sense than spending millions to spin an ASIC that will be out of date the day the machine is put in the field. Not to mention the need to maintain these machines for decades, given the replacement cost. When an applications processor is too slow for the data acquisition and an ASIC isn't cost effective, nothing else will do.
Military cameras, high frame rate cameras, etc. Military cameras in particular, since they use uncompressed data (think tens of Gbit/s to deal with) and tend to be in mobile environments. Lots of image stabilization and other such things going on.
These types of cameras also tend to be low volume and high cost like wcunning mentioned.
Almost all their products are FPGA-based designs built on a similar core set of blocks, which are then leveraged across many products. This reduces both development and FPGA cost for them. Very smart approach.
I worked in medical diagnostic device design for a while, and there we needed FPGAs because literally nothing else could accept data as fast as we needed. I'm talking GHz speed ADC reading, with special purpose transceivers that are built on-die with FPGAs. The fact that we can both poll data that fast and do any processing of it at all is astonishing. That's where FPGAs are going today -- data acquisition systems for anything fancier than accelerometers.
I am truly excited that there is a push to popularize FPGAs. They often allow a fundamentally different approach to problems and afford a massive improvement in efficiency and flexibility when applied to a given task. This book is a very approachable high-level answer to the question "wtf is an fpga and why do I care?" but I'm a bit disappointed that it lacks much direction on where to look next for more depth.
I'll take the liberty to suggest a couple possibilities here:
Opencores is kind of a SourceForge for FPGA stuff. There are lots of interesting components there to mix into a project.
nitpick: This is my first encounter with the ASSP (Application-Specific Standard Product) initialism. It seems like a needless distinction from ASIC to me.
At Amazon scale, most of the time they could use an FPGA, they would be better served by an ASIC.
The only exception I can think of would be if they dynamically implemented different algorithms in an FPGA in a just-in-time model. Or, at least, on a regular basis.
I'd love to see a higher-level language binding for FPGAs like CUDA for GPUs. I've seen real experts make data fly through an FPGA, but it seemed like development was slow (no surprise given 8 hrs just to compile) and getting the clocking/pipelining right to run at really high speeds was non-trivial. In contrast, CUDA for GPUs was approachable enough for regular developers to get started, although it did take expertise to squeeze all of the performance out of them. Hard to beat FPGAs when you need low latency, though...
FPGAs are finally being used in some compute clouds. Specifically, Microsoft published some work in this space called Catapult: ftp://ftp.cs.utexas.edu/pub/dburger/papers/ISCA14-Catapult.pdf
I predict that we'll see FPGAs used more often, but there are certain hurdles in both the operational and programmability models that need to be solved first.
I've still never really seen any FPGA deployments beyond using them basically as DSPs in finance which is too bad as I would LOVE to play with something that can rip through the generation of stochastic processes quickly.