What do you mean? In image processing, or in general? In general they're used a lot wherever performance is important, especially in high-frequency trading, crypto mining, etc. They're also quite important for image processing in defence and medical imaging.
If you go to Nvidia's jobs website today you'll find they're actively hiring FPGA developers for ASIC prototyping. Obviously they're not dumping their 5090 RTL straight into some 10-metre-wide FPGA chip. First, they grab the largest FPGAs they can get their hands on; the FPGA vendors tend to have a couple of comically expensive, comically large SKUs specifically for this purpose. Then they put a few of them onto a development board and partition the design across the cluster of FPGAs with some custom interconnect, orchestration and DFT logic. FPGAs offer quite a compelling way of getting test mileage compared with simulation/emulation in software.
Every. If I were to guess, Nvidia probably uses Cadence's Palladium/Protium solutions [1]. They're basically the industry standard, and essentially everyone uses Cadence design tools for circuit design.
I can't quite grok the filter added to the DDS to generate twiddle factors for FFTs. I'll have to re-read that section a few times for it to sink in.
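Not sure which filter structure that section actually uses, but here's a rough numpy sketch of the general idea as I understand it: a DDS is just a phase accumulator addressing a sine/cosine table, and you get successive twiddle factors W_N^k = exp(-j*2*pi*k/N) by stepping the phase by 2*pi/N each clock. A recursive complex resonator (an IIR-style "filter" doing one complex multiply per sample) produces the same sequence without a table, which might be what the book is getting at; the bit widths and names below are made up for illustration.

    import numpy as np

    N = 64  # FFT size

    # Exact twiddle factors for reference: W_N^k = exp(-j*2*pi*k/N)
    exact = np.exp(-2j * np.pi * np.arange(N) / N)

    # 1) LUT-based DDS: phase accumulator addressing a quantised sine/cosine table.
    PHASE_BITS, LUT_BITS = 16, 10                       # illustrative widths
    lut = np.exp(-2j * np.pi * np.arange(2**LUT_BITS) / 2**LUT_BITS)
    acc, step = 0, round(2**PHASE_BITS / N)             # phase increment per twiddle
    dds = []
    for _ in range(N):
        dds.append(lut[acc >> (PHASE_BITS - LUT_BITS)]) # truncate phase to LUT address
        acc = (acc + step) & (2**PHASE_BITS - 1)
    dds = np.array(dds)

    # 2) Recursive resonator: multiply by W once per sample (one complex multiply,
    #    no table), at the cost of slowly accumulating rounding error in hardware.
    w = np.exp(-2j * np.pi / N)
    rec = np.empty(N, dtype=complex)
    rec[0] = 1.0
    for k in range(1, N):
        rec[k] = rec[k - 1] * w

    print(np.max(np.abs(dds - exact)), np.max(np.abs(rec - exact)))

In real hardware you'd also have to deal with phase-truncation spurs in the LUT version and error growth in the recursive one, which I'd guess is where the extra filtering/correction in that section comes in.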
Shouldn't one use some hardware description language (HDL) in such chapters? Or have I overlooked where the code is placed?
[1]: https://www.cadence.com/en_US/home/tools/system-design-and-v...
For large ASIC designs like this, companies often use numerous (12+) FPGAs connected via transceivers on dedicated simulation boards.