No math in the lab!
Bram Nauta scrutinizes artificially enhanced results at circuit design conferences.
The atmosphere at the conference was great. Many young PhD students like me were attending. I saw many fantastic papers on chip design! The results were sometimes ten times better than the state of the art. I saw very complex circuit diagrams built from many blocks, which conveniently hid all the complicated transistor-level details. The measured graphs looked impressive and the chip photographs were beautiful. I was excited about the burst of progress in our field.
Some of the older attendees seemed less enthusiastic. They kept asking questions. I didn’t understand their agitated tone during the Q&A sessions. They said the results were impossible and asked whether they were obtained with or without on-chip calibration. What a strange question to ask! Who cares?
I remember one presenting author being frank, making no effort to hide the fact that he had used calibration to improve the results. Pressed further by an old man, the author divulged that he had used Matlab to post-process his results. In Matlab, he could do the same thing you would otherwise have to do in the digital domain on the chip itself. However, that would have been a lot of work to design and measure, so Matlab was a great idea for enhancing his results. The old man said, “But you don’t even mention the word ‘calibration’ in your paper!” To which the presenter countered that the paper wasn’t about calibration but about the chip, calibration not included.
Another presentation was about a reference circuit. I learned at university that, normally, if you create a reference circuit on a chip, its value varies significantly due to process spread. You can calibrate that out, but the value remains temperature-dependent. Therefore, I understand why it’s very challenging to make an on-chip reference that stays highly precise across different temperatures.
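As a rough sketch of why (the symbols here are generic, my own illustration rather than anything from a particular paper), the output of an on-chip reference can be written as a polynomial around a nominal temperature $T_0$:

$$V_{\mathrm{ref}}(T) \approx V_0 \left( 1 + a_1 (T - T_0) + a_2 (T - T_0)^2 + \cdots \right)$$

A one-time calibration can trim the process-dependent value $V_0$, but the temperature coefficients $a_1$ and $a_2$ are still there, so the reference keeps drifting with temperature.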
This particular chip produced excellent results. Thanks to the Matlab post-processing, the reference was accurate. Even if the temperature changed, the reference value stayed the same, and even the second-order derivative of the temperature dependence was zero. Why did nobody think of this earlier?
Next, I went to the session on analog-to-digital converters. Again, amazing circuits that I couldn’t understand. As usual in academic presentations, these circuits were tested with a single sine-wave input, and in the digital output you want to see a faithful digital representation of that sine wave, with little noise and distortion. In that session, again, all papers presented excellent results. I didn’t understand the circuits, but they all used Matlab to produce an almost perfect sine wave.
As a young and naïve student, I may have loved the conference, but today I’m very skeptical about post-processing your results in Matlab, especially without fully disclosing what has been done. Often, authors don’t describe what they actually did, and sometimes they don’t mention the post-processing at all. Yes, the results are better by an order of magnitude, but what’s the point of fooling each other? Maybe next year there will be an analog-to-digital converter with near-zero power dissipation and near-zero area that still makes a digital sine wave with Matlab; all you need is amplitude, frequency, and phase information, right? Matlab can make sine waves; there’s nothing magic there!
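To show just how little magic is involved, here is a minimal sketch (the sampling rate, amplitude, frequency, and phase are made-up values for illustration) of the kind of flawless “measurement” a few lines of Matlab will happily synthesize:

% Hypothetical example: a perfect "measured" sine wave, no chip required
fs  = 1e6;                     % sampling rate (assumed: 1 MHz)
N   = 4096;                    % number of output samples
t   = (0:N-1)/fs;              % time vector
A   = 1.0;                     % amplitude (assumed)
f   = 10e3;                    % signal frequency (assumed: 10 kHz)
phi = 0;                       % phase
y   = A*sin(2*pi*f*t + phi);   % the "digital output": zero noise, zero distortion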
Young academics are under high pressure to get promoted. For that, they need to have their papers accepted at prestigious conferences. Those papers are reviewed by people who have themselves succeeded in publishing papers in the past; there are almost no critical outsiders anymore. We’re now in a positive feedback loop where the numbers keep getting better for the sake of getting papers accepted. Matlab post-processing seems to be the next level.
Conferences like to be prestigious, but if they keep accepting these papers just to showcase the best-looking results, they’ll soon become irrelevant.