
The Hardware Testing Divide

November 23, 2020

I was recently talking with a colleague, and he brought up a topic I've been thinking about a lot lately because of Lager. He said (and I'm paraphrasing) that unlike with software/web development, in hardware you have two distinct classes of testing, each with its own set of tools and its own engineering culture.

One class of testing covers product development. This is where engineers talk a lot about unit, system, and QA tests. The tools used to run these tests might include debug probes, serial dongles, Python scripts, one-off test firmware, etc. The purpose of these tests is to make sure the product reliably does what the specification, or PRD, or CEO's latest email says it should do. If you're lucky enough to be on a team that realizes the value of continuous integration and has set up an end-to-end CI pipeline that includes device hardware, then these tests are also invaluable in catching regressions as new features are added.
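As a rough illustration only (the post doesn't prescribe any particular stack), a dev-side device test written as one of those Python scripts might look something like this, assuming a hypothetical board that answers simple text commands over a UART and using pyserial plus pytest. The port name, baud rate, and commands are made up for the sketch:

```python
# Minimal sketch of a dev-side device test. The serial port, baud rate,
# and command protocol ("version?", "selftest") are hypothetical.
import serial  # pyserial
import pytest


@pytest.fixture
def dut():
    # Open the device-under-test's serial console (port/baud are assumptions).
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=2) as port:
        yield port


def test_reports_expected_firmware_version(dut):
    dut.write(b"version?\n")
    assert dut.readline().strip() == b"1.4.2"


def test_self_test_passes(dut):
    dut.write(b"selftest\n")
    assert dut.readline().strip() == b"PASS"
```

Wire a few tests like these into a CI job that flashes real hardware and you get the regression-catching benefit described above.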

The second class of testing covers factory new product introduction (NPI) and everything after it (i.e., EVT, DVT, PVT, MP). The purpose of these tests is to catch units that were assembled incorrectly or that contain defective components. Typically these tests are created by dedicated manufacturing engineers, who bring their own, completely different toolchain than what the dev/product team uses (e.g., an expensive test fixture, a Windows 7/10 machine, LabVIEW, optical sensors, etc.).
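To make the contrast concrete, here is a minimal sketch of what one of those factory functional checks could look like if it were expressed in the same scripting stack as the dev tests (pyvisa talking SCPI to a bench DMM). The instrument address, rail name, and pass/fail limits are assumptions for the sketch, not anything from an actual line:

```python
# Sketch of a station-style functional test: probe a power rail through a
# bench DMM and fail the unit if it is out of tolerance.
import pyvisa

RAIL_LIMITS_V = {"3V3": (3.20, 3.40)}  # assumed pass/fail window per rail


def measure_rail(dmm, rail):
    # A real fixture would switch a relay mux to the named rail first;
    # here we simply ask the DMM for a DC voltage reading.
    return float(dmm.query("MEAS:VOLT:DC?"))


def run_functional_test(serial_number):
    rm = pyvisa.ResourceManager()
    dmm = rm.open_resource("USB0::0x2A8D::0x1301::MY_DMM::INSTR")  # assumed address
    results = {}
    for rail, (lo, hi) in RAIL_LIMITS_V.items():
        volts = measure_rail(dmm, rail)
        results[rail] = lo <= volts <= hi
    passed = all(results.values())
    print(f"{serial_number}: {'PASS' if passed else 'FAIL'} {results}")
    return passed
```

In practice, of course, this logic usually ends up living in a LabVIEW VI on a fixture PC instead, which is exactly the divide the rest of this post is about.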

What I've just described for hardware is in stark contrast with how web products are developed and deployed. Well, I take that back: the first part roughly lines up between hardware and web development; it's the second part that is so foreign to the web world. As a web product goes through the traditional gates of dev → QA → staging → production, there is no line drawn where on one side you "do this kind of testing and use these tools" and on the other you "do a completely different kind of testing and use a whole other set of tools," with the two sides completely divorced from one another.

And yes, deploying code to a million servers/users is very, very different from manufacturing a million physical devices that each contain 1,000 (or more) different components. I 100% appreciate that difference.

But wouldn't it be cool if hardware development were more like web development? Think about the cost savings if the testing platform created for product development were identical to the one used on the manufacturing line. All of a sudden your manufacturing team can communicate clearly with the engineering team as issues come up. Or maybe you're a small startup where engineers are forced to wear multiple hats: switching from development to manufacturing and back again becomes frictionless because the tools and platforms are identical, making context switching manageable.
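One hedged sketch of what "identical tools on both sides" could mean in practice (all names here are hypothetical, not an existing platform's API): the bench engineer and the line operator run the same check, and only the station configuration differs.

```python
# One test definition, two contexts: the engineering bench and the factory
# fixture share the same check; only the serial port selection changes.
import serial  # pyserial


def check_firmware_version(port, expected=b"1.4.2"):
    port.write(b"version?\n")
    return port.readline().strip() == expected


def run(station):
    # "bench" might be an engineer's USB adapter; "factory" the fixture's port.
    port_name = {"bench": "/dev/ttyUSB0", "factory": "/dev/ttyFIXTURE0"}[station]
    with serial.Serial(port_name, 115200, timeout=2) as port:
        ok = check_firmware_version(port)
    print(f"[{station}] firmware check: {'PASS' if ok else 'FAIL'}")
    return ok
```

When a unit fails on the line, the manufacturing team can point the engineering team at the exact same test they already run at their desks.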

So how does something like this come about? The first step would be to develop a testing platform powerful enough to handle both classes of hardware testing. The second would be to make it flexible yet intuitive, so you don't end up creating a monster that no one can, or wants to, use. The goal is a platform a team can grow with as its needs grow: starting with benchtop prototypes, progressing through functional prototypes, and finally moving to NPI, an ideal platform would handle each case. There will always be specialization in hardware. But just as email and then Slack simplified whole classes of communication, the right testing platform can streamline hardware development and manufacturing.

Teams that are able to build such a platform will have an advantage over teams that don't, because they'll be able to ramp up production more quickly and maximize their yields faster. Hardware is capital intensive, and any new product has an expiration date before a newer technology makes it obsolete; together, those facts mean that getting to market fast is the difference between a successful company and one that fails.

This is the problem Lager is trying to solve. We believe that as hardware becomes more sophisticated, and as a result more complicated, we'll need testing platforms that are robust, easy to use, and flexible enough to adapt to these new requirements.