Models and Malarkey


Every chemical engineer loves a good computational model. You have a reaction stream, you add some stuff, and would you look at that, the model says there'll be a minor exotherm. You'd better cool it down a bit before adding the stuff, and be sure to add it slowly. That'll add some capex and time costs, but that's what the budget is for.

Models are great. But they usually need verification in the real world.

I've been called a radical empiricist for this view (not that kind, or that kind). But I don't think it's unreasonable. Calculated models are not the answer. They're the question.

This came up at work recently with a predicted exotherm on mixing two solutions. The modeling software spat out a nearly 20 °C exotherm, which was ridiculous given the components. The modelers asked for help, I measured it in the lab, and it was actually a 2 °C endotherm in a flask. The hunt for software bugs begins.
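The sanity check here is just a heat balance: divide the heat of mixing by the batch's mass and heat capacity to get the worst-case (adiabatic) temperature change, and see if the model's number is even in the right ballpark. A minimal sketch, in Python with made-up numbers; `adiabatic_delta_T` is my own helper, not a function from any modeling package:

```python
def adiabatic_delta_T(heat_of_mixing_kj, total_mass_kg, cp_kj_per_kg_k):
    """Temperature change if none of the mixing heat escapes.

    Sign convention: exothermic mixing has negative heat_of_mixing,
    which gives a positive (warming) delta T.
    """
    return -heat_of_mixing_kj / (total_mass_kg * cp_kj_per_kg_k)

# Hypothetical batch: 10 kg of roughly water-like material,
# Cp ~ 4.18 kJ/(kg·K), mixing releases 100 kJ of heat.
delta_t = adiabatic_delta_T(-100.0, 10.0, 4.18)
print(f"{delta_t:.1f} K rise")  # about a 2.4 K rise
```

If the software claims 20 °C and the back-of-envelope number says a couple of degrees, one of them is wrong, and it's a lot cheaper to find out which before the equipment gets sized.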

But a lot of times, when it's more subtle, that kind of result is overlooked or ignored. At best, you end up over-provisioning your equipment and wasting your budget. Or maybe your timeline gets pushed back because you have to source a beefier part. But at worst, well, kaboom.

It's the same problem as with organic reaction mechanisms. Nature is complex. We can't yet accurately simulate every bit of the universe down to the boson and quark or whatever. So we compromise and make models. Usually they're pretty good. Usually.

So verify your models. Maybe it's my bias as a process chemist, but at the end of the day, it's my job to make sure product gets into the drum. When the equipment is designed and built wrong because the model said it was fine, it makes that job a lot harder.