Imagine you’re working for a manufacturing firm, and in the lab you’ve developed a new process for making polyamides with desirable properties: a plastic film that’s more transparent. You believe that food packaging companies will pay a little more for this improved material because it will make bacon and other foods look more appealing.

But you have a problem: you can’t be certain that the delicate processes you’ve developed on the small scale in the lab will yield exactly the same results when you try to duplicate them in huge polymerization reactors. It could be an embarrassing waste of time and money to tool up only to find the final product isn’t as strong, light, or economical to produce as your lab experiments suggested. How can you be as certain as possible that, given all of the investment, the final product will perform as you hope? It’s a problem of process systems engineering, and you need a mathematical modeller to solve it efficiently.

Professor Kim McAuley
PREDICTING OUTCOMES: “I’ve been drawn to math all my life, but I always liked math that explains what is going on in the world rather than math for the sake of discovery,” says Professor Kim McAuley, pictured with Master’s student Brad Buren.

“I get involved with companies that make plastics and chemicals at large scale,” says Queen’s chemical engineering professor and Associate Dean, School of Graduate Studies, Kim McAuley. “They want to do some math before they try something new at large scale. They’d like to know ahead of time — if they try to operate at a different temperature, at a different throughput, and/or use different catalysts — that we can predict what’s going to happen and how.”  

Mathematical modelling isn’t just for polyamide films; it’s helpful in all manner of industrial and research applications that require predicting the outcome of untried chemical process conditions from some, but not always enough, experimental data. And it’s not always about plugging new values into tried-and-true models.

“There often aren’t statistical tools for the kinds of modelling problems we encounter again and again,” says McAuley. “My research group and I do fundamental work, funded through my NSERC Discovery Grant, developing statistical techniques for use in our future models and by other modellers who are trying to decide which experiments should be done next, or which parameters they should be estimating.”
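To give a flavour of what that kind of parameter estimation looks like in practice, here is a minimal sketch in Python. The reaction, rate constant, and data are hypothetical, invented purely for illustration; they are not drawn from McAuley’s models.

```python
# A minimal sketch (not McAuley's actual method): estimating the rate
# constant of a hypothetical first-order reaction A -> B from noisy
# concentration data, the kind of fitting step process models rely on.
import numpy as np
from scipy.optimize import curve_fit

def conc_A(t, k):
    # First-order decay of reactant A; initial concentration assumed 1.0 mol/L.
    return 1.0 * np.exp(-k * t)

# Synthetic "experimental" data with measurement noise (illustrative only).
rng = np.random.default_rng(0)
t_data = np.linspace(0.0, 10.0, 15)                        # sampling times (min)
c_data = conc_A(t_data, 0.35) + rng.normal(0.0, 0.02, 15)  # noisy measurements

# Estimate k and its uncertainty from the data.
k_hat, k_cov = curve_fit(conc_A, t_data, c_data, p0=[0.1])
print(f"estimated k = {k_hat[0]:.3f} +/- {np.sqrt(k_cov[0, 0]):.3f} 1/min")
```

Deciding which experiments to run next often comes down to exactly this kind of question: which new measurements would shrink that uncertainty on the estimated parameters the most.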

For example, McAuley and graduate student Brad Buren are helping the research group led by Professor Judit Puskas at the University of Akron with models to predict average lengths and branching levels of arborescent polyisobutylene polymer chains in newly developed materials.

“Judit needs to understand that so she can design processes to make polymers with the targeted properties that she wants for different biomedical applications,” says McAuley.
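As a rough illustration of what “average chain length” means, the sketch below computes the textbook Flory averages for a simple linear step-growth polymer. The real arborescent (tree-like) polyisobutylenes require much richer branching models, so this shows only the simplest possible case.

```python
def flory_averages(p):
    """Flory's most-probable distribution for a linear step-growth polymer:
    number- and weight-average chain lengths at extent of reaction p."""
    dp_n = 1.0 / (1.0 - p)        # number-average degree of polymerization
    dp_w = (1.0 + p) / (1.0 - p)  # weight-average degree of polymerization
    return dp_n, dp_w

for p in (0.90, 0.99, 0.999):
    dp_n, dp_w = flory_averages(p)
    print(f"p = {p}: DPn = {dp_n:.0f}, DPw = {dp_w:.0f}, "
          f"dispersity = {dp_w / dp_n:.2f}")
```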

Then there’s McAuley’s collaboration with medical physicist John Schreiner on gel dosimetry. McAuley’s group has developed models and manufacturing processes for polymer gels that react visibly to the kinds of radiation used in cancer treatment. Knowing precisely how much radiation is being delivered to a patient at locations within and near tumours is critically important. Gel dosimetry is a way for medical physicists and oncologists to know that radiation treatment plans and equipment are working properly by building precise three-dimensional maps of radiological exposure before patients are exposed to any radiation at all.
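One step in such a workflow is calibration: relating a gel’s measured response to known doses, then inverting that relationship to read dose elsewhere. The heavily simplified sketch below assumes a linear response and made-up numbers; it is not data from the Queen’s gels.

```python
# A simplified, hypothetical dosimetry calibration: fit a linear response
# of gel optical density to known doses, then invert it to estimate dose.
import numpy as np

doses = np.array([0.0, 2.0, 4.0, 6.0, 8.0])         # calibration doses (Gy)
response = np.array([0.01, 0.11, 0.20, 0.31, 0.40])  # measured optical density

slope, intercept = np.polyfit(doses, response, 1)    # assumed linear response

def dose_from_response(r):
    """Invert the calibration line to estimate delivered dose in Gy."""
    return (r - intercept) / slope

print(f"response 0.25 -> estimated dose {dose_from_response(0.25):.1f} Gy")
```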

“It’s one of the fun things about being a mathematical modeller,” says McAuley. “You get to know a lot about a wide variety of chemical processes, how they work, and how different researchers conduct measurements. You get to interact with a lot of fun and smart people and you learn a lot about chemical engineering. If you have mathematical modelling skills, you can learn about so many different things.”