
Mark Schmidt

...and other advances in thermal modeling, from our humble beginnings with slide rules to new curved elements developed specifically for thermal analysis.

In days of yore, the world was flat. You might think that we have to go all the way back to the days of Columbus or even ancient Greece to find people who believed that, but the truth is they exist today and have their very own website. If you have a few minutes to read the FAQ for the Flat Earth Society, I highly recommend it.

Map of Flat Earth from 1893

So what does the flat Earth have to do with thermal analysis? Until recently, thermal analysis was also flat. Our history began with Finite Difference Analysis (FDA) in the 1960s. A rectangle or brick could be made with just a few nodes, creating a complete model as long as there were no holes, bends, cut-off corners, rounds, or any anisotropic materials that didn’t conveniently align with the rectangle or brick. Obviously, modeling this way required sweeping approximations, and I can only assume we did not completely disclose those approximations to the astronauts who sat on gigantic fuel tanks for a lift into space where they could see this round Earth. Engineers with slide rules said it was alright, and so they went.

You may remember various physics and engineering homework problems that involved spheres, cylinders, and wonderfully flat objects. Though it may take some time, you could solve those problems with a pencil. You would see conduction modeled with simple terms such as “kA/L” or “2πk/ln(R2/R1)”. Those terms sum up the limits of FDA. This was an era when a thousand-node model was considered huge. There is absolutely nothing wrong with FDA if the geometry can be reasonably represented with these simple shapes, so these methods are available in Thermal Desktop. Rarely are we so fortunate, though, or so willing to make the necessary assumptions and build a model from scratch when CAD drawings are now the starting point.
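To give a sense of just how hand-calculable those conductance terms are, here is a minimal sketch of the two quoted above, the plane wall and the cylindrical shell (the material and dimensions are made up purely for illustration):

```python
import math

# Illustrative values only: roughly aluminum, arbitrary dimensions
k = 167.0          # thermal conductivity, W/m-K

# Plane wall: G = k*A/L
A, L = 0.01, 0.005                               # face area (m^2), thickness (m)
G_slab = k * A / L                               # W/K between the two face nodes

# Cylindrical shell, per unit length: G = 2*pi*k / ln(R2/R1)
R1, R2 = 0.01, 0.02                              # inner and outer radii (m)
G_cyl = 2.0 * math.pi * k / math.log(R2 / R1)    # W/K per meter of length

print(f"plane wall conductance:  {G_slab:.0f} W/K")
print(f"cylinder conductance:    {G_cyl:.0f} W/K per meter")
```

That is the whole calculation; the difficulty was never the math, it was forcing real hardware into shapes the math could describe.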

In the 1980s, structural engineers managed to develop Finite Element Analysis (FEA). This was a great advance for them, and it would eventually be a great advance for thermal engineers, but it caused us some trouble at first. These troubles began in the 1990s when mid-level managers took on the identity of disappointed mothers at Thanksgiving… looking to the thermal team and asking “Why can’t you be more like your brother?” To them, if the structural FEA can solve for temperature, then there’s no reason to waste money on specialized tools for thermal engineers (or even on the thermal engineers themselves).

Despite the wishes of these managers, it’s not simply a matter of adding thermal equations to structural elements. We tried, of course, but large facets did not capture the geometry, and adding more facets meant adding nodes in thermally uninteresting places that bogged down the solution. This was a time when the world was just getting to know the word “Pentium,” and we couldn’t wait as long for system-level transient runs to complete as we did for Deep Thought to calculate the answer to Life, the Universe, and Everything. The only option was to use coarse elements that looked wrong even to the untrained eye (i.e., the managers). Even for parts where the geometry was not distorted, finite element networks just looked wrong to a generation of thermal engineers who were accustomed to finite difference and lumped parameter analyses. 

Pentium and Douglas Adams's Deep Thought

At the time, many engineers came to the conclusion that FEA must be ill-suited for thermal analysis, and a battle ensued within the ranks. You’d think that showing FEA was accurate would end the debate, but victory was delayed many years. Part of the delay was caused by the fact that FEA conduction matrices appeared counterintuitive compared to hand-calculable FDA matrices. The appearance of completely correct negative terms in these matrices alarmed thermal engineers, which is ironic, because negative terms become even more likely when oversized and skewed elements are used in a desperate attempt to run system-level transients.
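To see where those alarming-but-correct negative terms come from, here is a small sketch using the standard linear-triangle conduction formulation (the coordinates are made up for illustration). It converts an element’s conduction matrix into the equivalent node-to-node conductors a network-minded thermal engineer would look for: a well-shaped triangle gives the familiar all-positive conductors, while a badly skewed (obtuse) triangle produces a negative one.

```python
import numpy as np

def triangle_conductances(xy, k=1.0, thickness=1.0):
    """Node-to-node conductances implied by a linear triangular
    conduction element (standard 2D formulation). The negated
    off-diagonal matrix terms are the equivalent network conductors."""
    x, y = xy[:, 0], xy[:, 1]
    # Shape-function gradient coefficients for a linear triangle
    b = np.array([y[1] - y[2], y[2] - y[0], y[0] - y[1]])
    c = np.array([x[2] - x[1], x[0] - x[2], x[1] - x[0]])
    area = 0.5 * abs(x[0] * b[0] + x[1] * b[1] + x[2] * b[2])
    K = k * thickness / (4.0 * area) * (np.outer(b, b) + np.outer(c, c))
    return {(i, j): -K[i, j] for i in range(3) for j in range(i + 1, 3)}

# Well-shaped triangle: all three conductors come out positive
print(triangle_conductances(np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.9]])))

# Badly skewed (obtuse) triangle: the conductor between nodes 0 and 1
# goes negative -- mathematically correct, but alarming in network terms
print(triangle_conductances(np.array([[0.0, 0.0], [1.0, 0.0], [0.9, 0.05]])))
```

The negative conductor is not an error; it is simply what the element math requires, and the assembled system still solves correctly.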

Thermal Desktop uniquely offered both finite element and finite difference analysis so engineers could use FDA when the geometry was suitable and FEA when parts got more complicated. Even so, those convinced that FEA was wrong pushed the limits of FDA beyond reason, which helped no one.

For the most part, the battle finally ended around the turn of the millennium. Now it’s understood that both FEA and FDA are mathematically accurate for thermal analysis… accurate in the sense that the physics of the model are solved correctly, but differences between the actual geometry and the model can still cause problems. An eight-node FE wheel model will be a mathematically correct stop sign.

As older thermal folks embraced FEA, and as younger engineers trained in FEA in college couldn’t believe there had ever been a debate, it became clear that thermal was a secondary consideration for tool developers, behind structural and CFD. We simply aren’t that big of a market compared to the others.

Except at CRTech: thermal analysis is our only market. A few years ago we decided that thermal engineering had such unique modeling needs that using hand-me-downs from bigger disciplines was causing problems. We embarked on a quest to develop a type of finite element specifically for the needs of thermal analysis. We originally called them “thermal elements,” but we found the best name for something new is whatever clearly identifies it, so now we just call them “curved elements.”

A propeller with different modeling elements

So why are curved elements ideal for thermal analysis? Because they separate the fidelity to the geometry from the resolution of the thermal solution. We need accurate reflections for radiation, accurate surface areas for contact and convection surfaces, and accurate volumes for thermal capacitance. But we usually don’t need to calculate the temperature at dozens of locations around a curve. Simply put, when complex geometry includes curves, curved elements allow us to gain the accuracy we need without increasing the node count.
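As a rough illustration of what faceting costs (this is back-of-the-envelope geometry, not anything specific to Thermal Desktop), consider how much lateral surface area a cylinder loses when its circumference is approximated by N flat facets:

```python
import math

R, L = 0.05, 0.30                      # cylinder radius and length, m (illustrative)
true_area = 2.0 * math.pi * R * L      # lateral surface area of the real cylinder

for n_facets in (4, 8, 16, 32, 64):
    # Each flat facet spans an arc of 2*pi/n, so its width is the chord length
    chord = 2.0 * R * math.sin(math.pi / n_facets)
    faceted_area = n_facets * chord * L
    error = 100.0 * (1.0 - faceted_area / true_area)
    print(f"{n_facets:3d} facets: area low by {error:5.2f}%")
```

Getting the area (and therefore the radiating, convecting, and contacting surface) right with flat facets means carrying far more nodes around the circumference than the temperature field itself needs; a curved element captures the true area and volume without those extra nodes.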

Maybe in the distant future computers will be able to quickly handle hundreds of millions of nodes (or the distant past for Deep Thought), but for the here and now, we want to allocate our resources to something more useful. Fewer nodes mean faster models, and so we can spend more time on what-if scenarios, parametric sweeps, and those beloved PowerPoint slides.

We thermal engineers have come a long way since slide rules put people into orbit above a round planet. Back then, we over-engineered parts to cover for the weaknesses in our analysis techniques, but those days are over. With technology battling for the last sliver of efficiency and shrinking to fit into pockets, we will play a key role in the technology of the future… so long as we don’t cling to the ideas of the past as does the Flat Earth Society.