Unexpected Temperature Drop During IGBT On-Phase in Boost Converter

Hello everyone,

I’ve recently started exploring the field of power electronics, so I’m still getting familiar with many concepts. While I’ve managed to work through several challenges so far, there’s one issue I just can’t wrap my head around.

I’m working with a boost converter, intending to step up from 10V to 250V. I want to analyze the temperature behavior of the IGBT in this circuit. I’ve chosen a thermal model and implemented an initial cooling concept. However, I’ve noticed something: during the IGBT’s turn-on phase, the temperature decreases.

This seems wrong to me. Based on my setup, I expect—and do observe—significant conduction losses. Yet, when I try different input and output voltage levels, the temperature consistently drops during the turn-on phase. I just can’t figure out why.

As I mentioned, I’m still relatively new to this field and may be missing something fundamental. I’ve attached my circuit design, the thermal models I used, and a scope of the IGBT’s temperature and the losses.

I’d greatly appreciate any insights or guidance you can provide.

Best regards,
Driton

BOOST - PCM.plecs (11.2 KB)
IKQ120N120CS7_igbt.xml (6.3 KB)
E6D40065H.xml (3.0 KB)

Your PLECS model is incomplete: it does not include a Heatsink component, so the losses and temperature cannot be measured. Could you post an updated model?

Your thermal models also seem altered from the vendor models. Both models have only a first-order thermal network, while the vendor-supplied models for these parts have a fourth-order thermal network. Any idea why that is? Essentially 75% (IGBT) and 95% (diode) of the thermal impedance from the datasheet is missing, so more heat flows out of the device than the conduction losses put into it, which results in the temperature reduction. A numerical illustration of this truncation effect is sketched below.
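For illustration, here is a minimal Python sketch comparing the transient thermal impedance Zth(t) of a full fourth-order Foster network against a first-order truncation. The stage values are hypothetical placeholders, not the actual datasheet parameters of these parts:

```python
import math

# Hypothetical 4th-order Foster network: (R in K/W, C in J/K) per stage.
# These are placeholder values, NOT the IKQ120N120CS7 datasheet parameters.
full = [(0.02, 0.05), (0.05, 0.2), (0.08, 1.0), (0.05, 5.0)]
truncated = full[:1]   # first-order model keeping only the fastest stage

def zth(stages, t):
    """Transient thermal impedance Zth(t) = sum Ri * (1 - exp(-t/(Ri*Ci)))."""
    return sum(R * (1.0 - math.exp(-t / (R * C))) for R, C in stages)

for t in (1e-3, 1e-2, 1e-1, 1.0):
    print(f"t = {t:6.3f} s:  Zth(full) = {zth(full, t):.4f} K/W,  "
          f"Zth(truncated) = {zth(truncated, t):.4f} K/W")
```

With these placeholder numbers the truncated network settles at only 10% of the full network's Rth(j-c), so for the same losses the model predicts a much smaller temperature rise, and heat drains away from the junction far faster than the full model would allow.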

I also suspect the case-to-ambient thermal impedance in your model is quite small, which one can confirm with the full model.


Thank you for your feedback!
I just realized that I accidentally uploaded old files—no idea how that happened. Here are the updated files.

You’re absolutely right; this was my mistake. I had been experimenting with the vendor models, and it seems I uploaded the wrong ones, which is why the thermal network configuration is different.

Regarding your observation about the case-to-ambient thermal impedance: yes, it is indeed low in my model. But is this the only reason why I observe a temperature reduction during the on-phase?

Additionally, I conducted further tests:

  • When boosting from 10V to 20V, I observe a temperature rise in the IGBT on-phase.
  • However, if I only change the input voltage to 100V while keeping the duty cycle and load resistance constant, I observe a temperature drop during the on-phase.

Could this be due to the high voltages not being adequately modeled in the IGBT’s thermal representation?

BOOST - PCM.plecs (75.8 KB)
IGQ120N120S7_igbt.xml (6.6 KB)
C5D50065D.xml (3.1 KB)

Thank you for posting the complete models. The temperature decreases because the conduction losses at the operating point you're modeling are quite low compared to the switching losses and the heat flowing to ambient.

Consider the following simple circuit, where Qsw represents the device losses. After a turn-on switching event, Tj has a step increase. If (Tj - Tamb)/Rth > Qsw during the on-state, the temperature will then decrease even though the device is conducting and Qsw due to conduction losses is non-zero.

[Image: simple thermal circuit with loss source Qsw driving the junction node Tj, thermal capacitance Cth, and thermal resistance Rth to the ambient temperature Tamb]
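To make this concrete, here is a minimal forward-Euler sketch of the circuit above. All values are illustrative placeholders rather than numbers from your model, and the switching energies are lumped in as instantaneous steps in Tj:

```python
# Sketch of the first-order thermal network above (all values hypothetical):
#   Cth * dTj/dt = Q(t) - (Tj - Tamb) / Rth

Rth, Cth = 0.5, 0.01      # thermal resistance (K/W) and capacitance (J/K)
Tamb = 25.0               # ambient temperature (degC)
E_sw = 5e-3               # energy per switching event (J)
P_cond = 20.0             # conduction loss during the on-phase (W)
f_sw, duty = 10e3, 0.5    # switching frequency (Hz) and duty cycle
dt = 1e-7                 # forward-Euler time step (s)

steps_per_period = round(1 / (f_sw * dt))
on_steps = round(duty * steps_per_period)

Tj = 90.0                 # start near the periodic steady state
for k in range(5 * steps_per_period):          # simulate five switching periods
    s = k % steps_per_period
    if s == 0 or s == on_steps:                # turn-on / turn-off instants
        Tj += E_sw / Cth                       # switching energy as a step in Tj
    Q = P_cond if s < on_steps else 0.0        # conduction loss only while on
    Tj += dt * (Q - (Tj - Tamb) / Rth) / Cth   # integrate the RC network

# Tj falls during the on-phase whenever the heat flowing to ambient,
# (Tj - Tamb)/Rth, exceeds the conduction loss Q:
print(f"(Tj - Tamb)/Rth = {(Tj - Tamb) / Rth:.0f} W  vs  P_cond = {P_cond:.0f} W")
```

With these numbers the heat flowing to ambient during the on-state is far larger than the conduction loss, so Tj steps up at each switching event and then drifts downward while the device conducts, which is exactly the waveform shape you are seeing.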

I’ll also note that the IGBT thermal model has the “Convert to Cauer At Simulation Start” option unchecked in the IGBT’s Thermal Description tab. One should check this option if the device is not connected to a constant temperature. See this previous forum post discussing the issues with series-connected Foster networks.
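For background, converting a Foster network to the equivalent Cauer ladder is a continued-fraction expansion of the Foster impedance. Below is a small Python sketch of that math, using hypothetical stage values rather than the ones in the vendor XML; it illustrates the idea only and is not PLECS's actual implementation:

```python
import numpy as np
from numpy.polynomial import polynomial as P

def foster_to_cauer(r_foster, c_foster):
    """Convert a Foster thermal network to the equivalent Cauer ladder by
    continued-fraction expansion of Z(s) about s = infinity (Cauer-I form)."""
    # Build Z(s) = sum Ri / (1 + s*Ri*Ci) as numerator/denominator
    # coefficient arrays in ascending powers of s.
    num, den = np.array([0.0]), np.array([1.0])
    for R, C in zip(r_foster, c_foster):
        num = P.polyadd(P.polymul(num, [1.0, R * C]), P.polymul([R], den))
        den = P.polymul(den, [1.0, R * C])
    num, den = P.polytrim(num), P.polytrim(den)

    r_cauer, c_cauer = [], []
    while not np.allclose(num, 0.0):
        num, den = den, num                   # invert: Y = 1/Z
        c_cauer.append(num[-1] / den[-1])     # Y ~ s*Ck as s -> infinity
        num = P.polytrim(P.polysub(num, c_cauer[-1] * np.r_[0.0, den]), tol=1e-12)
        num, den = den, num                   # invert back: Z = 1/Y
        r_cauer.append(num[-1] / den[-1])     # Z(infinity) = Rk
        num = P.polytrim(P.polysub(num, r_cauer[-1] * den), tol=1e-12)
    return r_cauer, c_cauer

# Example with hypothetical 2-stage values (not from the vendor XML). The
# Cauer resistances sum to the same total Rth as the Foster resistances.
r_c, c_c = foster_to_cauer([1.0, 1.0], [1.0, 2.0])
print("Cauer R (K/W):", np.round(r_c, 4))
print("Cauer C (J/K):", np.round(c_c, 4))
```

Unlike the Foster form, the Cauer ladder's internal nodes are physically meaningful, which is why the conversion is needed before connecting the device model in series with a heatsink impedance.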
