Intrinsic one sample delay in triggered subsystem?

Hello everyone, I just want to ask something about triggered subsystems.

I use this kind of subsystem to simulate a board that works with a discrete cycle time.

Let's suppose a simple, empty triggered subsystem: single input, single output.

If I watch the signals, input and output are exactly the same. How is that possible?

Let me explain: in every real board there is always at least one sample of delay, so the output should be shifted by at least one sample compared to the input. Am I wrong?

How can I account for this delay? Just with a Zero-Order Hold, or is there a specific block to achieve this?

I hope I have made myself clear.

Thank you!

Hi,

Computation time is not modelled intrinsically in PLECS. Any mathematical operation in a simulation step is instantaneous; e.g. the output of a Sine block becomes the sine of its input in the same simulation step.

A triggered subsystem behaves exactly the same - it may just not be executed in every simulation step.

If you want to model the delay introduced by real systems (e.g. a computation delay), you should use a Delay block.
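To illustrate the effect of such a one-sample delay, here is a minimal sketch in plain Python (not PLECS): the delay holds the previous input in a state variable, so each output lags the input by exactly one trigger event. The function name and initial condition are illustrative assumptions.

```python
def simulate(inputs):
    """Return the input sequence delayed by one sample (one trigger event)."""
    delayed = []
    state = 0.0  # initial condition of the delay, before the first trigger
    for u in inputs:
        delayed.append(state)  # output the value stored at the previous trigger
        state = u              # store the current input for the next trigger
    return delayed

u = [1.0, 2.0, 3.0, 4.0]
print(simulate(u))  # [0.0, 1.0, 2.0, 3.0] -> shifted by one sample
```

Without the state variable the output would simply equal the input in the same step, which is exactly the behaviour of an empty triggered subsystem described above.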

As an alternative (if the computational delay is very small compared to the trigger period) you could use a Pulse Delay block behind the triggered subsystem.

Kind regards,

Oliver Schwartz