The Moving Average block in PLECS gives the wrong mean value for a time-domain current signal, whereas the mean function in the display window (i.e., the waveform viewing window) gives the correct value. An example of this is attached below. How can we resolve this discrepancy?
Can you include your model?
Have you made sure that your averaging period is the same between both approaches?
Including the model is not feasible for confidentiality reasons.
The averaging period for the moving average is 10,000 cycles, which is the same period I used to calculate the mean of the signal in the graph window.
That’s not clear from your scope image, where the time delta is shown as 0 s. Make sure that the cursors are set to the last 10,000 cycles of the scope data window so that it covers the same data as is shown in the Display block (assuming that was viewed at the end of the simulation).
I think I am measuring it correctly, just as you said, but the value is still different. The Moving Average block asks for an averaging time interval, which I specified as a large enough window, i.e. 1e-5 s, while the frequency of the signal is 5 MHz, so the window spans many cycles. If the window is large enough, the mean should evaluate to the same value (and I am measuring it towards the end of the simulation, after the transient dies out), but I still see a difference between the two ways of calculating it: the Moving Average block gives a value of 0.1650, whereas the scope’s mean function gives 0.159, which is what I expect.
And if you use a moving average interval of 1/5e6s what do you get for the average?
Same 0.1650, even for one cycle, whereas the mean around steady state in the waveform display window gives 0.159313 (which seems to be the correct value).
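As a side note on averaging windows for periodic signals, the following minimal NumPy sketch (with made-up numbers, not taken from the actual model) shows why a window spanning an integer number of cycles should recover the true mean exactly, while a fractional-cycle window biases the result:

```python
import numpy as np

# Hypothetical stand-in for the measured signal: a 5 MHz periodic waveform
# with a DC offset of 0.159 (the "true" mean we want to recover).
f = 5e6            # signal frequency, Hz
T = 1.0 / f        # one period, 2e-7 s
fs = 1e9           # sample rate used for this numerical sketch
t = np.arange(0, 200 * T, 1 / fs)
x = 0.159 + 0.5 * np.sin(2 * np.pi * f * t)   # mean is 0.159 by construction

def trailing_mean(sig, n):
    """Mean of the last n samples (a simple boxcar moving average)."""
    return sig[-n:].mean()

n_cycle = int(round(T * fs))                       # samples per cycle
exact = trailing_mean(x, n_cycle)                  # window = exactly 1 period
biased = trailing_mean(x, n_cycle + n_cycle // 4)  # window = 1.25 periods

# 'exact' lands on ~0.159; 'biased' is offset because the extra
# quarter-cycle of the sinusoid does not average to zero.
print(exact, biased)
```

This is only about window length relative to the signal period; it does not by itself explain a mismatch between two measurements that both use full-cycle windows.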
Hi again, perhaps there is a solver-related influence on the averaging calculation, but it is hard to diagnose this without a basic working example. Are you able to strip out the circuit and just demonstrate the core issue you are seeing and include that here? In the meantime, I am attaching a simple example for you that shows that changing the Moving Average averaging period can greatly influence the result. But with the current setup the results match:
However, if I change the averaging interval to ten times the switching period (0.01 s), you can see that the average is not more accurate in this case, but less:
Let me know if this provides any clarity.
Hello Kris,
I think in the example you shared, as you increase the averaging time to 0.01 s (Measured Current: 0.0163), the reading actually gets more accurate, because the mean is expected to be around 0: the voltage source is sinusoidal and you have added white noise with zero mean, so the average current should also have a mean of 0, which is measured more accurately with a larger averaging period. Moreover, the scope data you are measuring covers a very small interval, which will give a wrong value; if you expand the cursors you will see a near-zero value in the scope as well.
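The point about zero-mean noise averaging out over a longer window can be sketched numerically (hypothetical sample counts, not taken from the shared model); the standard error of the mean shrinks as 1/sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, size=2_000_000)  # zero-mean white noise

# Estimate the mean with two different trailing windows.
short_window = noise[-100:].mean()        # few samples: noisy estimate
long_window = noise[-1_000_000:].mean()   # many samples: estimate near 0

# The long window's estimate is far tighter around the true mean of 0,
# since its standard error is sqrt(100/1_000_000) = 1% of the short one's.
print(abs(short_window), abs(long_window))
```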
That is why I have used a larger averaging time in my simulations as well, for a more accurate evaluation, which I am still not seeing.
Also, if it's a solver-related issue, do you know what settings might result in a more accurate solution?
Okay, when I change the solver's relative tolerance setting to 1 ns (previously set to auto), the average values in the waveform display and the Moving Average block in PLECS match. Thanks!! Kris
Yes, what you said makes sense - but the averaging interval has to make sense based on what data window you are interested in. Anyway, I’m glad the solver setting helped, that sounds very plausible. Cheers.
This question was solved in the above comment.