Welcome to the PLECS User Forum, where you can ask questions and receive answers from other members of the community.


Run multiple simulations in parallel using XML-RPC

+1 vote


After some trouble, I can now use the XML-RPC interface to open and run PLECS simulations from an external tool. But it seems I can only run one simulation at a time: set one parameter, run, wait for the result, modify the parameter, run again, wait for the result, ...
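For reference, the sequential loop described above might look like the following from Python, assuming PLECS is listening on its default XML-RPC port (1080) and exposes the `plecs.load` and `plecs.simulate` methods from the PLECS manual; the model name `my_model` and the variable `R_load` are placeholders, not names from the question.

```python
import xmlrpc.client

def make_opts(value):
    """Build the option structure overriding one model variable.
    The variable name 'R_load' is a placeholder for illustration."""
    return {"ModelVars": {"R_load": value}}

def sweep(server_url="http://localhost:1080", model="my_model"):
    """One simulation at a time: set a parameter, run, wait, repeat."""
    server = xmlrpc.client.ServerProxy(server_url)
    server.plecs.load(model + ".plecs")
    results = []
    for r in [1.0, 2.0, 5.0]:
        # Each call blocks until that simulation has finished.
        results.append(server.plecs.simulate(model, make_opts(r)))
    return results
```

Because each `simulate` call blocks, the total time grows linearly with the number of parameter values, which is exactly the limitation the question is about.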

Is it possible to start multiple simulations in parallel on a multicore system? I would need to start PLECS with command-line parameters like -multipleinstance -rpcport 1081 -nogui

Is this possible?

asked Apr 23, 2019 by msta (13 points)

2 Answers

+1 vote


Unfortunately, this is not currently possible.

EDITED: Parallel simulation is now available natively in PLECS Standalone. See the "PLECS: Buck Converter with Parameter Sweep" demo model.

Thank you,


answered Apr 23, 2019 by Kris Eberle (1,288 points)
edited Apr 27 by Kris Eberle
Well, I think one could work around this by using a different user for each process. But that would require too much extra effort for my project.

So I would like to see such a feature soon :-)

One more idea: the float-to-string/string-to-float conversions and the XML parsing of the XML-RPC response cost a significant percentage of CPU time. For transferring large result data arrays, a memory-mapped buffer (shm_open+mmap on Linux, OpenFileMapping+MapViewOfFile on Windows) would be nice. The XML-RPC response could then deliver just the name and size of this file.
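To illustrate the idea (this is not an existing PLECS feature, just a sketch of the proposal): the result array is written as raw doubles to a file, and the consumer maps it with Python's `mmap` instead of parsing an XML response. The file name and layout here are assumptions for the example.

```python
import mmap
import os
import struct
import tempfile

def write_results(path, values):
    """Producer side: store an array of doubles in a file that the
    consumer can map, avoiding any float-to-string round trip."""
    with open(path, "wb") as f:
        f.write(struct.pack("<%dd" % len(values), *values))

def read_results(path):
    """Consumer side: memory-map the file and unpack the doubles
    directly from the mapped bytes."""
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
            n = len(m) // 8  # 8 bytes per little-endian double
            return list(struct.unpack("<%dd" % n, m[:]))

# The XML-RPC response would then carry only `path` and the count.
path = os.path.join(tempfile.gettempdir(), "plecs_results.bin")
write_results(path, [1.0, 2.5, -3.0])
data = read_results(path)
```

On Linux the file could live in /dev/shm to stay purely in memory; the Windows equivalent would use a named file mapping, as the comment suggests.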
What happens if you run multiple model files with different filenames? Can they be run in parallel?
+1 vote

Hi, I wrote a workaround in Python to do that; you can find it on my GitHub.
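For readers looking for the general shape of such a workaround, here is a hedged sketch (the actual script on GitHub may differ): it assumes several PLECS instances have already been started on different RPC ports, e.g. with the -rpcport option the question mentions, and distributes the parameter values across them. The model name `my_model` and the variable `R_load` are placeholders.

```python
import xmlrpc.client
from concurrent.futures import ThreadPoolExecutor

def assign(ports, values):
    """Round-robin: pair each parameter value with one of the ports."""
    return [(ports[i % len(ports)], v) for i, v in enumerate(values)]

def run_one(port, value, model="my_model"):
    """Each worker talks to its own PLECS instance, so runs overlap."""
    server = xmlrpc.client.ServerProxy("http://localhost:%d" % port)
    return server.plecs.simulate(model, {"ModelVars": {"R_load": value}})

def run_parallel(ports, values):
    """Run one blocking simulation per instance at a time, in parallel."""
    jobs = assign(ports, values)
    with ThreadPoolExecutor(max_workers=len(ports)) as pool:
        return list(pool.map(lambda pv: run_one(*pv), jobs))
```

With two instances on ports 1081 and 1082, `run_parallel([1081, 1082], sweep_values)` would keep both cores busy while each individual RPC call still blocks its own thread.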

answered Oct 31, 2019 by tinivella (106 points)