Replace multiprocessing with multiprocess #480
Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward? This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
I'm interested in this. Is there a quick way to implement it?
@chrisotoro sorry, I'm pretty busy right now and have had no time to look at it so far. As far as I remember, I did exactly what I mentioned above. If you give me a day, I can look it up.
Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward? This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
Hello everyone,
I just started to use pyswarms, thanks a lot for the nice package. While exploring its features I came across a minor issue.
Is your feature request related to a problem? Please describe.
I'm mainly using Jupyter notebooks (conda Python 3.8, Windows 10), but when running the optimizer with multiple processes, the notebook is busy but does nothing, and the kernel can't be interrupted. An error is thrown in the console from which the notebook was started:

`AttributeError: Can't get attribute 'f' on <module '__main__' (built-in)>`

As it turns out, this is a known problem with `multiprocessing` and is not related to pyswarms itself.

Describe the solution you'd like
Run parallelized optimization from Jupyter notebooks. According to the Stack Exchange post, one solution is to replace `multiprocessing` with the fork `multiprocess`. I've tested it with the example from the multiprocessing documentation and it works out of the box. I see no reason why it should not work with pyswarms.
Additional context

Just replacing `multiprocessing` with `multiprocess` should not be too difficult and, if there is interest, I would provide a PR. What do you think?

Thanks!