Fix bug in topology classes #176
Conversation
Glad you caught this bug, @whzup !
Will a test asserting `min(swarm.pbest_cost) == swarm.best_cost` suffice for all cases? Even if we change the number of neighbors, we want to ensure that the particle with the smallest possible fitness will always be considered.
A short integration test like the one you posted in Pastebin is also a nice idea. We conjure up a small swarm in `conftest.py`, set the random seed, and check if the value will always be the case. I believe pytest or numpy has an `approx()` method that can test equality with tolerance.
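A minimal sketch of such an approx-based check, using a hypothetical stand-in for the conftest swarm (the class and attribute values here are assumptions for illustration, not the project's actual fixtures):

```python
import numpy as np
import pytest


class FakeSwarm:
    """Hypothetical stand-in for the swarm fixture in conftest.py."""
    pbest_cost = np.array([2.0, 3.0, 1.0])
    best_cost = 1.0


def test_best_cost_is_min_pbest():
    swarm = FakeSwarm()
    # The reported global best should equal the smallest personal-best
    # cost, compared with tolerance via pytest.approx.
    assert swarm.best_cost == pytest.approx(np.min(swarm.pbest_cost))


test_best_cost_is_min_pbest()
```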
Also, I might be quite inactive for the next two weeks, until August 4 (conference and moving out). I suggest we do a feature lock on the development branch (no new features, just bug fixes if possible) so that I can keep up with my backlog in #157 and prepare for the v.0.3.0 release by next month. What do you think?
tests/backend/topology/conftest.py (outdated)

```diff
-    "pbest_cost": np.array([1, 2, 3]),
-    "pbest_pos": np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]]),
+    "pbest_cost": np.array([2, 3, 1]),
+    "pbest_pos": np.array([[4, 5, 6], [7, 8, 9], [1, 2, 3]]),
```
So changing this still works as expected, right? Interesting. Glad you caught this bug. 👍
Hey @ljvmiranda921. Yes, I think this test suits our needs. I'm going to add some new tests so we are sure it works as expected.
Yup, people can still write issues. We attend to them if they are bugs, and we relegate them to v.0.4.0 if they are feature requests :)
Also, the boundary conditions PR may come up in a few weeks, so let's wait for that. Thanks a lot Aaron!
Ok, stuff is done 😄 I found another bug in the …
Implementation LGTM. How were you able to make the cost and positions deterministic? Did you set a random seed?
For the tests, if we only have one value, there's no need to turn it into a fixture. Usually we use fixtures for objects, data-holders, etc. This is just a parameter, so we can set it locally inside the tests themselves. 👍 Using `pytest.mark.parametrize` is enough.
```diff
@@ -206,7 +203,7 @@ def __compute_neighbors(self, swarm, k):
         # Generate connected graph.
         while connected_components(adj_matrix, directed=False, return_labels=False) != 1:
             for i, j in itertools.product(range(swarm.n_particles), repeat=2):
-                if dist_matrix[i][j] == 0:
+                if dist_matrix[i][j] == np.inf:
```
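For context, a small sketch of why `np.inf` is the right sentinel here (assuming the distance matrix comes from a shortest-path routine such as SciPy's `dijkstra`): unreachable pairs are reported as `np.inf`, while `0` only appears on the diagonal as a node's distance to itself, so comparing against `0` never finds the missing edges.

```python
import numpy as np
from scipy.sparse.csgraph import connected_components, dijkstra

# Adjacency matrix for three particles where particle 2 is disconnected.
adj_matrix = np.array([[0, 1, 0],
                       [1, 0, 0],
                       [0, 0, 0]])

# Shortest-path distances: unreachable pairs become np.inf,
# and each node's distance to itself is 0.
dist_matrix = dijkstra(adj_matrix, directed=False, unweighted=True)

n_components = connected_components(adj_matrix, directed=False,
                                    return_labels=False)
# n_components > 1 here, and the missing edges show up in dist_matrix
# as np.inf, not as 0.
```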
This is the bug in the `Random` class, right? 👍
That's right 👍
tests/backend/topology/conftest.py (outdated)

```python
            [9.99959923e-01, -5.32665972e-03, -1.53685870e-02]]),
        "best_cost": 1.0002528364353296,
        "best_pos": np.array([9.90438476e-01, 2.50379538e-03, 1.87405987e-05]),
        "options": {'c1': 0.5, 'c2': 0.3, 'w': 0.9},
    }
    return Swarm(**attrs_at_t)


@pytest.fixture(scope="module")
def k():
    """Default neighbor number"""
```
If this is just one value, no need to turn this into a fixture. If there will be multiple values in the future, just use `pytest.mark.parametrize`. 👍
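A minimal sketch of that suggestion (the test body here is illustrative, not the project's actual test):

```python
import pytest


@pytest.mark.parametrize("k", [1, 2, 3])
def test_neighbor_number(k):
    # Each parametrized case receives its own value of k directly,
    # so no module-scoped fixture is needed.
    assert k >= 1
```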
@ljvmiranda921 I actually just optimized a function with a …
Resolves #170. Fixed a bug in the topology classes where the best values were not used as the result when optimizing with the `LocalBest` class. Made the tests for the topologies less trivial by changing the index of the best cost and best position in the fixture swarm, and added a bigger fixture swarm to test the topology classes. Additionally, fixed a small bug in the `Random` topology where elements were compared to 0 instead of `np.inf`. Deleted the fixture for the neighbour number and replaced it with `pytest.mark.parametrize()`.
Description
Fixed a bug in the topology class where the best values were not used as the result when optimizing with the `LocalBest` class. Before, the `best_neighbor` was calculated as the particle that neighbours the best particle, not the best particle found. In the `Ring` topology, the special case for `k=1` used the wrong axis for `np.argmin()`. Made the tests for the topologies less trivial by changing the index of the best cost and best position in the fixture swarm.
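To illustrate the `k=1` axis issue in general terms (this is a hypothetical sketch, not the library's actual code): with a matrix where row `i` holds the costs of particle `i`'s neighbors, `np.argmin` must run along `axis=1` to pick each particle's best neighbor; the wrong axis silently returns an array with a different shape and meaning.

```python
import numpy as np

# Hypothetical neighbor-cost matrix: row i holds the pbest costs of
# particle i's neighbors (3 particles, 2 neighbors each).
neighbor_costs = np.array([[0.9, 0.2],
                           [0.5, 0.7],
                           [0.1, 0.8]])

# Correct: argmin along axis=1 gives one index per particle,
# identifying that particle's best neighbor.
best_per_particle = np.argmin(neighbor_costs, axis=1)

# Wrong axis (axis=0) instead picks the best row per column --
# an array of the wrong length that answers a different question.
wrong = np.argmin(neighbor_costs, axis=0)
```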
Related Issue
See also #170
Motivation and Context
The `LocalBest` optimizer did not return the best result found.
How Has This Been Tested?
I added some debugging prints to the code and analyzed the output of the different topologies.
Types of changes
Checklist: