
Added a simple particle swarm based hyper-heuristic #125

Open · wants to merge 16 commits into base branch feature-hho

Conversation

@Alan-Robertson commented Mar 29, 2017

This pull request adds the first hyper-heuristic optimisation algorithm; an example follows.

First we define a test function:

```python
import numpy

f_rosenbrock = lambda x: numpy.sum([((x[i+1] - x[i]**2)**2 + (1 - x[i])**2)/len(x) for i in range(len(x)-1)])
```

Next we initialise the hyper-heuristic optimiser, passing it the names of the parameters over which to optimise, and then call it to perform the optimisation:

```python
hh_opt = ParticleSwarmOptimiser(['x', 'y'], fitness_function=f_rosenbrock)
hh_opt()
```

Similarly, this is parallelisable given a map function:

```python
import numpy
import dill  # required by use_dill() below
from ipyparallel import Client

f_rosenbrock = lambda x: numpy.sum([((x[i+1] - x[i]**2)**2 + (1 - x[i])**2)/len(x) for i in range(len(x)-1)])

c = Client(profile='testing_profile')
c[:].use_dill()
dview = c[:]

hh_opt = ParticleSwarmOptimiser(['x', 'y', 'z', 'a'], fitness_function=f_rosenbrock)
hh_opt(serial_map=dview.map_sync)
```
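For readers unfamiliar with particle swarm optimisation, the per-iteration update the optimiser presumably performs can be sketched as follows. This is the textbook PSO rule, not the PR's actual code; the parameter names `omega_v`, `phi_p`, and `phi_g` are taken from the diff below, but everything else here is an assumption:

```python
import numpy as np

def pso_step(positions, velocities, local_attractors, global_attractor,
             omega_v=0.35, phi_p=0.25, phi_g=0.5, rng=np.random):
    """One textbook particle swarm update step (sketch).

    omega_v damps each particle's previous velocity, phi_p pulls it
    toward its own best-seen position (local attractor), and phi_g
    pulls it toward the swarm's best-seen position (global attractor).
    """
    r_p = rng.random_sample(positions.shape)
    r_g = rng.random_sample(positions.shape)
    new_velocities = (omega_v * velocities
                      + phi_p * r_p * (local_attractors - positions)
                      + phi_g * r_g * (global_attractor - positions))
    return positions + new_velocities, new_velocities
```

The random factors `r_p` and `r_g` are what make the swarm explore rather than collapse straight onto the current attractors.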

@coveralls commented Mar 29, 2017

Coverage Status

Coverage decreased (-1.3%) to 70.407% when pulling 82fe4cd on Alan-Robertson:feature-hyper-heuristic-optimisation into 8118ab7 on QInfer:master.

@cgranade (Collaborator) left a comment


Thank you for the submission, @Alan-Robertson! It's looking like a really good feature so far, so thank you again. I've left some comments on some individual issues, but I think it's looking really good. In combination with some tests and some documentation, I think this will be awesome to have included. ♥

@@ -0,0 +1,202 @@

Collaborator


We should make sure the copyright header and -*- coding lines are kept on this file as well.

## CLASSES ####################################################################

__all__ = [
'ParticleSwarmOptimiser'
Collaborator


The performance objective class should also be listed under __all__.

'ParticleSwarmOptimiser'
]

class HyperHeuristicOptimiser(object):
Collaborator


At the level of generality at which this class is written, including HyperHeuristic in the name may be slightly misleading? I think it makes a lot of sense to keep the default choice of function being the performance objective below, but to rename here as a nod to that generality. Perhaps even just Optimiser (or even Optimizer, since the project was started in American English instead of something a bit more common)...

self._funct_kwargs = funct_kwargs

if fitness_function is None: # Default to calling perf test multiple
self._optimisable = PerfTestMultipleAbstractor(
Collaborator


Should this be _optimisable or _fitness_function? I don't see any other references to _optimisable?

def fitness_function(self, params):
return self._fitness_function(params)

def parrallel(self):
Collaborator


I think there's only one "r" in parallel?

omega_v=0.35,
phi_p=0.25,
phi_g=0.5,
serial_map=map
Collaborator


The name serial_map is a bit misleading here, since it's either serial or parallel depending?

self._fitness[itr - 1]["velocities"])

# Apply the boundary conditions if any exist
if self._boundary_map is not None:
Collaborator


The behavior of _boundary_map seems to be similar to Model.canonicalize rather than Models.are_models_valid? If so, that's quite reasonable, but should be clearly indicated in the documentation since it's an unusual way to specify boundaries.
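To illustrate the canonicalisation-style behaviour being described, a hypothetical boundary map might clip wandering particles back into a feasible box rather than rejecting them. This sketch is illustrative only; the bounds and the hook shown in the comment are assumptions, not the PR's actual API:

```python
import numpy as np

def clip_into_bounds(positions, lower=0.0, upper=1.0):
    """Canonicalise particle positions by clipping each coordinate
    back into [lower, upper], rather than marking out-of-bounds
    particles invalid (the Model.canonicalize reading, as opposed
    to Models.are_models_valid)."""
    return np.clip(positions, lower, upper)
```

Under this reading, the optimiser would apply the map after each position update, e.g. something like `positions = self._boundary_map(positions)`.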

fitness = serial_map(self.fitness_function, particles)
return fitness

def update_positions(self, positions, velocities):
Collaborator


These methods are public and should thus have at least some brief docstrings. They probably don't need to be very detailed, but there should be something.

*args,
**kwargs):
self._heuristic = kwargs['heuristic_class']
del kwargs['heuristic_class']
Collaborator


This can cause an exception if heuristic_class isn't defined. That's clearly an error, since this class doesn't make any sense without a heuristic class, but the exception that gets raised isn't immediately obvious. It might make sense to promote heuristic_class to being an explicitly named kwarg so that exceptions raised are more readable.
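A minimal sketch of the suggested pattern: promote `heuristic_class` to a named keyword argument so the failure mode is self-describing. The class name here is hypothetical and the body is illustrative, not the PR's code:

```python
class HeuristicObjective(object):
    """Sketch: accept heuristic_class as an explicit keyword argument,
    so a missing value raises a readable error rather than a bare
    KeyError from kwargs['heuristic_class']."""

    def __init__(self, heuristic_class=None, *args, **kwargs):
        if heuristic_class is None:
            raise ValueError(
                "heuristic_class must be provided: this objective has "
                "no meaning without a heuristic to evaluate."
            )
        self._heuristic = heuristic_class
```

The explicit default of `None` plus an early check turns an opaque lookup failure into an error message that names the missing argument.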

global_attractor = local_attractors[np.argmin(local_attractors["fitness"])]
return local_attractors, global_attractor

class PerfTestMultipleAbstractor:
Collaborator


I think the term Abstractor here doesn't quite communicate what the class is doing. I'd suggest something like PerformanceObjective, perhaps? Also, this should inherit from object so that it's forced to be a new-style class even on Python 2.

@cgranade (Collaborator)

It seems that, unrelated to the PR itself, there are build failures caused by Travis not being properly configured for QuTiP 4.1 (see discussion at #126). I'll work on getting that diagnosed, then, so we can include this PR ASAP.

@cgranade (Collaborator)

When you get a second, can you pull master into this PR to include #127? That should help fix build errors. Thanks!

@coveralls commented Mar 30, 2017

Coverage Status

Coverage decreased (-1.4%) to 70.407% when pulling 0325109 on Alan-Robertson:feature-hyper-heuristic-optimisation into 75f8723 on QInfer:master.

@cgranade (Collaborator)

It looks like pulling in master helped fix the issues with the build, thanks again for that!

@taalexander (Contributor)

I think some documentation on use cases for the hyper-heuristic optimiser would be nice. From this commit alone I am having trouble piecing it together. I assume this is related to the paper recently listed on the arXiv by Alan and Chris, but I have yet to find time to read it.

@cgranade (Collaborator)

I agree entirely; the plan is to add documentation to both the guide and apiref, including descriptions of a few explicit use cases. At the moment, we wanted to start the PR as-is to get initial feedback on the design, since ideally the particle swarm optimizer will be useful beyond the immediate use cases that @Alan-Robertson described in his initial comment.

@coveralls commented Apr 5, 2017

Coverage Status

Coverage decreased (-1.4%) to 70.407% when pulling 60ce342 on Alan-Robertson:feature-hyper-heuristic-optimisation into 75f8723 on QInfer:master.

@coveralls commented Apr 11, 2017

Coverage Status

Coverage decreased (-0.7%) to 71.108% when pulling a2762ec on Alan-Robertson:feature-hyper-heuristic-optimisation into 75f8723 on QInfer:master.

@cgranade cgranade changed the base branch from master to feature-hho May 12, 2017 06:05