
Discrete Parameters #186

Open
philippkraft opened this issue Sep 25, 2018 · 2 comments

Comments

@philippkraft
Collaborator

All parameters are currently sampled from continuous distributions, but not all parameters are on a cardinal scale. To sample from nominal and ordinal scales, we should provide a discrete parameter type.

  • The simplest form would be uniform sampling of integer values using the np.random.randint function
  • The next feature would be nominal-scale values, which are not necessarily numbers (eg. names) but still have equal probability; this is also simple to implement using a "value" list
  • A bit more complicated would be to assign a probability to each of the discrete values and have the sampler respect those probabilities.

This is needed to keep the full functionality of the DDS algorithm #185, but it would also open the door for other samplers that respect the parameter distribution.
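The three options above could be sketched with plain NumPy; the category names and probabilities below are purely illustrative, not part of any proposal:

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # seeded for reproducibility

# 1. Uniform sampling from integer values (ordinal scale)
ints = rng.integers(low=0, high=5, size=10)

# 2. Nominal-scale values with equal probability, drawn from a "value" list
values = ["clay", "silt", "sand"]
nominal = rng.choice(values, size=10)

# 3. Nominal-scale values with explicitly assigned probabilities
weighted = rng.choice(values, size=10, p=[0.2, 0.3, 0.5])
```

All three cases reduce to `numpy.random` calls, so the main work in a parameter class would be wrapping them behind the existing continuous-parameter interface.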

@thouska
Owner

thouska commented Feb 18, 2019

This has been picked up by @bees4ever in #195. Each parameter can now be set with a new flag that treats it as an integer value. So far this is used only for DDS; it still needs to be implemented for all other algorithms.

@philippkraft
Collaborator Author

I would prefer a dedicated new parameter class for discrete parameters, since discrete distributions are not the same as continuous distributions. However, it would not work for algorithms that do not respect the parameter's distribution. Something like this:

import numpy as np
from numpy import random as rnd


class UniformDiscrete(Base):
   """
   A parameter sampling from a discrete range of integer numbers.

   .. warning:: This sampler should be unproblematic for undirected samplers (eg. mc and lhs),
                but the behaviour for directed samplers like sceua, dds, rope is not explored.

   """
   __rndargs__ = 'low', 'high'

   @staticmethod
   def sample(low, high, size):
       # randint returns an integer array; cast to float for compatibility with
       # the continuous parameter interface. np.float(...) would fail here,
       # since it cannot convert an array with more than one element.
       return rnd.randint(low, high, size=size).astype(float)

   def __init__(self, *args, **kwargs):
       """
       :name: Name of the parameter
       :low: Lower limit of the parameter
       :high: Upper limit, should be larger than `low`.
       :step:     (optional) number for step size required for some algorithms,
               eg. mcmc needs a parameter for the variance of the next step;
               default is quantile(0.5) - quantile(0.4) of
               rndfunc(*rndargs, size=1000)
       :optguess: (optional) number for start point of parameter;
               default is median of rndfunc(*rndargs, size=1000)
       """

       super(UniformDiscrete, self).__init__(rnd.randint, 'randint', *args, **kwargs)
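A quick check of the `sample` logic in isolation (using NumPy's legacy `random` module, as in the class above): casting with `.astype(float)` returns the float array that continuous-parameter code expects while keeping every value integer-valued, whereas wrapping the array in `np.float(...)` would fail for more than one element.

```python
import numpy as np
from numpy import random as rnd

# Same draw as UniformDiscrete.sample(1, 10, 5) would produce
draws = rnd.randint(1, 10, size=5).astype(float)

print(draws.dtype)  # float64, but every entry is a whole number in [1, 10)
```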
