Fixing some minor typos in the docs #2790

Merged: 2 commits, Jan 13, 2018
4 changes: 2 additions & 2 deletions docs/source/advanced_theano.rst
@@ -7,7 +7,7 @@ Using shared variables

Shared variables allow us to use values in theano functions that are
not considered an input to the function, but can still be changed
-later. They are very similar to global variables in may ways.::
+later. They are very similar to global variables in may ways::

a = tt.scalar('a')
# Create a new shared variable with initial value of 0.1
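The rest of this snippet is elided by the diff view; for orientation, a minimal runnable sketch of the shared-variable pattern this hunk documents (the variable ``b`` and the values are illustrative, assuming Theano's classic API)::

    import theano
    import theano.tensor as tt

    a = tt.scalar('a')
    # Create a new shared variable with initial value of 0.1
    b = theano.shared(0.1)
    func = theano.function([a], a * b)

    print(func(2.))    # 0.2
    b.set_value(10.)   # change the value; the compiled function picks it up
    print(func(2.))    # 20.0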
@@ -23,7 +23,7 @@ their shape as long as the number of dimensions stays the same.

We can use shared variables in PyMC3 to fit the same model to several
datasets without the need to recreate the model each time (which can
-be time consuming if the number of datasets is large).::
+be time consuming if the number of datasets is large)::

# We generate 10 datasets
true_mu = [np.random.randn() for _ in range(10)]
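The remainder of this snippet is also elided; a hedged sketch of the full pattern (the model and priors below are illustrative, not the original docs' code)::

    import numpy as np
    import theano
    import pymc3 as pm

    # We generate 10 datasets
    true_mu = [np.random.randn() for _ in range(10)]
    data = [mu + np.random.randn(100) for mu in true_mu]

    # Build the model once, around a shared variable that holds the data
    data_shared = theano.shared(data[0])
    with pm.Model() as model:
        mu = pm.Normal('mu', 0, 10)
        pm.Normal('y', mu=mu, sd=1, observed=data_shared)

    traces = []
    for d in data:
        data_shared.set_value(d)   # swap in the next dataset
        with model:
            traces.append(pm.sample())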
2 changes: 1 addition & 1 deletion docs/source/gp.rst
@@ -244,7 +244,7 @@ also need to include the additional arguments, :code:`X`, :code:`y`, and
This second block produces the conditional distributions. Notice that extra
arguments are required for conditionals of :math:`f1` and :math:`f2`, but not
:math:`f`. This is because those arguments are cached when calling
-:code:`.marginal_likelihood` was called on :code:`gp`.
+:code:`.marginal_likelihood` on :code:`gp`.
Member: I think the original is correct here: "...are cached when calling marginal likelihood on gp."

Author: My thinking when I read the sentence was that the verb "called" appeared twice and that one copy should be removed. Since I'm not familiar with the code at all, it's entirely possible I misinterpreted the statement, and "calling .marginal_likelihood" (as a gerund) is what is actually called on gp.

Member: I see. Could say “are cached when marginal_likelihood is called on gp.” instead.
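For readers following this thread, a sketch of the caching behaviour under discussion (the data and kernel are illustrative; ``marginal_likelihood`` and ``conditional`` are the PyMC3 GP methods being discussed)::

    import numpy as np
    import pymc3 as pm

    X = np.linspace(0, 10, 50)[:, None]
    y = np.sin(X).ravel() + 0.1 * np.random.randn(50)

    with pm.Model() as model:
        cov = pm.gp.cov.ExpQuad(1, ls=1.0)
        gp = pm.gp.Marginal(cov_func=cov)
        # X, y and the noise level are cached on `gp` here ...
        f = gp.marginal_likelihood('f', X=X, y=y, noise=0.1)

        # ... so the conditional over new inputs needs no extra arguments
        Xnew = np.linspace(0, 12, 20)[:, None]
        f_pred = gp.conditional('f_pred', Xnew=Xnew)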


.. note::
When constructing conditionals, the additional arguments :code:`X`, :code:`y`,
2 changes: 1 addition & 1 deletion docs/source/intro.rst
@@ -69,7 +69,7 @@ PyMC began development in 2003, as an effort to generalize the process of
building Metropolis-Hastings samplers, with an aim to making Markov chain Monte
Carlo (MCMC) more accessible to applied scientists.
The choice to develop PyMC as a python module, rather than a standalone
-application, allowed the use MCMC methods in a larger modeling framework. By
+application, allowed the use of MCMC methods in a larger modeling framework. By
2005, PyMC was reliable enough for version 1.0 to be released to the public. A
small group of regular users, most associated with the University of Georgia,
provided much of the feedback necessary for the refinement of PyMC to a usable
4 changes: 2 additions & 2 deletions docs/source/prob_dists.rst
@@ -22,7 +22,7 @@ A variable requires at least a ``name`` argument, and zero or more model paramet

p = pm.Beta('p', 1, 1, shape=(3, 3))

-Probability distributions are all subclasses of ``Distribution``, which in turn has two major subclasses: ``Discrete`` and ``Continuous``. In terms of data types, a ``Continuous`` random variable is given whichever floating point type defined by ``theano.config.floatX``, while ``Discrete`` variables are given ``int16`` types when ``theano.config.floatX`` is ``float32``, and ``int64`` otherwise.
+Probability distributions are all subclasses of ``Distribution``, which in turn has two major subclasses: ``Discrete`` and ``Continuous``. In terms of data types, a ``Continuous`` random variable is given whichever floating point type is defined by ``theano.config.floatX``, while ``Discrete`` variables are given ``int16`` types when ``theano.config.floatX`` is ``float32``, and ``int64`` otherwise.

All distributions in ``pm.distributions`` will have two important methods: ``random()`` and ``logp()`` with the following signatures:
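The signatures themselves are elided by the diff view; a quick illustration of those two methods, using the standalone ``.dist()`` constructor (the values are arbitrary)::

    import pymc3 as pm

    d = pm.Normal.dist(mu=0, sd=1)   # a distribution outside any model

    d.random(size=3)                 # ndarray of three draws
    d.logp(0.0).eval()               # log-density, evaluated from the theano graph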

@@ -128,4 +128,4 @@ The original variable is simply treated as a deterministic variable, since the v
>>> model.deterministics
[g]

-By default, auto-transformed variables are ignored when summarizing and plotting model output.
+By default, auto-transformed variables are ignored when summarizing and plotting model output.
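A short, hedged sketch of the auto-transformation behaviour this hunk describes (the ``Uniform`` variable is illustrative)::

    import pymc3 as pm

    with pm.Model() as model:
        p = pm.Uniform('p', 0, 1)

    model.free_RVs          # [p_interval__] -- what the sampler actually sees
    model.deterministics    # [p]            -- the original, back-transformed variable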
6 changes: 3 additions & 3 deletions docs/source/theano.rst
@@ -38,7 +38,7 @@ is similar to eg SymPy's `Symbol`)::
y = tt.ivector('y')

Next, we use those variables to build up a symbolic representation
-of the output of our function. Note, that no computation is actually
+of the output of our function. Note that no computation is actually
being done at this point. We only record what operations we need to
do to compute the output::
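The code block introduced here is elided by the diff view; a self-contained sketch of the idea (the formula is illustrative)::

    import theano.tensor as tt

    a = tt.scalar('a')
    x = tt.vector('x')
    y = tt.ivector('y')

    # Only the computation graph is recorded here; nothing is evaluated yet
    inner = a * x**2 + y
    out = tt.exp(inner).sum()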

@@ -57,7 +57,7 @@ do to compute the output::
we are working with symbolic input instead of plain arrays.

Now we can tell Theano to build a function that does this computation.
-With a typical configuration Theano generates C code, compiles it,
+With a typical configuration, Theano generates C code, compiles it,
and creates a python function which wraps the C function::

func = theano.function([a, x, y], [out])
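A hedged, self-contained sketch of compiling and calling such a function (reusing the illustrative expression from the sketch above; note that ``ivector`` inputs must be ``int32``)::

    import numpy as np
    import theano
    import theano.tensor as tt

    a = tt.scalar('a')
    x = tt.vector('x')
    y = tt.ivector('y')
    out = tt.exp(a * x**2 + y).sum()

    func = theano.function([a, x, y], [out])
    # the compiled function takes and returns plain NumPy values
    result, = func(2.0, np.ones(3), np.arange(3, dtype='int32'))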
@@ -79,7 +79,7 @@ in `theano.sparse`. For a detailed overview of available operations,
see `the theano api docs <http://deeplearning.net/software/theano/library/tensor/index.html>`_.

A notable exception where theano variables do *not* behave like
-NumPy arrays are operations involving conditional execution:
+NumPy arrays are operations involving conditional execution.

Code like this won't work as expected::
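The failing snippet is elided by the diff view; a hedged illustration of the problem and the usual symbolic workaround::

    import theano.tensor as tt

    x = tt.scalar('x')

    # A Python `if` tries to coerce the *symbolic* condition to a plain
    # bool at graph-construction time, which raises a TypeError:
    #     y = tt.sqrt(x) if x > 0 else -x
    # The conditional must instead be part of the graph:
    y = tt.switch(x > 0, tt.sqrt(x), -x)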
