[Docs] typos #896

Merged
4 changes: 2 additions & 2 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -15,6 +15,6 @@ I changed the `foo()` function so that ...
Here are some things to check before creating the PR. If you encounter any issues, do let us know :)

- [ ] I have read the [CONTRIBUTING](https://github.com/neuropsychology/NeuroKit/blob/master/.github/CONTRIBUTING.rst#structure-and-code) file.
- [ ] My PR is targetted at the **dev branch** (and not towards the master branch).
- [ ] My PR is targeted at the **dev branch** (and not towards the master branch).
- [ ] I ran the [CODE CHECKS](https://github.com/neuropsychology/NeuroKit/blob/master/.github/CONTRIBUTING.rst#run-code-checks) on the files I added or modified and fixed the errors.
- [ ] I have added the newly added features to **News.rst** (if applicable)
- [ ] I have added the newly added features to **News.rst** (if applicable)
2 changes: 1 addition & 1 deletion neurokit2/complexity/entropy_shannon.py
@@ -10,7 +10,7 @@ def entropy_shannon(signal=None, base=2, symbolize=None, show=False, freq=None,

Compute Shannon entropy (SE). Entropy is a measure of unpredictability of the
state, or equivalently, of its average information content. Shannon entropy (SE) is one of the
first and most basic measure of entropy and a foundational concept of information theory,
first and most basic measures of entropy and a foundational concept of information theory,
introduced by Shannon (1948) to quantify the amount of information in a variable.

.. math::
(formula and remainder of the docstring are collapsed in this diff view; a hedged usage sketch follows below)
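The wording fix above sits in the `entropy_shannon()` docstring. As a hedged illustration only (not part of this PR), the sketch below shows how the function might be called, following the signature in the hunk header, with a manual NumPy check against the textbook formula H = -Σ p(x) log₂ p(x). The tuple return value and the treatment of each distinct value as one symbol are assumptions about the library, not taken from this diff.

```python
# Hedged usage sketch, not part of this PR (assumes neurokit2 and numpy are installed).
import numpy as np
import neurokit2 as nk

signal = [1, 1, 2, 3, 3, 3, 4, 4, 4, 4]            # a small discrete sequence
shanen, info = nk.entropy_shannon(signal, base=2)   # Shannon entropy in bits (base 2)

# Manual check against the textbook definition H = -sum(p * log2(p)),
# assuming the function treats each distinct value as one symbol.
_, counts = np.unique(signal, return_counts=True)
p = counts / counts.sum()
print(shanen, -np.sum(p * np.log2(p)))              # the two values should agree
```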
6 changes: 3 additions & 3 deletions neurokit2/hrv/hrv_nonlinear.py
@@ -101,9 +101,9 @@ def hrv_nonlinear(peaks, sampling_rate=1000, show=False, **kwargs):
* **SampEn**: See :func:`.entropy_sample`.
* **ShanEn**: See :func:`.entropy_shannon`.
* **FuzzyEn**: See :func:`.entropy_fuzzy`.
* **MSE**: See :func:`.entropy_multiscale`.
* **CMSE**: See :func:`.entropy_multiscale`.
* **RCMSE**: See :func:`.entropy_multiscale`.
* **MSEn**: See :func:`.entropy_multiscale`.
* **CMSEn**: See :func:`.entropy_multiscale`.
* **RCMSEn**: See :func:`.entropy_multiscale`.
* **CD**: See :func:`.fractal_correlation`.
* **HFD**: See :func:`.fractal_higuchi` (with ``kmax`` set to ``"default"``).
* **KFD**: See :func:`.fractal_katz`.
(remainder of the indices list is collapsed in this diff view; see the usage sketch below)
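The renamed multiscale-entropy entries above (MSEn, CMSEn, RCMSEn) are among the indices returned by `hrv_nonlinear()`. As a hedged illustration only (not part of this PR), the sketch below follows the signature shown in the hunk header; the simulation helpers used to obtain peaks and the `HRV_` column prefix are assumptions about the library, not taken from this page.

```python
# Hedged usage sketch, not part of this PR (assumes neurokit2 is installed);
# the "HRV_" column prefix is an assumption based on the library's naming style.
import neurokit2 as nk

ecg = nk.ecg_simulate(duration=120, sampling_rate=1000)     # synthetic ECG
peaks, info = nk.ecg_peaks(ecg, sampling_rate=1000)         # R-peak detection
hrv = nk.hrv_nonlinear(peaks, sampling_rate=1000, show=False)

# Entropy- and fractal-based indices come back as columns of a one-row DataFrame,
# e.g. HRV_SampEn, HRV_ShanEn, HRV_MSEn, HRV_CMSEn, HRV_RCMSEn, HRV_CD, HRV_HFD, HRV_KFD.
print(hrv.filter(like="En"))
```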