**docs/tutorials.md** (20 additions, 9 deletions)
@@ -15,6 +15,17 @@ The command line interface takes csv files as input. Each csv file must contain

Below, we give an example based on files containing the evaluations of PPO, DDPG, SAC, and TRPO, four Deep Reinforcement Learning algorithms, given in the `examples` directory of the main repository.

## Installation

To install adastop, use pip:

```bash
pip install adastop
```

This will automatically install the command line interface as well as the python library.

## Help for cli tool

The AdaStop algorithm is initialized with the first test done through `adastop compare`, and the current state of AdaStop is then saved in a pickle file. The help of the `adastop` command line can be obtained with the following:
@@ -90,7 +101,7 @@ The input format of adastop is under the form of a csv file containing the score

Let us launch AdaStop on this first batch of data.

First, we clean up the current directory of any litter files that could have been spawned by a previous usage of `adastop` (if you have never used `adastop` before, this command will have no effect).

```bash
adastop reset . # reset the state of the comparator (remove hidden pickle file)
```
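Before running a comparison, it can help to see what such an input file might look like. The sketch below builds a small, purely illustrative csv of hypothetical scores, assuming one column per algorithm and one row per run; the exact layout expected by adastop is an assumption here, so refer to the csv files shipped in the `examples` directory for the authoritative format.

```python
import csv

# Hypothetical evaluation scores (made-up numbers, for illustration only):
# one column per algorithm, one row per training run.
scores = {
    "PPO":  [120.3, 118.7, 125.1],
    "DDPG": [98.4, 101.2, 97.9],
    "SAC":  [130.5, 128.0, 131.7],
    "TRPO": [110.2, 113.6, 109.8],
}

with open("walks1.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(scores.keys())          # header: algorithm names
    writer.writerows(zip(*scores.values())) # one row per run
```

A file like this could then be handed to the comparator, e.g. `adastop compare walks1.csv` (treat that invocation as illustrative).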
**docs/tutorials.org** (16 additions, 1 deletion)
@@ -15,6 +15,21 @@ Please note that if, in the process of the algorithm, all the comparisons for on

Below, we give an example based on files containing the evaluations of PPO, DDPG, SAC, and TRPO, four Deep Reinforcement Learning algorithms, given in the =examples= directory of the main repository.

** Installation

To install adastop, use pip:

#+begin_src bash :session *shell* :results verbatim :exports both
pip install adastop
#+end_src

This will automatically install the command line interface as well as the python library.

** Help for cli tool

The AdaStop algorithm is initialized with the first test done through =adastop compare=, and the current state of AdaStop is then saved in a pickle file. The help of the =adastop= command line can be obtained with the following:
@@ -47,7 +62,7 @@ The input format of adastop is under the form of a csv file containing the score

Let us launch AdaStop on this first batch of data.

First, we clean up the current directory of any litter files that could have been spawned by a previous usage of =adastop= (if you have never used =adastop= before, this command will have no effect).

#+begin_src bash :session *shell* :results verbatim :exports both
adastop reset . # reset the state of the comparator (remove hidden pickle file)
#+end_src
**docs/user_guide.md** (1 addition, 1 deletion)
@@ -60,7 +60,7 @@ Then, once you did the comparison on the first file, you can use iteratively `ad

#### Choice of comparisons

In adastop, one can choose which comparisons are done. The default is to do all the pairwise comparisons between the algorithms. In practice, it is sometimes sufficient to compare each algorithm to only one of them, a benchmark; for this, the `--compare-to-first` argument can be used. For more fine-grained control over which comparisons are done, the python API can take the comparisons as input.

**Remark**: it is not statistically sound to execute adastop several times and interpret each result as though it were a single test; running adastop several times amounts to multiple testing, and some calibration would have to be done. Instead, it is better to do all the comparisons at the same time, running the adastop algorithm only once, so that adastop handles the multiplicity of hypotheses by itself.
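To illustrate the remark above, here is a short, self-contained sketch (not part of adastop) of how the probability of at least one false positive grows when a level-0.05 test is naively repeated on independent data:

```python
# Family-wise error rate (FWER) for k independent repetitions of a
# level-alpha test: P(at least one false positive) = 1 - (1 - alpha)^k.
alpha = 0.05
for k in (1, 2, 5, 10):
    fwer = 1 - (1 - alpha) ** k
    print(f"k={k:2d}: P(at least one false positive) = {fwer:.3f}")
```

Already at ten repetitions the chance of a spurious detection is roughly 40%, which is why it is preferable to let a single adastop run calibrate all comparisons jointly.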