
fix several documentation mistakes

* add 'curvature_loss_function' to the 'tutorial.custom_loss.rst'
* fix header styling
* fix doc-string

Bas Nijholt authored on 22/11/2018 12:36:22
Showing 3 changed files
@@ -32,7 +32,7 @@ def uses_nth_neighbors(n):
     The next function is a part of the `curvature_loss_function` function.
 
     >>> @uses_nth_neighbors(1)
-    ...def triangle_loss(xs, ys):
+    ... def triangle_loss(xs, ys):
     ...    xs = [x for x in xs if x is not None]
     ...    ys = [y for y in ys if y is not None]
     ...
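The hunk above fixes the spacing in a doctest that uses the `uses_nth_neighbors` decorator, which tells the learner how many neighboring points a loss function needs. As a minimal, self-contained sketch of that contract (illustrative only — adaptive's real decorator and `triangle_loss` live in `adaptive.learner.learner1D`, and this body is made up):

```python
def uses_nth_neighbors(n):
    # Illustrative sketch: mark a loss function as needing data from
    # n neighboring points on each side (not adaptive's actual code).
    def decorator(loss_func):
        loss_func.nth_neighbors = n
        return loss_func
    return decorator

@uses_nth_neighbors(1)
def triangle_loss(xs, ys):
    # With n=1 the learner also passes one neighboring point per side;
    # missing neighbors at the domain boundary arrive as None.
    xs = [x for x in xs if x is not None]
    ys = [y for y in ys if y is not None]
    if len(xs) < 3:
        return 0.0
    # Sum the areas of the triangles spanned by consecutive point triples.
    return sum(
        abs((xs[i + 1] - xs[i]) * (ys[i + 2] - ys[i])
            - (xs[i + 2] - xs[i]) * (ys[i + 1] - ys[i])) / 2
        for i in range(len(xs) - 2)
    )

print(triangle_loss.nth_neighbors)                      # → 1
print(triangle_loss([None, 0, 1, 2], [None, 0, 1, 0]))  # → 1.0
```

The decorator only attaches metadata; the learner inspects `nth_neighbors` to decide how many extra points to pass in.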
@@ -92,6 +92,7 @@ lines. However, as always, when you sample more points the graph will
 become gradually smoother.
 
 Using any convex shape as domain
+................................
 
 Suppose you do not simply want to sample your function on a square (in 2D) or in
 a cube (in 3D). The LearnerND supports using a `scipy.spatial.ConvexHull` as
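This hunk adds the missing underline for the "Using any convex shape as domain" heading. A sketch of the convex-hull domain it describes — the hull construction below is plain SciPy, while the `adaptive.LearnerND(f, hull)` call is shown only as a comment and assumes `adaptive` is installed and `f` is your function:

```python
import scipy.spatial

# Build a triangular 2-D domain as the convex hull of its corner points.
hull = scipy.spatial.ConvexHull([(0, 0), (1, 0), (0.5, 1)])

# Hypothetical usage with adaptive (not executed here):
#   learner = adaptive.LearnerND(f, hull)

print(len(hull.vertices))  # → 3
print(hull.volume)         # → 0.5 (for a 2-D hull, .volume is the area)
```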
@@ -46,11 +46,14 @@ tl;dr, one can use the following *loss functions* that
 
 + `adaptive.learner.learner1D.default_loss`
 + `adaptive.learner.learner1D.uniform_loss`
++ `adaptive.learner.learner1D.curvature_loss_function`
 + `adaptive.learner.learner2D.default_loss`
 + `adaptive.learner.learner2D.uniform_loss`
 + `adaptive.learner.learner2D.minimize_triangle_surface_loss`
 + `adaptive.learner.learner2D.resolution_loss_function`
 
+Whenever a loss function has `_function` appended to its name, it is a factory function
+that returns the loss function with certain settings.
 
 Uniform sampling
 ~~~~~~~~~~~~~~~~
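The added sentence explains the `_function` naming convention: a factory takes configuration arguments and returns the actual loss callable. A hypothetical sketch of that pattern — the name `scaled_width_loss_function` and its `scale` parameter are invented for illustration and are not part of adaptive's API:

```python
def scaled_width_loss_function(scale=1.0):
    # Hypothetical factory: configure once, get back a loss callable
    # with the (xs, ys) -> float shape a 1D loss is expected to have.
    def scaled_width_loss(xs, ys):
        # Loss of one interval: its width, weighted by the chosen scale.
        return scale * abs(xs[1] - xs[0])
    return scaled_width_loss

loss = scaled_width_loss_function(scale=2.0)
print(loss((0.0, 0.25), (1.0, 2.0)))  # → 0.5
```

This is why the list above mixes ready-to-use losses (`default_loss`, `uniform_loss`) with factories (`curvature_loss_function`, `resolution_loss_function`): the latter must be called first to obtain a loss function.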