Summary measures for learning curves
Suppose we wish to look at change in processing success (y) with ease of task (x).
Using multiple regression we fit a cubic polynomial and obtain the best-fitting cubic -0.000005x^3 + 0.0003x^2 + 0.0026x + 0.7508. This is the curve in red in this plot.
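As a concrete sketch, the regression step can be reproduced with NumPy's polyfit. The (x, y) data below are hypothetical, generated noise-free from the quoted coefficients purely for illustration, so the fit simply recovers them:

```python
import numpy as np

# Hypothetical data: ease of task (x) vs. processing success (y),
# generated from the coefficients quoted in the text (no noise added).
x = np.array([10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60], dtype=float)
y = 0.7508 + 0.0026 * x + 0.0003 * x**2 - 0.000005 * x**3

# Least-squares cubic fit; coefficients are returned from the
# highest power down (x^3 term first).
coefs = np.polyfit(x, y, deg=3)
print(np.round(coefs, 6))
```

With real, noisy data the recovered coefficients would of course differ from the generating values.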
The asymptote occurs at a turning point of the cubic, where its first derivative (a quadratic) equals zero. For the red curve this quadratic, obtained by differentiating the cubic above, is -0.000015x^2 + 0.0006x + 0.0026 = 0. Within the sampled interval [10, 60] this quadratic has a root at x = 43.94, obtained by entering the regression coefficients into the quadratic root solver here.
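Instead of a web-based root solver, the same turning point can be found numerically; a minimal sketch with NumPy, using the cubic coefficients quoted above:

```python
import numpy as np

# Coefficients of the fitted cubic from the text (highest power first).
cubic = np.array([-0.000005, 0.0003, 0.0026, 0.7508])

# Differentiate to get the quadratic -0.000015x^2 + 0.0006x + 0.0026.
quad = np.polyder(cubic)

# Its roots are the turning points; both are real here, so we keep
# the one that falls inside the sampled interval [10, 60].
roots = np.roots(quad).real
asymptote = [r for r in roots if 10 <= r <= 60][0]
print(round(asymptote, 2))  # 43.94
```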
The predicted learning slope, from baseline to asymptote, is obtained by subtracting the predicted response at the start (x = 10) from that at the asymptote (x = 43.94), both computed from the fitted cubic, and dividing by the change in x (43.94 - 10).
So for the red curve the slope from baseline to asymptote = (1.02 - 0.80)/(43.94 - 10) = 0.0064.
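The baseline-to-asymptote slope above can be checked numerically from the same fitted coefficients:

```python
import numpy as np

# Fitted cubic from the text (highest power first).
cubic = np.array([-0.000005, 0.0003, 0.0026, 0.7508])
baseline_x, asymptote_x = 10.0, 43.94

# Predicted responses at baseline and at the asymptote.
y0 = np.polyval(cubic, baseline_x)    # about 0.80
y1 = np.polyval(cubic, asymptote_x)   # about 1.02

# Slope of the learning line from baseline to asymptote.
slope = (y1 - y0) / (asymptote_x - baseline_x)
print(round(slope, 4))  # 0.0064
```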
Note that Howell (1997) shows that a quadratic curve with a linear component is sufficient to yield an asymptotic model. In that case the asymptote is obtained by differentiating the quadratic to get a linear equation whose root is the asymptote or, equivalently, by solving the quadratic equation and averaging its (real) roots (the vertex method).
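A sketch of the two equivalent calculations for a quadratic fit; the coefficients a, b, c below are hypothetical, chosen only so that the quadratic has real roots:

```python
import numpy as np

# Hypothetical quadratic fit a*x^2 + b*x + c (illustrative values only).
a, b, c = -0.0002, 0.016, 0.75

# Route 1: the derivative 2*a*x + b = 0 gives the vertex directly.
x_vertex = -b / (2 * a)

# Route 2 (vertex method): solve the quadratic and average its real roots;
# since the roots sum to -b/a, their mean is again -b/(2*a).
roots = np.roots([a, b, c]).real
x_avg = roots.mean()

print(x_vertex, round(x_avg, 6))
```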
Similarly, if we anticipate three turning points we can fit predictors of form x^4, x^3, x^2 and x, obtain their regression coefficients together with that of the intercept term, and differentiate the resulting quartic to obtain a cubic. The roots of this cubic (obtained using a cubic root solver calculator such as here) give the locations of the three turning points.
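The quartic case can be sketched the same way; the quartic coefficients below are hypothetical, chosen so that its derivative has three real roots:

```python
import numpy as np

# Hypothetical fitted quartic x^4 - 4x^3 - 2x^2 + 12x (highest power first),
# constructed so its derivative factors with three real roots.
quartic = np.array([1.0, -4.0, -2.0, 12.0, 0.0])

# Differentiate to get the cubic 4x^3 - 12x^2 - 4x + 12, whose roots
# locate the three turning points of the quartic.
cubic = np.polyder(quartic)
turning_points = np.sort(np.roots(cubic).real)
print(np.round(turning_points, 3))  # turning points at -1, 1, 3
```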
Howell DC (1997). Statistical Methods for Psychology, 4th edition. Belmont, CA: Wadsworth.