Enter x–y pairs and reveal the best-fit logarithmic curve. Measure errors, coefficients, and predictive strength quickly, with clean inputs, instant tables, and interactive plotting tools.
| Point | X | Y |
|---|---|---|
| 1 | 1 | 4.8 |
| 2 | 2 | 6.1 |
| 3 | 3 | 6.7 |
| 4 | 4 | 7.1 |
| 5 | 6 | 7.8 |
| 6 | 8 | 8.3 |
| 7 | 12 | 9.0 |
| 8 | 16 | 9.4 |
This sample shows a rising response that grows quickly first, then slows. That shape is a strong candidate for logarithmic regression analysis.
Logarithmic regression models a curved relationship with the equation y = a + b ln(x). The method transforms the x values with the natural logarithm, then fits a straight-line least-squares model between ln(x) and y.
Slope: b = [n·Σ(ln x · y) − Σ(ln x)·Σy] / [n·Σ(ln x)² − (Σ ln x)²]
Intercept: a = (Σy − b·Σ ln x) / n
Predicted value: ŷ = a + b ln(x)
Residual: e = y − ŷ
Goodness of fit: R² = 1 − SSE / SST, where SSE = Σ(y − ŷ)² and SST = Σ(y − ȳ)²
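As a minimal sketch of these formulas in code, fitted to the sample table above (variable names are illustrative, not part of any specific calculator's implementation):

```python
import math

# Sample data from the table above
xs = [1, 2, 3, 4, 6, 8, 12, 16]
ys = [4.8, 6.1, 6.7, 7.1, 7.8, 8.3, 9.0, 9.4]

def log_fit(xs, ys):
    """Fit y = a + b*ln(x) by ordinary least squares on (ln x, y)."""
    n = len(xs)
    lx = [math.log(x) for x in xs]  # requires every x > 0
    s_lx = sum(lx)
    s_y = sum(ys)
    s_lxy = sum(l * y for l, y in zip(lx, ys))
    s_lx2 = sum(l * l for l in lx)
    # Slope and intercept from the closed-form least-squares formulas
    b = (n * s_lxy - s_lx * s_y) / (n * s_lx2 - s_lx ** 2)
    a = (s_y - b * s_lx) / n
    return a, b

a, b = log_fit(xs, ys)

# Predictions, residuals, and R-squared
pred = [a + b * math.log(x) for x in xs]
mean_y = sum(ys) / len(ys)
sse = sum((y, p) == None or (y - p) ** 2 for y, p in zip(ys, pred))
sse = sum((y - p) ** 2 for y, p in zip(ys, pred))
sst = sum((y - mean_y) ** 2 for y in ys)
r2 = 1 - sse / sst
```

On this sample, the fit comes out near y ≈ 4.86 + 1.65 ln(x) with R² above 0.99, which matches the "fast rise, then slow" shape described above.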
This model works only when x is strictly positive. It is useful when growth is fast at small x values and gradually levels as x becomes larger.
Logarithmic regression is valuable when data changes quickly at the beginning and then slows over time or scale. Many practical datasets behave this way. Examples include diminishing returns in advertising response, learning curves, saturation-style adoption, and measurements that flatten as input increases. A linear model often misses that early steep rise and later tapering behavior. Logarithmic regression captures it with a simple two-parameter equation.
This calculator fits the model y = a + b ln(x) using least squares. It converts each x value into its natural logarithm, then estimates the intercept and slope from the transformed relationship. That approach keeps the procedure efficient while still producing a nonlinear curve on the original graph. The result section reports the equation, correlation coefficient, R², adjusted R², standard error, RMSE, MAE, and detailed residual values for each point.
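The reported summary metrics can be reproduced with a short sketch, recomputing the fit from the sample table above (here `k = 1` because ln(x) is the single predictor; this is one standard set of definitions, which an individual calculator may vary slightly):

```python
import math

xs = [1, 2, 3, 4, 6, 8, 12, 16]
ys = [4.8, 6.1, 6.7, 7.1, 7.8, 8.3, 9.0, 9.4]

# Least-squares coefficients for y = a + b*ln(x) (see formulas above)
n = len(xs)
lx = [math.log(x) for x in xs]
b = (n * sum(l * y for l, y in zip(lx, ys)) - sum(lx) * sum(ys)) / (
    n * sum(l * l for l in lx) - sum(lx) ** 2
)
a = (sum(ys) - b * sum(lx)) / n

residuals = [y - (a + b * l) for y, l in zip(ys, lx)]
sse = sum(e * e for e in residuals)
sst = sum((y - sum(ys) / n) ** 2 for y in ys)

k = 1  # one predictor: ln(x)
r2 = 1 - sse / sst
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
se = math.sqrt(sse / (n - k - 1))  # standard error of the regression
rmse = math.sqrt(sse / n)          # root mean squared error
mae = sum(abs(e) for e in residuals) / n
```

Adjusted R² is always at or below R², since it penalizes the extra parameter; on small samples like this one the gap is tiny.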
The graph combines the observed points with a fitted curve, which helps you inspect model shape visually. The residual table helps you detect patterns that may signal a poor model choice. If residuals stay small and roughly balanced around zero, the fit is usually reasonable. If residuals drift in one direction, another regression family may be better.
For reliable results, use x values that are strictly positive and measurements that share consistent units. Outliers can strongly influence coefficients, so it is helpful to inspect raw data before interpreting the curve. When the fitted equation makes sense and goodness-of-fit values are strong, you can use the model to summarize nonlinear trends and make cautious predictions within the observed range.
**What does logarithmic regression measure?** It measures a relationship where y changes with the logarithm of x. The curve rises or falls quickly first, then changes more slowly as x increases.
**Why must every x value be positive?** The model uses ln(x). Natural logarithms are undefined for zero and negative numbers, so every x input must be positive.
**When should I choose a logarithmic model?** Choose it when the data bends strongly at smaller x values and gradually flattens. A straight line may underfit that kind of pattern.
**How do I interpret the slope?** The slope shows how much y changes for each one-unit increase in ln(x). A positive slope means y increases as x grows.
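One way to make the slope concrete: in y = a + b ln(x), every doubling of x adds the same fixed amount, b·ln(2), to the prediction, because (a + b ln 2x) − (a + b ln x) = b ln 2. A tiny sketch with an illustrative slope value:

```python
import math

b = 1.65  # illustrative slope, similar to a fit of the sample table above

# In y = a + b*ln(x), doubling x always adds b*ln(2) to y,
# regardless of where on the x axis the doubling happens.
gain_per_doubling = b * math.log(2)
```

With b = 1.65, each doubling of x raises the prediction by about 1.14, whether x goes from 1 to 2 or from 8 to 16.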
**What does R² tell me?** R² shows how much variation in y is explained by the fitted logarithmic model. Values closer to 1 indicate a tighter fit.
**What are residuals for?** Residuals show the difference between observed and predicted y values. They help reveal outliers, bias, and whether the curve shape is appropriate.
**Can I use the model for prediction?** Yes, but carefully. Predictions outside the observed x range may become unreliable because the fitted trend may not continue in the same way.
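A cautious-prediction sketch, using illustrative coefficients roughly matching a least-squares fit of the sample table above (where x runs from 1 to 16):

```python
import math

# Illustrative coefficients, close to a fit of the sample data above
a, b = 4.86, 1.65

def predict(x):
    """Predicted y for y = a + b*ln(x); x must be positive."""
    if x <= 0:
        raise ValueError("x must be positive for y = a + b*ln(x)")
    return a + b * math.log(x)

# Interpolating inside the observed range (1 <= x <= 16) is safer
y_at_10 = predict(10)
```

Predicting at x = 10 stays inside the observed range; predicting at, say, x = 200 would assume the logarithmic trend continues far beyond the data, which the fit cannot confirm.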
**What if the fit is poor?** Try checking data quality, removing obvious entry errors, or comparing other models such as linear, exponential, or power regression for a better match.
Important Note: All the calculators listed on this site are for educational purposes only, and we do not guarantee the accuracy of results. Please consult other sources as well.