
Underfitting vs. Overfitting in Scikit-learn

This example demonstrates the problems of underfitting and overfitting, and how we can use linear regression with polynomial features to approximate nonlinear functions. The plot shows the function that we want to approximate, which is part of the cosine function, together with the noisy samples drawn from that function and the approximations of models with polynomial features of different degrees.

A linear function (a polynomial of degree 1) is not sufficient to fit the training samples; this is called underfitting. A polynomial of degree 4 approximates the true function almost perfectly. For higher degrees, however, the model overfits the training data, i.e., it learns the noise in the training data. We evaluate overfitting and underfitting quantitatively using cross-validation: we calculate the mean squared error (MSE) on the validation set, and the higher it is, the less likely the model is to generalize correctly from the training data.
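To see concretely why a linear model can fit a curve at all: PolynomialFeatures expands each input x into the columns [x, x^2, ..., x^d], and LinearRegression then fits a linear combination of those columns. Here is a minimal sketch of that expansion, separate from the example below; the degree-3 setting and the sample values are arbitrary choices for illustration.

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# Expand a single feature x into [x, x^2, x^3]
x = np.array([[0.1], [0.5], [0.9]])
poly = PolynomialFeatures(degree=3, include_bias=False)
print(poly.fit_transform(x))
# [[0.1   0.01  0.001]
#  [0.5   0.25  0.125]
#  [0.9   0.81  0.729]]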

New to Plotly?

Plotly's Python library is free and open source! Get started by downloading the client and reading the primer.
You can set up Plotly to work in online or offline mode, or in Jupyter notebooks.
We also have a quick-reference cheatsheet (new!) to help you get started!

Version

In [1]:
import sklearn
sklearn.__version__
Out[1]:
'0.18.1'

Imports

In [2]:
print(__doc__)

import plotly.plotly as py
import plotly.graph_objs as go
from plotly import tools

import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
Automatically created module for IPython interactive environment

Calculations

In [3]:
np.random.seed(0)

n_samples = 30
degrees = [1, 4, 15]

# True function to approximate, and noisy samples drawn from it
true_fun = lambda X: np.cos(1.5 * np.pi * X)
X = np.sort(np.random.rand(n_samples))
y = true_fun(X) + np.random.randn(n_samples) * 0.1

# Subplot titles and traces, filled in by the loop below
titles = []
data = []

Plot Results

In [4]:
for i in range(len(degrees)):
    polynomial_features = PolynomialFeatures(degree=degrees[i],
                                             include_bias=False)
    linear_regression = LinearRegression()
    pipeline = Pipeline([("polynomial_features", polynomial_features),
                         ("linear_regression", linear_regression)])
    pipeline.fit(X[:, np.newaxis], y)

    # Evaluate the models using cross-validation
    scores = cross_val_score(pipeline, X[:, np.newaxis], y,
                             scoring="neg_mean_squared_error", cv=10)

    X_test = np.linspace(0, 1, 100)
    
    # Show the legend only once, on the middle (degree 4) subplot
    leg = (i == 1)
    data.append([])
    trace1 = go.Scatter(x=X_test, y=pipeline.predict(X_test[:, np.newaxis]), 
                        name="Model", mode='lines',
                        line=dict(color='blue', width=1),
                        showlegend=leg)
    data[i].append(trace1)
    trace2 = go.Scatter(x=X_test, y=true_fun(X_test), 
                        name="True function", mode='lines',
                        line=dict(color='green', width=1),
                        showlegend=leg)
    data[i].append(trace2)
    
    trace3 = go.Scatter(x=X, y=y, 
                        name="Samples", mode='markers',
                        marker=dict(color='blue', 
                                    line=dict(color='black', width=1)),
                        showlegend=leg)
    data[i].append(trace3)
    
    titles.append("Degree {}<br>MSE = {:.2e}(+/- {:.2e})".format(
                   degrees[i], -scores.mean(), scores.std()))
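One detail worth noting: scikit-learn's "neg_mean_squared_error" scorer returns negated MSE values (so that higher always means better), which is why the titles above negate scores.mean() to recover a positive MSE. A minimal illustration of the sign convention, using a hypothetical scores array:

import numpy as np

scores = np.array([-0.04, -0.06, -0.05])  # hypothetical cross-validation scores
mse = -scores.mean()                      # positive mean squared error, 0.05 here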
In [5]:
fig = tools.make_subplots(rows=1, cols=3,
                          subplot_titles=tuple(titles[:3]),
                          print_grid=False)

# Append each degree's three traces to its own subplot column
for i in range(len(data)):
    for j in range(0, len(data[i])):
        fig.append_trace(data[i][j], 1, i+1)
        
# Label the axes and hide the grid and tick labels on each subplot
for i in map(str, range(1, 4)):
    y = 'yaxis'+i
    x = 'xaxis'+i
    fig['layout'][y].update(title='y', showgrid=False,
                            showticklabels=False, ticks='')
    fig['layout'][x].update(title='x', showgrid=False,
                            showticklabels=False, ticks='')
In [6]:
py.iplot(fig)
Out[6]:
(Interactive Plotly figure: three side-by-side subplots titled "Degree 1", "Degree 4", and "Degree 15", each showing the fitted model, the true function, and the samples.)