
Hierarchical Clustering Structured vs Unstructured Ward in Scikit-learn

This example builds a swiss roll dataset and runs hierarchical clustering on the positions of the points.

For more information, see Hierarchical clustering.

In a first step, the hierarchical clustering is performed without connectivity constraints on the structure and is solely based on distance, whereas in a second step the clustering is restricted to the k-Nearest Neighbors graph: it’s a hierarchical clustering with structure prior.
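The two configurations differ only in whether a connectivity matrix is passed to the estimator. As a minimal sketch (X is the swiss roll data generated below; the full, runnable cells follow):

from sklearn.cluster import AgglomerativeClustering
from sklearn.neighbors import kneighbors_graph

# Unstructured: merges are driven by Ward's criterion on distances alone.
unstructured = AgglomerativeClustering(n_clusters=6, linkage='ward')

# Structured: merges are restricted to edges of a k-nearest-neighbors graph.
graph = kneighbors_graph(X, n_neighbors=10, include_self=False)
structured = AgglomerativeClustering(n_clusters=6, linkage='ward',
                                     connectivity=graph)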

Some of the clusters learned without connectivity constraints do not respect the structure of the swiss roll and extend across different folds of the manifold. In contrast, when imposing connectivity constraints, the clusters form a nice parcellation of the swiss roll.

New to Plotly?

Plotly's Python library is free and open source! Get started by downloading the client and reading the primer.
You can set up Plotly to work in online or offline mode, or in Jupyter notebooks.
We also have a quick-reference cheatsheet (new!) to help you get started!
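As a quick setup sketch (assuming Plotly's Python library is installed; this tutorial uses the online mode via plotly.plotly):

# Online mode (used in the cells below): figures are rendered through your Plotly account.
import plotly.plotly as py
# py.sign_in('your_username', 'your_api_key')  # placeholder credentials

# Offline alternative: render figures locally inside a Jupyter notebook.
# import plotly.offline as py
# py.init_notebook_mode(connected=True)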

Version

In [1]:
import sklearn
sklearn.__version__
Out[1]:
'0.18'

Imports

This tutorial imports AgglomerativeClustering and make_swiss_roll.

In [2]:
print(__doc__)

import plotly.plotly as py
import plotly.graph_objs as go


import time as time
import numpy as np
import matplotlib.pyplot as plt
import mpl_toolkits.mplot3d.axes3d as p3
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets.samples_generator import make_swiss_roll
Automatically created module for IPython interactive environment

Calculations

Generate the swiss roll dataset.

In [3]:
n_samples = 1500
noise = 0.05
X, _ = make_swiss_roll(n_samples, noise)
# Make it thinner
X[:, 1] *= .5
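As a quick sanity check (not part of the original example), the generated array contains one 3D position per sample:

# X has shape (n_samples, 3): one (x, y, z) position per point on the roll.
print(X.shape)            # (1500, 3)
print(X.mean(axis=0))     # rough centre of the (thinned) roll along each axis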

Plot result

Helper function to convert a matplotlib colormap to a Plotly colorscale.

In [4]:
def matplotlib_to_plotly(cmap, pl_entries):
    # Sample the colormap at pl_entries evenly spaced points in [0, 1].
    h = 1.0 / (pl_entries - 1)
    pl_colorscale = []

    for k in range(pl_entries):
        # list() is needed on Python 3, where map() returns an iterator.
        C = list(map(np.uint8, np.array(cmap(k * h)[:3]) * 255))
        pl_colorscale.append([k * h, 'rgb' + str((C[0], C[1], C[2]))])

    return pl_colorscale
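A brief usage sketch: the helper returns a Plotly colorscale, i.e. a list of [position, 'rgb(r, g, b)'] pairs sampled evenly from the matplotlib colormap:

# Example call (exact RGB values depend on the chosen colormap).
colorscale = matplotlib_to_plotly(plt.cm.jet, 6)
print(len(colorscale))   # 6 entries, at positions 0.0, 0.2, ..., 1.0
print(colorscale[0])     # e.g. [0.0, 'rgb(0, 0, 127)'] for the jet colormap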

Without connectivity constraints

Compute clustering

In [5]:
print("Compute unstructured hierarchical clustering...")
st = time.time()
ward = AgglomerativeClustering(n_clusters=6, linkage='ward').fit(X)
elapsed_time = time.time() - st
label = ward.labels_
print("Elapsed time: %.2fs" % elapsed_time)
print("Number of points: %i" % label.size)
Compute unstructured hierarchical clustering...
Elapsed time: 0.07s
Number of points: 1500
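An optional check, not in the original notebook, is to look at how the 1500 points are distributed over the six clusters:

# Count the number of points assigned to each of the six cluster labels.
print(np.bincount(label))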
In [6]:
color = matplotlib_to_plotly(plt.cm.jet, 6)

data = []
for l in np.unique(label):
    trace = go.Scatter3d(x=X[label == l, 0],
                         y=X[label == l, 1],
                         z=X[label == l, 2],
                         mode='markers',
                         showlegend=False,
                         marker=dict(color=color[l][1],
                                     line=dict(color='black', width=1)))
    data.append(trace)

layout = go.Layout(height=600,
                   title='Without connectivity constraints (time %.2fs)' % elapsed_time,
                   scene=dict(xaxis=dict(backgroundcolor="rgb(233, 233, 233)",
                                         showbackground=True),
                              yaxis=dict(backgroundcolor="rgb(233, 233, 233)",
                                         showbackground=True),
                              zaxis=dict(backgroundcolor="rgb(233, 233, 233)",
                                         showbackground=True)),
                   margin=dict(l=0, r=0, b=0, t=50))

fig = go.Figure(data=data, layout=layout)

py.iplot(fig)
Out[6]:

With connectivity constraints

Define the structure A of the data: here, the graph of the 10 nearest neighbors.

In [7]:
from sklearn.neighbors import kneighbors_graph
connectivity = kneighbors_graph(X, n_neighbors=10, include_self=False)
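The resulting connectivity is a SciPy sparse adjacency matrix with one row per sample and n_neighbors non-zero entries per row; a quick inspection sketch (not in the original notebook):

# One row per sample, 10 edges per row before the estimator symmetrizes the graph.
print(connectivity.shape)   # (1500, 1500)
print(connectivity.nnz)     # 1500 * 10 = 15000 stored edges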

Compute clustering

In [8]:
print("Compute structured hierarchical clustering...")
st = time.time()
ward = AgglomerativeClustering(n_clusters=6, connectivity=connectivity,
                               linkage='ward').fit(X)
elapsed_time = time.time() - st
label = ward.labels_
print("Elapsed time: %.2fs" % elapsed_time)
print("Number of points: %i" % label.size)
Compute structured hierarchical clustering...
Elapsed time: 0.12s
Number of points: 1500
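To quantify how much the two partitions disagree (an optional addition, not part of the scikit-learn example), one can compare the structured labels against a re-fit of the unstructured model:

# Hypothetical comparison sketch: re-fit the unstructured model so both label
# sets are available, then measure their agreement with the adjusted Rand index.
from sklearn.metrics import adjusted_rand_score
label_unstructured = AgglomerativeClustering(n_clusters=6,
                                             linkage='ward').fit_predict(X)
print(adjusted_rand_score(label_unstructured, label))  # 1.0 would mean identical partitions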
In [9]:
color = matplotlib_to_plotly(plt.cm.jet, 6)

data = []
for l in np.unique(label):
    trace = go.Scatter3d(x=X[label == l, 0],
                         y=X[label == l, 1],
                         z=X[label == l, 2],
                         mode='markers',
                         showlegend=False,
                         marker=dict(color=color[l][1],
                                     line=dict(color='black', width=1)))
    data.append(trace)

layout = go.Layout(height=600,
                   title='With connectivity constraints (time %.2fs)' % elapsed_time,
                   scene=dict(xaxis=dict(backgroundcolor="rgb(233, 233, 233)",
                                         showbackground=True),
                              yaxis=dict(backgroundcolor="rgb(233, 233, 233)",
                                         showbackground=True),
                              zaxis=dict(backgroundcolor="rgb(233, 233, 233)",
                                         showbackground=True)),
                   margin=dict(l=0, r=0, b=0, t=50))

fig = go.Figure(data=data, layout=layout)

py.iplot(fig)
Out[9]:

License

Authors:

        Vincent Michel, 2010

        Alexandre Gramfort, 2010

        Gael Varoquaux, 2010

License:

        BSD 3 clause