Intel® Extension for Scikit-learn DBSCAN for the Spoken Arabic Digit dataset
[1]:
from timeit import default_timer as timer
from sklearn.model_selection import train_test_split
from sklearn.metrics import davies_bouldin_score
from sklearn.datasets import fetch_openml
from IPython.display import HTML
import warnings
warnings.filterwarnings("ignore")
Download the data
[2]:
x, y = fetch_openml(name="spoken-arabic-digit", return_X_y=True)
Preprocessing
Split the data into train and test sets
[3]:
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.3, random_state=0)
Normalize the data
[4]:
from sklearn.preprocessing import MinMaxScaler
scaler_x = MinMaxScaler()
[5]:
scaler_x.fit(x_train)
x_train = scaler_x.transform(x_train)
x_test = scaler_x.transform(x_test)
Patch original Scikit-learn with Intel® Extension for Scikit-learn
Intel® Extension for Scikit-learn (previously known as daal4py) contains drop-in replacement functionality for the stock Scikit-learn package. You can take advantage of the performance optimizations of Intel® Extension for Scikit-learn by adding just two lines of code before the usual Scikit-learn imports:
[6]:
from sklearnex import patch_sklearn
patch_sklearn()
Intel(R) Extension for Scikit-learn* enabled (https://github.com/intel/scikit-learn-intelex)
Intel® Extension for Scikit-learn patching affects the performance of specific Scikit-learn functionality only. Refer to the list of supported algorithms and parameters for details. When unsupported parameters are used, the package falls back to the original Scikit-learn implementation. If the patching does not cover your scenarios, submit an issue on GitHub.
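A quick way to confirm that the patch took effect is to check which module provides the class after re-importing it. This is a minimal sketch; the exact module path is an implementation detail of the extension and may vary between versions:
from sklearn.cluster import DBSCAN
# After patch_sklearn(), the re-imported class is supplied by the extension,
# so its module path typically points at sklearnex/daal4py rather than sklearn.cluster.
print(DBSCAN.__module__)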
Training of the DBSCAN algorithm with Intel® Extension for Scikit-learn for the Spoken Arabic Digit dataset
[7]:
from sklearn.cluster import DBSCAN
params = {
"n_jobs": -1,
}
start = timer()
y_pred = DBSCAN(**params).fit_predict(x_train)
train_patched = timer() - start
f"Intel® extension for Scikit-learn time: {train_patched:.2f} s"
[7]:
'Intel® Extension for Scikit-learn time: 6.37 s'
Let’s take a look at the Davies-Bouldin score of the DBSCAN algorithm with Intel® Extension for Scikit-learn
[8]:
score_opt = davies_bouldin_score(x_train, y_pred)
f"Intel® extension for Scikit-learn Davies-Bouldin score: {score_opt}"
[8]:
'Intel® Extension for Scikit-learn Davies-Bouldin score: 0.8542652084275848'
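Note that DBSCAN labels noise points as -1 and davies_bouldin_score simply treats them as one more cluster. If you prefer to score only the points that were actually assigned to a cluster, a minimal sketch reusing x_train and y_pred from above (assuming more than one cluster remains after dropping noise):
from sklearn.metrics import davies_bouldin_score
# Keep only points DBSCAN assigned to a cluster (label -1 marks noise).
mask = y_pred != -1
score_clustered = davies_bouldin_score(x_train[mask], y_pred[mask])
print(f"Davies-Bouldin score without noise points: {score_clustered:.4f}")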
Train the same algorithm with original Scikit-learn
To cancel the optimizations, we use unpatch_sklearn and reimport the DBSCAN class.
[9]:
from sklearnex import unpatch_sklearn
unpatch_sklearn()
Training of the DBSCAN algorithm with the original Scikit-learn library for the Spoken Arabic Digit dataset
[10]:
from sklearn.cluster import DBSCAN
start = timer()
y_pred = DBSCAN(**params).fit_predict(x_train)
train_unpatched = timer() - start
f"Original Scikit-learn time: {train_unpatched:.2f} s"
[10]:
'Original Scikit-learn time: 469.21 s'
Let’s take a look at the Davies-Bouldin score of the DBSCAN algorithm with the original Scikit-learn
[11]:
score_original = davies_bouldin_score(x_train, y_pred)
f"Original Scikit-learn Davies-Bouldin score: {score_opt}"
[11]:
'Original Scikit-learn Davies-Bouldin score: 0.8542652084275848'
[12]:
HTML(
f"<h3>Compare Davies-Bouldin score of patched Scikit-learn and original</h3>"
f"Davies-Bouldin score of patched Scikit-learn: {score_opt} <br>"
f"Davies-Bouldin score of unpatched Scikit-learn: {score_original} <br>"
f"Metrics ratio: {score_opt/score_original} <br>"
f"<h3>With Scikit-learn-intelex patching you can:</h3>"
f"<ul>"
f"<li>Use your Scikit-learn code for training and prediction with minimal changes (a couple of lines of code);</li>"
f"<li>Fast execution training and prediction of Scikit-learn models;</li>"
f"<li>Get the similar quality</li>"
f"<li>Get speedup in <strong>{(train_unpatched/train_patched):.1f}</strong> times.</li>"
f"</ul>"
)
[12]:
Compare Davies-Bouldin score of patched Scikit-learn and original
Davies-Bouldin score of patched Scikit-learn: 0.8542652084275848
Davies-Bouldin score of unpatched Scikit-learn: 0.8542652084275848
Metrics ratio: 1.0
With Scikit-learn-intelex patching you can:
- Use your Scikit-learn code for training and prediction with minimal changes (a couple of lines of code);
- Get faster training and prediction of Scikit-learn models;
- Get similar model quality;
- Get a 73.6x speedup.