.. |Python35| image:: https://img.shields.io/badge/python-3.5-blue.svg
.. _Python35: https://badge.fury.io/py/skope-rules

.. image:: logo.png

skope-rules
===========

Skope-rules is a Python machine learning module built on top of
scikit-learn and distributed under the 3-Clause BSD license.

Skope-rules aims at learning logical, interpretable rules for "scoping" a target
class, i.e. detecting instances of this class with high precision.

Skope-rules offers a trade-off between the interpretability of a Decision Tree
and the modeling power of a Random Forest.

See the `AUTHORS.rst <AUTHORS.rst>`_ file for a list of contributors.
|
31 | 35 |
|
| 36 | +.. image:: schema.png |
| 37 | + |
| 38 | + |
| 39 | +Installation |
| 40 | +------------ |
| 41 | + |
| 42 | +You can get the latest sources with pip : |
| 43 | + |
| 44 | + pip install skope-rules |

Quick Start
-----------

SkopeRules can be used to describe classes with logical rules:

.. code:: python

    from sklearn.datasets import load_iris
    from skrules import SkopeRules

    dataset = load_iris()
    feature_names = ['sepal_length', 'sepal_width', 'petal_length', 'petal_width']
    clf = SkopeRules(max_depth_duplication=2,
                     n_estimators=30,
                     precision_min=0.3,
                     recall_min=0.1,
                     feature_names=feature_names)

    for idx, species in enumerate(dataset.target_names):
        X, y = dataset.data, dataset.target
        clf.fit(X, y == idx)
        rules = clf.rules_[0:3]
        print("Rules for iris", species)
        for rule in rules:
            print(rule)
        print()
        print(20 * '=')
        print()
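Each fitted rule is a plain boolean expression over the feature names, so it can be inspected and even re-applied to data directly. As an illustrative sketch (the rule string below is hypothetical, written in the style of SkopeRules output, not an actual fitted result), a rule can be evaluated on a DataFrame with ``pandas.query``:

```python
import pandas as pd
from sklearn.datasets import load_iris

# Hypothetical rule string, in the style of those produced by SkopeRules
rule = "petal_length > 2.45 and petal_width <= 1.75"

dataset = load_iris()
df = pd.DataFrame(dataset.data,
                  columns=['sepal_length', 'sepal_width',
                           'petal_length', 'petal_width'])
df['is_versicolor'] = (dataset.target == 1)

# A rule is an ordinary boolean expression, so pandas can evaluate it directly
covered = df.query(rule)
precision = covered['is_versicolor'].mean()           # fraction of versicolor among covered samples
recall = covered['is_versicolor'].sum() / df['is_versicolor'].sum()
print(len(covered), round(precision, 2), round(recall, 2))
```

This is exactly the precision/recall semantics SkopeRules optimizes: the rule "covers" a subset of samples, and its quality is the class purity and coverage of that subset.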

SkopeRules can also be used as a predictor with the ``score_top_rules`` method:

.. code:: python

    from sklearn.datasets import load_boston
    from sklearn.metrics import precision_recall_curve
    from matplotlib import pyplot as plt
    from skrules import SkopeRules

    dataset = load_boston()
    clf = SkopeRules(max_depth_duplication=None,
                     n_estimators=30,
                     precision_min=0.2,
                     recall_min=0.01,
                     feature_names=dataset.feature_names)

    X, y = dataset.data, dataset.target > 25
    X_train, y_train = X[:len(y)//2], y[:len(y)//2]
    X_test, y_test = X[len(y)//2:], y[len(y)//2:]
    clf.fit(X_train, y_train)
    y_score = clf.score_top_rules(X_test)  # get a risk score for each test example
    precision, recall, _ = precision_recall_curve(y_test, y_score)
    plt.plot(recall, precision)
    plt.xlabel('Recall')
    plt.ylabel('Precision')
    plt.title('Precision Recall curve')
    plt.show()
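A precision-recall curve summarizes all possible operating points; to make binary predictions, pick a single threshold on the score. A minimal sketch of that thresholding step, using made-up toy scores in place of ``clf.score_top_rules(X_test)``:

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score

# Toy labels and scores standing in for y_test and clf.score_top_rules(X_test)
y_test = np.array([0, 0, 1, 1, 1, 0, 1, 0])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.7, 0.2, 0.6, 0.5])

threshold = 0.5  # keep only the samples the rules flag most confidently
y_pred = (y_score >= threshold).astype(int)

precision = precision_score(y_test, y_pred)
recall = recall_score(y_test, y_pred)
print(precision, recall)  # both 0.75 for these toy values
```

Raising the threshold trades recall for precision, which is the natural regime for skope-rules: flag few instances, but flag them accurately.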


For more examples and use cases please check our `documentation <http://skope-rules.readthedocs.io/en/latest/>`_.
You can also check the `demonstration notebooks <notebooks/>`_.

Links with existing literature
------------------------------


For running the examples Matplotlib >= 1.1.1 is required.

Documentation
-------------