I'm trying to use an SVM classifier to separate my data into a fixed number of clusters. However, the classifier produces a few clusters with no apparent relation between the points inside them. Do I need a different kernel, or is the data simply spread in a way that SVM cannot do better?
The original problem comes from a Wireless Sensor Network (WSN), where a number of sensors are spread in space and communicate with a base station. My approach is to use an SVM (sklearn.svm.SVC), currently with a linear kernel, to cluster the data into k clusters.
I have N=300 sensors to be clustered into K=5 clusters. PS: I'm aware of other algorithms (K-means, c-means, fuzzy c-means, ...) that can be used for this problem and would probably perform better.
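For reference, the K-means route mentioned above can be sketched like this (a minimal example on randomly generated positions standing in for the real sensor coordinates, which are not reproduced here):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-in for the real sensor positions: 300 random (x, y) points
X = rng.uniform(0, 250, size=(300, 2))

# K-means assigns each point to the nearest of 5 centroids,
# so labels reflect spatial proximity rather than list order
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)      # one cluster index (0..4) per sensor
centers = kmeans.cluster_centers_   # one (x, y) centroid per cluster
```

Unlike an SVM, K-means needs no labels at all, which is why it fits this clustering setup directly.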
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap
from sklearn import svm

cmap_bold = ListedColormap(
    ['#FF0000', '#00FF00', '#0000FF', '#3F3FBF', '#0C0404'])

nb_clusters = 5

# Sensor positions: one (x, y) pair per node
X = [[node.pos_x, node.pos_y] for node in network[0:-1]]

# Assign labels in blocks: the first chunk of nodes gets class 0,
# the next chunk class 1, and so on
y = []
points_per_cluster = network.count_alive_nodes() // nb_clusters
for i in range(nb_clusters):
    for _ in range(points_per_cluster):
        y.append(i)

X = np.array(X)
y = np.array(y)

C = 1.0
svc = svm.SVC(kernel='linear', C=C)
svc.fit(X, y)
plot_predictions(svc, X, y)
plt.show()
The data for the nodes is their network-assigned position, and y is simply the class labels (0..4). A sample of the positions:
[ 45.28952891 48.71941502]
[ 21.5114652 185.38108775]
[187.37250476 85.51585448]
[123.80470776 62.4834906 ]
[115.24942266 239.92792797]...
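To make the label assignment above concrete: with N=300 and K=5, each block of 60 consecutive list entries gets one class, regardless of where those sensors actually sit. A minimal sketch using the counts from the question (300 here stands in for network.count_alive_nodes()):

```python
nb_clusters = 5
n_alive = 300  # stands in for network.count_alive_nodes()
points_per_cluster = n_alive // nb_clusters  # 60 nodes per class

y = []
for i in range(nb_clusters):
    y.extend([i] * points_per_cluster)

# The first 60 sensors get class 0, the next 60 get class 1, and so on;
# the label depends only on list order, not on spatial proximity.
```

So two sensors that are physically adjacent can easily land in different classes just because of their index in the list.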
def plot_predictions(estimator, X, y):
    estimator.fit(X, y)
    # Build a grid over the data range and color it by the predicted class
    x_min, x_max = X[:, 0].min() - .1, X[:, 0].max() + .1
    y_min, y_max = X[:, 1].min() - .1, X[:, 1].max() + .1
    xx, yy = np.meshgrid(np.linspace(x_min, x_max, 100),
                         np.linspace(y_min, y_max, 100))
    Z = estimator.predict(np.c_[xx.ravel(), yy.ravel()])
    Z = Z.reshape(xx.shape)
    plt.figure()
    plt.pcolormesh(xx, yy, Z, cmap=plt.cm.Paired, shading="auto", alpha=0.6)
    # Overlay the actual points, colored by their assigned label
    plt.scatter(X[:, 0], X[:, 1], c=y.astype(float), cmap=cmap_bold)
    plt.axis('tight')
    plt.tight_layout()
When running this I get the plot below, and it does not look right.

Any help is appreciated, thank you.