[Abstract]
The Support Vector Classifier (SVC) is a well-known and effective method for classification. One of its key strengths is the kernel trick, which implicitly maps complex, intertwined data into a higher-dimensional space using a kernel function—allowing for linear separation in that space. In this article, we’ll visually explore and enjoy the differences between classical and quantum kernels!
🟢 Input Dataset: Gaussian Parity
Classical SVC is highly powerful and often outperforms current quantum methods. However, quantum techniques offer novel capabilities that classical approaches cannot match, which is why research in this area is gaining momentum. In this article, we use a dataset known as Gaussian Parity, which is considered well-suited for quantum methods.
As shown in Fig. 1, this dataset consists of 80 samples (56 for training and 24 for testing) and two class labels. The data points from each class are interleaved in a diagonally crossed pattern, making linear separation difficult without transformation. This is where the kernel function—mentioned in the abstract—comes into play.
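The article does not include the generation code, but a Gaussian-parity-style dataset can be built in a few lines: draw the features from a Gaussian and assign the label from the parity (XOR) of the coordinate signs, which produces the diagonally crossed classes seen in Fig. 1. The construction below is a minimal sketch under that assumption; the sample counts match the article, but the random seeds and scale are arbitrary.

```python
import numpy as np
from sklearn.model_selection import train_test_split

def make_gaussian_parity(n_samples=80, seed=42):
    """Hypothetical Gaussian-parity construction: 2D Gaussian features,
    label = XOR of the signs of the two coordinates."""
    rng = np.random.default_rng(seed)
    X = rng.normal(loc=0.0, scale=1.0, size=(n_samples, 2))
    y = ((X[:, 0] > 0) ^ (X[:, 1] > 0)).astype(int)  # diagonally crossed classes
    return X, y

X, y = make_gaussian_parity()
# 56 training / 24 test samples, as in Fig. 1
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=56, test_size=24, random_state=0, stratify=y
)
```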
🟢 SVC Results with Classical and Quantum Kernels
Let’s start with the conclusion. Figure 2 shows classification results using SVC trained with (a) a classical kernel (RBF) and (b) a quantum kernel (a kernel matrix based on the ZZFeatureMap). On the test set, classification accuracy was 0.58 for (a) and 0.93 for (b).
Of course, these results can vary depending on parameter settings. However, in this instance, the quantum kernel significantly outperformed the classical one. Also, the direction of the decision boundary differs considerably between (a) and (b).
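A minimal sketch of how the two models in Fig. 2 could be set up, assuming `ZZFeatureMap` from `qiskit.circuit.library` and `FidelityQuantumKernel` from `qiskit_machine_learning`; the exact hyperparameters (`reps`, `gamma`, etc.) are assumptions, not the article's settings, so the accuracies will not necessarily reproduce 0.58 and 0.93.

```python
from sklearn.svm import SVC
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel

# (a) Classical kernel: SVC with the built-in RBF kernel
clf_rbf = SVC(kernel="rbf", gamma="scale")
clf_rbf.fit(X_train, y_train)
print("RBF accuracy:", clf_rbf.score(X_test, y_test))

# (b) Quantum kernel: fidelity kernel built on a ZZFeatureMap
feature_map = ZZFeatureMap(feature_dimension=2, reps=2)  # reps is an assumption
qkernel = FidelityQuantumKernel(feature_map=feature_map)

# Precompute the kernel (Gram) matrices and pass them to SVC
K_train = qkernel.evaluate(x_vec=X_train)
K_test = qkernel.evaluate(x_vec=X_test, y_vec=X_train)

clf_q = SVC(kernel="precomputed")
clf_q.fit(K_train, y_train)
print("Quantum-kernel accuracy:", clf_q.score(K_test, y_test))
```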
🟢 Exploring the Classical Kernel (RBF)
Let’s take a closer look at the classical case. The RBF kernel is not something we call directly; it is invoked internally during SVC training and implicitly maps the input data into a higher-dimensional space.
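For reference, the similarity that SVC evaluates internally for each pair of samples is just a Gaussian of their squared distance. A minimal sketch of that function (scikit-learn chooses `gamma` automatically when `gamma="scale"`; the value 1.0 here is only a placeholder):

```python
import numpy as np

def rbf_kernel(x1, x2, gamma=1.0):
    """K(x1, x2) = exp(-gamma * ||x1 - x2||^2): the pairwise similarity
    SVC evaluates internally when kernel='rbf'."""
    return np.exp(-gamma * np.sum((x1 - x2) ** 2))
```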
Based on the learned model, we can use Principal Component Analysis (PCA) to project this high-dimensional space into three dimensions and visualize the decision function in 3D.
The top two images in Fig. 3 show this 3D decision function. You can observe two peaks and two valleys. If we slice this 3D surface at the decision function value = 0, the resulting plane gives us the decision boundary.
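The figures in the article are built with a PCA projection of the learned feature space. As a simpler stand-in, the sketch below evaluates the trained RBF model's `decision_function` directly over a grid in the 2D input plane and plots it as a 3D surface; this also shows the peaks and valleys, and slicing at 0 recovers the decision boundary. It reuses `clf_rbf`, `X_train`, and `y_train` from the earlier snippet.

```python
import numpy as np
import matplotlib.pyplot as plt

# Evaluate the trained RBF model's decision function on a grid over input space
xx, yy = np.meshgrid(np.linspace(-3, 3, 100), np.linspace(-3, 3, 100))
grid = np.c_[xx.ravel(), yy.ravel()]
zz = clf_rbf.decision_function(grid).reshape(xx.shape)

fig = plt.figure(figsize=(10, 4))
ax3d = fig.add_subplot(1, 2, 1, projection="3d")
ax3d.plot_surface(xx, yy, zz, cmap="coolwarm", alpha=0.8)  # peaks and valleys
ax3d.set_title("Decision function (3D)")

ax2d = fig.add_subplot(1, 2, 2)
ax2d.contourf(xx, yy, zz, levels=20, cmap="coolwarm", alpha=0.6)
ax2d.contour(xx, yy, zz, levels=[0.0], colors="k")  # slice at 0 = decision boundary
ax2d.scatter(X_train[:, 0], X_train[:, 1], c=y_train, cmap="coolwarm", edgecolors="k")
ax2d.set_title("Slice at decision function = 0")
plt.tight_layout()
plt.show()
```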
🟢 Exploring the Quantum Kernel Matrix
Now let’s examine the quantum case. As shown in Fig. 4, the decision function exhibits a more complex pattern, with more peaks and valleys than in the classical case.
When we slice the surface at decision function = 0, we obtain a decision boundary that achieves a classification accuracy of 0.93.
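The same visualization works for the quantum model, with one extra step: because the SVC was trained on a precomputed kernel, the decision function on new grid points needs the quantum kernel evaluated between those points and the training samples. A sketch continuing the earlier snippets (a coarser grid is used because each kernel entry requires a quantum circuit evaluation):

```python
# Coarser grid: evaluating the quantum kernel on every point is costly
xxq, yyq = np.meshgrid(np.linspace(-3, 3, 30), np.linspace(-3, 3, 30))
grid_q = np.c_[xxq.ravel(), yyq.ravel()]

# Kernel between grid points and training samples, as SVC(kernel='precomputed') expects
K_grid = qkernel.evaluate(x_vec=grid_q, y_vec=X_train)
zz_q = clf_q.decision_function(K_grid).reshape(xxq.shape)

plt.contour(xxq, yyq, zz_q, levels=[0.0], colors="k")  # quantum decision boundary
plt.scatter(X_test[:, 0], X_test[:, 1], c=y_test, cmap="coolwarm", edgecolors="k")
plt.title("Quantum-kernel decision boundary (slice at 0)")
plt.show()
```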