Wednesday, June 11, 2025

Enjoy Observing Classical and Quantum Kernels in SVC

[Abstract]
The Support Vector Classifier (SVC) is a well-known and effective method for classification. One of its key strengths is the kernel trick, which implicitly maps complex, intertwined data into a higher-dimensional space using a kernel function—allowing for linear separation in that space. In this article, we’ll visually explore and enjoy the differences between classical and quantum kernels!

🟢 Input Dataset: Gaussian Parity
Classical SVC is highly powerful and often outperforms current quantum methods. However, quantum techniques offer novel capabilities that classical approaches lack, which is why research in this area is gaining momentum. In this article, we use a dataset known as Gaussian Parity, which is considered well suited to quantum methods.

     As shown in Fig. 1, this dataset consists of 80 samples (56 for training and 24 for testing) and two class labels. The data points from each class are interleaved in a diagonally crossed pattern, making linear separation difficult without transformation. This is where the kernel function—mentioned in the abstract—comes into play.
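The article does not give the exact recipe for the dataset, but a Gaussian Parity (XOR-like) dataset can be sketched as four Gaussian blobs on the corners of a square, with diagonally opposite blobs sharing a label. The blob centers, spread, and split sizes below are assumptions chosen to match the 80/56/24 counts in Fig. 1:

```python
import numpy as np

def make_gaussian_parity(n_samples=80, seed=0):
    """Sketch of a Gaussian Parity dataset: four Gaussian blobs whose
    diagonally opposite pairs share a class label, so the two classes
    interleave in a crossed (XOR) pattern."""
    rng = np.random.default_rng(seed)
    n = n_samples // 4
    centers = np.array([[1, 1], [-1, -1], [1, -1], [-1, 1]])
    labels = np.array([0, 0, 1, 1])  # parity (XOR) labeling of the corners
    X = np.vstack([c + 0.4 * rng.standard_normal((n, 2)) for c in centers])
    y = np.repeat(labels, n)
    perm = rng.permutation(len(y))
    return X[perm], y[perm]

X, y = make_gaussian_parity()
X_train, y_train = X[:56], y[:56]  # 56 training samples, as in Fig. 1
X_test, y_test = X[56:], y[56:]    # 24 test samples
```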

🟢 SVC Results with Classical and Quantum Kernels
Let’s begin with the results. Figure 2 shows classification results using SVC trained with (a) a classical kernel (RBF) and (b) a quantum kernel (a kernel matrix built from the ZZFeatureMap). On the test set, classification accuracy was 0.58 for (a) and 0.93 for (b).
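In scikit-learn terms, the two setups differ only in how the kernel reaches SVC: the RBF kernel is named directly, while a quantum kernel is supplied as a precomputed Gram matrix. The sketch below uses synthetic parity-style data, and an RBF Gram matrix stands in for the ZZFeatureMap-based one (computing the real quantum matrix requires a quantum SDK such as Qiskit):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(1)
X_train = rng.standard_normal((56, 2))
y_train = (X_train[:, 0] * X_train[:, 1] > 0).astype(int)  # parity-style labels
X_test = rng.standard_normal((24, 2))
y_test = (X_test[:, 0] * X_test[:, 1] > 0).astype(int)

# (a) classical kernel: SVC evaluates the RBF kernel internally
clf_rbf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
acc_rbf = clf_rbf.score(X_test, y_test)

# (b) a quantum kernel enters SVC as a precomputed Gram matrix;
# here an RBF Gram matrix is a classical stand-in for the quantum one
K_train = rbf_kernel(X_train, X_train)
K_test = rbf_kernel(X_test, X_train)  # rows: test points, cols: training points
clf_pre = SVC(kernel="precomputed").fit(K_train, y_train)
acc_pre = clf_pre.score(K_test, y_test)
```

The `kernel="precomputed"` path is what makes SVC kernel-agnostic: any positive-semidefinite similarity matrix, classical or quantum, can be plugged in unchanged.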

     Of course, these results can vary depending on parameter settings. In this instance, however, the quantum kernel significantly outperformed the classical one. Note also that the orientation of the decision boundary differs considerably between (a) and (b).

🟢 Exploring the Classical Kernel (RBF)
Let’s take a closer look at the classical case. The RBF kernel is never applied to the data explicitly; rather, SVC evaluates it internally during training, implicitly mapping the input data into a higher-dimensional space.
Based on the learned model, we can use Principal Component Analysis (PCA) to project this high-dimensional space down to three dimensions and visualize the decision function as a 3D surface.

     The top two images in Fig. 3 show this 3D decision function. You can observe two peaks and two valleys. If we slice this 3D surface at the decision function value = 0, the resulting cross-section gives us the decision boundary.
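The surface and its zero-level slice can be obtained by evaluating the trained model's decision function on a 2D grid; plotting the values as heights gives the 3D surface, and the points where the sign flips trace the decision boundary. A sketch on synthetic parity-style data (the grid range, resolution, and γ are assumptions, not the article's settings):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((56, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # parity-style labels
clf = SVC(kernel="rbf", gamma=1.0).fit(X, y)

# evaluate the decision function on a 2D grid; plotted as a height
# map, this is the 3D surface shown in Fig. 3
xx, yy = np.meshgrid(np.linspace(-2, 2, 100), np.linspace(-2, 2, 100))
Z = clf.decision_function(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

# the slice at value 0: cells where the sign flips between neighbors
# approximate the decision boundary
sign_flip = np.diff(np.sign(Z), axis=1) != 0
```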

🟢 Exploring the Quantum Kernel Matrix
Now let’s examine the quantum case. As shown in Fig. 4, the decision function exhibits a more complex pattern, with more peaks and valleys than in the classical case.
When we slice the surface at decision function = 0, we obtain a decision boundary that achieves a classification accuracy of 0.93.
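To make the quantum side concrete, a fidelity kernel K(x, y) = |⟨ψ(x)|ψ(y)⟩|² for a 2-qubit ZZFeatureMap-style circuit can be simulated directly with numpy, since every gate after the initial Hadamards is diagonal in the computational basis. This is a hand-rolled approximation of Qiskit's ZZFeatureMap (reps=1, with its default (π − x₀)(π − x₁) interaction term); in practice one would use Qiskit's feature map and kernel classes instead:

```python
import numpy as np

def zz_state(x):
    """Statevector of a 2-qubit ZZFeatureMap-style circuit (reps=1):
    Hadamards on both qubits, a phase 2*x_i on each qubit, and an
    entangling phase 2*(pi - x0)*(pi - x1) when the two bits differ."""
    x0, x1 = x
    amps = []
    for b1 in (0, 1):
        for b0 in (0, 1):
            phase = 2 * x0 * b0 + 2 * x1 * b1
            phase += 2 * (np.pi - x0) * (np.pi - x1) * (b0 ^ b1)
            amps.append(np.exp(1j * phase) / 2)  # H⊗H gives amplitude 1/2
    return np.array(amps)

def quantum_kernel(x, y):
    """Fidelity kernel K(x, y) = |<psi(x)|psi(y)>|^2."""
    return abs(np.vdot(zz_state(x), zz_state(y))) ** 2

X = np.array([[0.1, 0.7], [1.2, -0.4], [0.5, 0.5]])
K = np.array([[quantum_kernel(a, b) for b in X] for a in X])
```

A Gram matrix built this way is symmetric with ones on the diagonal, and can be passed straight to `SVC(kernel="precomputed")`.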

     This example suggests that the quantum method is exploring and learning within a feature space that classical methods cannot access. This may open the door to entirely new possibilities!


Friday, June 6, 2025

An Early-Summer Walking Path: On the Outskirts of Atsugi

 June has just begun and the rainy season has not yet arrived, but the days have been feeling like early summer. Please enjoy a few photos taken along my walking path on the outskirts of Atsugi, from my morning walk on June 6, 2025.

 A former colleague of mine commented: "Is the corn ripening already? By Hokkaido standards, that would come later, wouldn't it? Come to think of it, the street stalls selling tokibi in Odori Park have become really scarce." Indeed. In Hokkaido, corn is called "tokibi" rather than "tomorokoshi." Having lived near Tokyo for many years, I realize I have fallen in line with the local usage. What a nostalgic sound.