The 2D "convolution" in a CNN is not a mathematical convolution; strictly speaking, it is cross-correlation.

Before we get into some theory, it is important to note that in CNNs, although we call it a convolution, it is actually cross-correlation. It is a technicality: in a CNN we do not flip the filter, as is required in a true convolution. However, except for this flip, both operations are identical.

machine learning - Convolution and Cross Correlation in CNN - Data Science Stack Exchange

Actually most practical applications of convolutional neural networks (CNN) use cross-correlation instead of convolutions.

Why do CNNs use convolution instead of cross-correlation? - Quora
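The flip the quotes describe is easy to see in code. A minimal NumPy sketch (function names are my own; "valid" padding assumed): a CNN-style sliding dot product without flipping the kernel, and a true convolution obtained by flipping the kernel on both axes first.

```python
import numpy as np

def cross_correlate2d(image, kernel):
    """What CNNs actually compute: slide the kernel over the image
    and take dot products, WITHOUT flipping the kernel ("valid" padding)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def convolve2d(image, kernel):
    """True mathematical convolution: flip the kernel along both axes,
    then cross-correlate."""
    return cross_correlate2d(image, kernel[::-1, ::-1])

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[1., 0.], [0., -1.]])  # asymmetric, so the flip matters

print(cross_correlate2d(image, kernel))
print(convolve2d(image, kernel))
```

With a symmetric kernel the two outputs coincide, which is why the distinction is harmless in practice: the network simply learns the (possibly flipped) filter weights either way.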

Polynomial multiplication is in fact a 1D discrete convolution.

The linear convolution of two 1D sequences can be viewed as polynomial multiplication: treat each sequence as the coefficients of a polynomial in descending powers, multiply the two polynomials, and the coefficient sequence of the product (again in descending powers) is exactly the convolution result. This is also why 1D linear convolution can be computed by the "carry-free multiplication" trick.
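The correspondence above can be checked in a few lines of NumPy. A small sketch with concrete polynomials of my choosing: `np.convolve` on the coefficient sequences gives the same result as `np.polymul`, and evaluating the product at x = 10 recovers the carry-free multiplication trick (the convolution is digit-by-digit multiplication without carrying; propagating carries afterwards gives the ordinary product).

```python
import numpy as np

# Coefficients in descending powers:
#   p(x) = 2x^2 + 3x + 1,  q(x) = x + 4
p = np.array([2, 3, 1])
q = np.array([1, 4])

# Linear convolution of the coefficient sequences IS polynomial multiplication:
# the result [2, 11, 13, 4] encodes 2x^3 + 11x^2 + 13x + 4.
prod = np.convolve(p, q)
print(prod)
print(np.polymul(p, q))  # identical result

# Carry-free multiplication: 231 * 14 is the same convolution evaluated
# at x = 10, i.e. the carries are propagated only at the end.
print(np.polyval(prod, 10))  # 3234 == 231 * 14
```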

References

How to explain convolution intuitively? - Zhihu
calculus - Convolution and multiplication of polynomials is the same? - Mathematics Stack Exchange