TY - GEN
T1 - LORENTZ TRANSFORMATION NEURAL NETWORK
AU - Li, Wenyuan
AU - Wang, Jingchao
AU - Huang, Guoheng
AU - Lin, Tongxu
AU - Zhong, Guo
AU - Yuan, Xiaochen
AU - Pun, Chi Man
AU - Wang, Zhibo
AU - Zeng, An
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - We propose a novel neural network architecture, the Lorentz Transformation Neural Network (LTNN), which utilizes Lorentz transformations to generate a complex computation matrix that enhances the network’s expressive power. Furthermore, LTNN is lightweight due to the shared weight matrices in the computation matrix. LTNN treats the input and output as coordinates in high-dimensional spacetime, with the weight matrices in each layer representing the velocity components of a spacetime reference frame. During training, these weight matrices are transformed into a computation matrix via Lorentz transformations, describing the coordinate transformations between different reference frames. We evaluate LTNN on four datasets: California Housing Prices, Iris, MNIST, and Fashion-MNIST. Experimental results demonstrate that LTNN outperforms conventional neural networks and quaternion neural networks in terms of both accuracy and parameter efficiency.
AB - We propose a novel neural network architecture, the Lorentz Transformation Neural Network (LTNN), which utilizes Lorentz transformations to generate a complex computation matrix that enhances the network’s expressive power. Furthermore, LTNN is lightweight due to the shared weight matrices in the computation matrix. LTNN treats the input and output as coordinates in high-dimensional spacetime, with the weight matrices in each layer representing the velocity components of a spacetime reference frame. During training, these weight matrices are transformed into a computation matrix via Lorentz transformations, describing the coordinate transformations between different reference frames. We evaluate LTNN on four datasets: California Housing Prices, Iris, MNIST, and Fashion-MNIST. Experimental results demonstrate that LTNN outperforms conventional neural networks and quaternion neural networks in terms of both accuracy and parameter efficiency.
KW - Lorentz transformation
KW - high-dimensional spacetime
KW - neural network
KW - reference frame
UR - https://www.scopus.com/pages/publications/105028565388
U2 - 10.1109/ICIP55913.2025.11084728
DO - 10.1109/ICIP55913.2025.11084728
M3 - Conference contribution
AN - SCOPUS:105028565388
T3 - Proceedings - International Conference on Image Processing, ICIP
SP - 2558
EP - 2563
BT - 2025 IEEE International Conference on Image Processing, ICIP 2025 - Proceedings
PB - IEEE Computer Society
T2 - 32nd IEEE International Conference on Image Processing, ICIP 2025
Y2 - 14 September 2025 through 17 September 2025
ER -