As quantum mechanics shows (the badly named 'waves' are really a complex-valued probability amplitude), nature loves complex numbers - they are algebraically closed, for one - so why not build ANNs with them? Signal processing engineers might enthuse:
January 28, 2021:
Artificial neural networks (ANNs) based machine learning models and especially deep learning models have been widely applied in computer vision, signal processing, wireless communications, and many other domains, where complex numbers occur either naturally or by design. However, most of the current implementations of ANNs and machine learning frameworks use real numbers rather than complex numbers. There is growing interest in building ANNs using complex numbers, and in exploring the potential advantages of the so-called complex-valued neural networks (CVNNs) over their real-valued counterparts. In this paper, we discuss the recent development of CVNNs by performing a survey of the works on CVNNs in the literature. Specifically, a detailed review of various CVNNs in terms of activation function, learning and optimization, input and output representations, and their applications in tasks such as signal processing and computer vision is provided, followed by a discussion on some pertinent challenges and future research directions.
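To make the idea concrete, here is a minimal sketch (my own, not from the survey) of a single complex-valued dense layer in NumPy. It uses a "split" activation - ReLU applied to the real and imaginary parts separately - which is one of the standard activation choices discussed in the CVNN literature; the shapes and initialization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def split_relu(z):
    # "Split" activation: ReLU on real and imaginary parts independently.
    # Other CVNN choices exist (e.g. modReLU on the magnitude).
    return np.maximum(z.real, 0) + 1j * np.maximum(z.imag, 0)

# One complex-valued dense layer: weights, bias, inputs, outputs all complex.
n_out, n_in = 3, 4
W = (rng.standard_normal((n_out, n_in))
     + 1j * rng.standard_normal((n_out, n_in))) / np.sqrt(n_in)
b = np.zeros(n_out, dtype=np.complex128)

x = rng.standard_normal(n_in) + 1j * rng.standard_normal(n_in)
y = split_relu(W @ x + b)

print(y.shape, y.dtype)  # (3,) complex128
```

Note that `W @ x` here is a genuinely complex multiply-accumulate; a real-valued network fed with stacked real/imaginary channels would have twice the free parameters but would not enforce the complex multiplication structure.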
If you look further, you will find people doing ANNs with quaternions (which actually express rotations in 3D space, which is handy), octonions, even sedenions, and, unbelievably, trigintaduonions (I didn't even know you could go that far).
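The rotation trick that makes quaternions attractive can be sketched in a few lines - this is just the textbook Hamilton product and the q v q* rotation formula, not any particular quaternion network:

```python
import numpy as np

def qmul(p, q):
    # Hamilton product of quaternions stored as (w, x, y, z) arrays.
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rotate(v, axis, angle):
    # Rotate 3D vector v about a unit axis by `angle`, via q v q*.
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return qmul(qmul(q, np.concatenate([[0.0], v])), q_conj)[1:]

# Rotating the x-axis by 90 degrees about z gives the y-axis.
print(np.round(rotate(np.array([1.0, 0.0, 0.0]), [0, 0, 1], np.pi / 2), 6))
# -> [0. 1. 0.]
```

Quaternion networks exploit exactly this: one Hamilton product ties four channels together, which is why they are popular for 3D pose and color-image tasks. Past the octonions, though, the algebras stop being associative (and worse), so what the higher-dimensional variants buy you is much less obvious.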