IEEE Signal Processing Magazine

Archived papers: 381
New Society Editors-in-Chief Named for 2025 [Society News]
Keywords: Signal Processing
Abstract: Provides society information that may include news, reviews, or technical notes that should be of interest to practitioners and researchers.
New Society Officer Elected [Society News]
Abstract: Provides society information that may include news, reviews, or technical notes that should be of interest to practitioners and researchers.
Deep Hypercomplex Networks for Spatiotemporal Data Processing: Parameter efficiency and superior performance [Hypercomplex Signal and Image Processing]
Alabi Bojesomo, Panos Liatsis, Hasan Al Marzouqi
Keywords: Training data, Convolutional neural networks, Algebra, Quaternions, Image processing, Data processing, Spatiotemporal phenomena, Hypercomplex, Neural networks, Backpropagation algorithms, Batch normalization, Spatiotemporal data, Spatiotemporal process, Contralateral, Activation function, Deep learning, Real numbers, Network performance, Processing advantage, Number of network parameters, Traffic forecasting, Complex network, Deep network, Deep neural network, Convolutional layers, Group convolution, Time stamp, Graph neural networks, Input components, Chain rule, Success of deep learning, Recurrent neural network, Input channels, Depthwise convolution
Abstract: Hypercomplex numbers, such as quaternions and octonions, have recently gained attention because of their advantageous properties over real numbers, e.g., in the development of parameter-efficient neural networks. For instance, the 16-component sedenion has the capacity to reduce the number of network parameters by a factor of 16. Moreover, hypercomplex neural networks offer advantages in the processing of spatiotemporal data as they are able to represent variable temporal data divisions through the hypercomplex components. Similarly, they support multimodal learning, with each component representing an individual modality. In this article, the key components of deep learning in the hypercomplex domain are introduced, encompassing concatenation, activation functions, convolution, and batch normalization. The use of the backpropagation algorithm for training hypercomplex networks is discussed in the context of hypercomplex algebra. These concepts are brought together in the design of a ResNet backbone using hypercomplex convolution, which is integrated within a U-Net configuration and applied in weather and traffic forecasting problems. The results demonstrate the superior performance of hypercomplex networks compared to their real-valued counterparts, given a fixed parameter budget, highlighting their potential in spatiotemporal data processing.
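The parameter saving mentioned in this abstract comes from weight sharing in the hypercomplex product: a quaternion dense layer stores one quaternion (4 real numbers) per input/output pair, whereas an unconstrained real layer over the same flattened features needs 16, giving a 4x reduction (16x for sedenions). A minimal NumPy sketch of this counting argument; the function names are illustrative, not from the article:

```python
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quaternion_linear(W, X):
    """Quaternion 'dense' layer: each output quaternion is a sum of
    Hamilton products of quaternion weights with quaternion inputs.
    W: (n_out, n_in, 4), X: (n_in, 4) -> (n_out, 4)."""
    n_out, n_in, _ = W.shape
    Y = np.zeros((n_out, 4))
    for o in range(n_out):
        for i in range(n_in):
            Y[o] += hamilton_product(W[o, i], X[i])
    return Y

n_in, n_out = 8, 16
# Quaternion layer: one 4-parameter quaternion per (out, in) pair.
quat_params = n_out * n_in * 4
# Unconstrained real layer over the same 4*n_in real features.
real_params = (4 * n_out) * (4 * n_in)
assert real_params == 4 * quat_params  # the 4x reduction for quaternions
```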
Quaternion Neural Networks: A physics-incorporated intelligence framework [Hypercomplex Signal and Image Processing]
Akira Hirose, Fang Shang, Yuta Otsuka, Ryo Natsuaki, Yuya Matsumoto, Naoto Usami, Yicheng Song, Haotian Chen
Keywords: Radar remote sensing, Quaternions, Image processing, Airborne radar, Artificial neural networks, Mobile communication, Artificial intelligence, Neural networks, Physics, Hypercomplex, Quaternion neural networks, Electromagnetic wave, Earth observation, Neural network classifier, Powerful framework, Polarization information, Construction of society, Activation function, Convolutional neural network, Input vector, 3D space, Phase information, Neural network structure, Scattering matrix, Self-organizing map, Time series prediction, Synaptic weights, Degree of polarization, Stokes parameters, Ground penetrating radar, Bare soil areas, 3D vector, Dynamic neural network, Reservoir computing, Absolute phase, Conventional convolutional neural networks, RGB channels, High ability
Abstract: Why quaternions in neural networks (NNs)? Are there quaternions in the human brain? “No” may be an ordinary answer. However, quaternion NNs (QNNs) are a powerful framework that strongly connects artificial intelligence (AI) and the real world. In this article, we deal with NNs based on quaternions and describe their basics and features. We also detail the underlying ideas in their engineering applications, especially when we adaptively process the polarization information of electromagnetic waves. We focus on their role in remote sensing, such as Earth observation radar mounted on artificial satellites or aircraft and underground radar, as well as mobile communication. There, QNNs are a class of NNs that know physics, especially polarization, composing a framework by fusing measurement physics with adaptive-processing mathematics. This fusion realizes a seamless integration of measurement and intelligence, contributing to the construction of a human society having harmony between AI and real human lives.
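The polarization processing described above builds on the classic fact that a unit quaternion rotates a pure (3D) quaternion via v' = q v q*, which is how QNNs encode 3D geometry and polarization states natively. A small NumPy sketch of that rotation, with hypothetical names:

```python
import numpy as np

def hamilton(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def rotate(v, axis, angle):
    """Rotate 3D vector v about a unit axis by angle, via q v q*."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    q_conj = q * np.array([1, -1, -1, -1])
    v_pure = np.concatenate([[0.0], v])        # embed v as a pure quaternion
    return hamilton(hamilton(q, v_pure), q_conj)[1:]

# Rotating the x unit vector 90 degrees about z yields the y unit vector.
print(rotate(np.array([1.0, 0, 0]), [0, 0, 1], np.pi / 2))  # ≈ [0, 1, 0]
```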
Augmented Statistics of Quaternion Random Variables: A lynchpin of quaternion learning machines [Hypercomplex Signal and Image Processing]
Clive Cheong Took, Sayed Pouria Talebi, Rosa Maria Fernandez Alcala, Danilo P. Mandic
Keywords: Machine learning algorithms, Three-dimensional displays, Quaternions, Image processing, Signal processing algorithms, Tutorials, Machine learning, Hypercomplex, Signal processing, Role in the treatment, Statistical machine learning, Development of novel methods, Second-order statistics, Development of statistical methods, Correlation matrix, Identity matrix, Imaginary part, Singular value, Singular value decomposition, Unmanned aerial vehicles, Rotation axis, Time index, 2D model, Unit sphere, Euler angles, Imaginary components, Autocorrelation matrix, 3D rotation, 3D orientation, S-module, Symmetric part, 3D sphere, Outer product, Gyroscope, Complex numbers, Imaginary unit
Abstract: Learning machines for vector sensor data are naturally developed in the quaternion domain and are underpinned by quaternion statistics. To this end, we revisit the “augmented” representation basis for discrete quaternion random variables (RVs) ${\bf{q}}^{a}[n]$, i.e., $[{\bf{q}}[n]\;{\bf{q}}^{\imath}[n]\;{\bf{q}}^{\jmath}[n]\;{\bf{q}}^{\kappa}[n]]$, and demonstrate its pivotal role in the treatment of the generality of quaternion RVs. This is achieved by a rigorous consideration of the augmented quaternion RV and by involving the additional second-order statistics, besides the traditional covariance $E\{{\bf{q}}[n]{\bf{q}}^{*}[n]\}$ [1]. To illuminate the usefulness of quaternions, we consider their most well-known application—3D orientation—and offer an account of augmented statistics for purely imaginary (pure) quaternions. The quaternion statistics presented here can be exploited in the analysis of existing and the development of novel statistical machine learning methods, hence acting as a lynchpin for quaternion learning machines.
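The augmented statistics above can be sampled numerically: for a proper quaternion RV (i.i.d. components) the standard covariance E{q q*} is real and positive, while the three complementary covariances E{q (qⁱ)*}, E{q (qʲ)*}, E{q (qᵏ)*} vanish, so it is precisely improperness that the augmented basis captures. A hedged NumPy sketch, with illustrative names, using the componentwise sign patterns of the quaternion involutions:

```python
import numpy as np

def hamilton(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def involution(Q, axis):
    """q^alpha keeps w and the chosen imaginary part, negates the other two."""
    signs = {'i': [1, 1, -1, -1], 'j': [1, -1, 1, -1], 'k': [1, -1, -1, 1]}
    return Q * np.array(signs[axis], float)

def conjugate(Q):
    return Q * np.array([1., -1., -1., -1.])

def mean_product(A, B):
    """Sample mean of the quaternion products a_n * b_n."""
    return np.mean([hamilton(a, b) for a, b in zip(A, B)], axis=0)

rng = np.random.default_rng(0)
Q = rng.normal(size=(4000, 4))   # proper quaternion RV: i.i.d. unit-variance parts

C   = mean_product(Q, conjugate(Q))                     # E{q q*}: ~4 (scalar)
C_i = mean_product(Q, conjugate(involution(Q, 'i')))    # complementary: ~0
C_j = mean_product(Q, conjugate(involution(Q, 'j')))
C_k = mean_product(Q, conjugate(involution(Q, 'k')))
```

An improper RV (e.g., one with correlated components) would leave C unchanged in form but push the complementary covariances away from zero, which is why all four terms are needed for generality.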
Demystifying the Hypercomplex: Inductive biases in hypercomplex deep learning [Hypercomplex Signal and Image Processing]
Danilo Comminiello, Eleonora Grassucci, Danilo P. Mandic, Aurelio Uncini
Keywords: Deep learning, Training data, Multidimensional signal processing, Three-dimensional displays, Algebra, Image processing, Hypercomplex, Signal processing, Inductive bias, Learning process, Traditional learning, Multidimensional process, Field of deep learning, Real vector space, Prominent field, 3D signal, Foundational framework, Neural network, High-dimensional, Transformer, Convolutional layers, Deep learning models, Input signal, Point cloud, Commutative, Representation learning, Image synthesis, Set of assumptions, Geometric transformation, Clifford algebra, Algebraic properties, Signal representation, FC layer, Emotional prosody, Imaginary unit, Pose estimation
Abstract: Hypercomplex algebras have recently been gaining prominence in the field of deep learning owing to the advantages of their division algebras over real vector spaces and their superior results when dealing with multidimensional signals in real-world 3D and 4D paradigms. This article provides a foundational framework that serves as a road map for understanding why hypercomplex deep learning methods are so successful and how their potential can be exploited. Such a theoretical framework is described in terms of inductive bias, i.e., a collection of assumptions, properties, and constraints that are built into training algorithms to guide their learning process toward more efficient and accurate solutions. We show that it is possible to derive specific inductive biases in the hypercomplex domains, which extend complex numbers to encompass diverse numbers and data structures. These biases prove effective in managing the distinctive properties of these domains as well as the complex structures of multidimensional and multimodal signals. This novel perspective for hypercomplex deep learning promises to both demystify this class of methods and clarify their potential, under a unifying framework, and in this way, promotes hypercomplex models as viable alternatives to traditional real-valued deep learning for multidimensional signal processing.
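One such inductive bias can be made tangible: because the Hamilton product is associative and distributive, a layer that left-multiplies its inputs by quaternion weights is automatically equivariant to right multiplication of all inputs (for example, a shared 3D rotation applied to every input), a constraint an unconstrained real-valued layer would have to learn from data. A minimal sketch under these assumptions, with illustrative names:

```python
import numpy as np

def hamilton(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_layer(W, X):
    """Quaternion layer: left-multiply each input by its weight and sum."""
    return np.sum([hamilton(w, x) for w, x in zip(W, X)], axis=0)

rng = np.random.default_rng(1)
W = rng.normal(size=(5, 4))   # 5 quaternion weights
X = rng.normal(size=(5, 4))   # 5 quaternion inputs
r = rng.normal(size=4)        # arbitrary quaternion (e.g., a shared rotation)

# Equivariance to right multiplication: f(x r) = f(x) r.
# Guaranteed by the algebra itself, never learned from data.
lhs = quat_layer(W, np.array([hamilton(x, r) for x in X]))
rhs = hamilton(quat_layer(W, X), r)
assert np.allclose(lhs, rhs)
```

The identity holds by distributivity and associativity: Σᵢ Wᵢ(xᵢ r) = (Σᵢ Wᵢ xᵢ) r, which is the kind of built-in constraint the article frames as an inductive bias.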
Understanding Vector-Valued Neural Networks and Their Relationship With Real and Hypercomplex-Valued Neural Networks: Incorporating intercorrelation between features into neural networks [Hypercomplex Signal and Image Processing]
Marcos Eduardo Valle
Keywords: Training data, Deep learning, Image processing, Neural networks, Parallel processing, Vectors, Hypercomplex, Multidimensional signal processing, Signal processing, Real numbers, Deep learning models, Additional properties, Feature channels, Fewer parameters, Traditional network, Traditional neural network, Robust training, Algebraic properties, Multidimensional array, Current library, Array of numbers, Activation function, Building blocks, Convolutional layers, Dense layer, Split function, Real-valued matrix, Image processing tasks, Real-valued function, Projection matrix, Bilinear form, Synaptic weights, Kronecker product, Parametrized
Abstract: Despite the many successful applications of deep learning models for multidimensional signal and image processing, most traditional neural networks process data represented by (multidimensional) arrays of real numbers. The intercorrelation between feature channels is usually expected to be learned from the training data, requiring numerous parameters and careful training. In contrast, vector-valued neural networks (referred to as V-nets) are conceived to process arrays of vectors and naturally consider the intercorrelation between feature channels. Consequently, they usually have fewer parameters and often undergo more robust training than traditional neural networks. This article aims to present a broad framework for V-nets. In this context, hypercomplex-valued neural networks are regarded as vector-valued models with additional algebraic properties. Furthermore, this article explains the relationship between vector-valued and traditional neural networks. To be precise, a V-net can be obtained by placing restrictions on a real-valued model to consider the intercorrelation between feature channels. Finally, I show how V-nets, including hypercomplex-valued neural networks, can be implemented in current deep learning libraries as real-valued networks.
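The abstract's last point, implementing a V-net inside a real-valued framework, amounts to constraining the real weight matrix: left multiplication by a quaternion is a 4×4 real block that reuses 4 parameters across 16 slots. A hedged NumPy sketch of that correspondence (names illustrative, not from the article):

```python
import numpy as np

def hamilton(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def left_matrix(q):
    """Real 4x4 matrix of left Hamilton multiplication by q. A quaternion
    dense layer is thus a real dense layer whose weight blocks are
    restricted to this structure: 4 free parameters in 16 entries."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

q = np.array([0.5, -1.2, 0.3, 2.0])
p = np.array([1.0, 0.7, -0.4, 0.2])

# The constrained real matrix reproduces the quaternion product exactly,
# so the V-net runs unchanged in any real-valued deep learning library.
assert np.allclose(left_matrix(q) @ p, hamilton(q, p))
```

Larger layers repeat this block per input/output channel pair (a Kronecker-style construction), which is how the restriction on the real-valued model encodes the intercorrelation between the four feature channels.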
Hypercomplex Signal Processing in Digital Twin of the Ocean: Theory and application [Hypercomplex Signal and Image Processing]
Zhaoyuan Yu, Dongshuang Li, Pei Du, Wen Luo, Kit Ian Kou, Uzair Aslam Bhatti, Werner Benger, Guonian Lv, Linwang Yuan
Keywords: Analytical models, Computational modeling, Data visualization, Geoscience, Signal processing, Predictive models, Hypercomplex, Digital twins, Oceanography, Artificial intelligence, High performance computing, Intelligence, Machine learning, Pattern recognition, Coordinate system, Data integration, Forecasting, Heterogeneous data, Visual feedback, Multidimensional data, Earth science, Information fusion, Feedback interactions, Ocean system, Ocean observations, Scene model, Neural network, Data model, Ocean data, Vector field, Clifford algebra, Multivector, Multi-source data, Unique data, Field of signal processing, Unified representation, Various types of data, Recurrent neural network
Abstract: The digital twin of the ocean (DTO) is a groundbreaking concept that uses interactive simulations to improve decision-making and promote sustainability in earth science. The DTO effectively combines ocean observations, artificial intelligence (AI), advanced modeling, and high-performance computing to unite digital replicas, forecasting, and what-if scenario simulations of the ocean systems. However, there are several challenges to overcome in achieving the DTO’s objectives, including the integration of heterogeneous data with multiple coordinate systems, multidimensional data analysis, feature extraction, high-fidelity scene modeling, and interactive virtual–real feedback. Hypercomplex signal processing offers a promising solution to these challenges, and this study provides a comprehensive overview of its application in DTO development. We investigate a range of techniques, including geometric algebra, quaternion signal processing, Clifford signal processing, and hypercomplex machine learning, as the theoretical foundation for hypercomplex signal processing in the DTO. We also review the various application aspects of the DTO that can benefit from hypercomplex signal processing, such as data representation and information fusion, feature extraction and pattern recognition, intelligent process simulation and forecasting, and visualization and interactive virtual–real feedback. Our research demonstrates that hypercomplex signal processing provides innovative solutions for DTO advancement and for resolving scientific challenges in oceanography and broader earth science.