Algorithm Discovery
Generate novel matrix multiplication and linear algebra algorithms whose operation counts fall below those of the best known methods.

The Challenge
Matrix multiplication is the computational primitive underlying deep learning, scientific simulation, signal processing, and computer graphics. The complexity of matrix multiplication — whether general or structured — directly determines the computational cost of these workloads. Despite decades of research, the optimal complexity of matrix multiplication remains unknown, and the best known algorithms (Strassen and its descendants) were discovered through human ingenuity applied to a vast search space of possible decompositions. Recent work by DeepMind's AlphaTensor demonstrated that novel algorithms can be found through systematic search, but the design space remains largely unexplored for specialized matrix structures, hardware-aware decompositions, and domain-specific linear algebra operations.
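As an illustration of the kind of decomposition this search space contains, here is Strassen's classic 2x2 scheme (a known result, not MatterSpace output), which trades the eight multiplications of the naive algorithm for seven:

```python
import numpy as np

def strassen_2x2(A, B):
    """Multiply two 2x2 matrices with 7 scalar multiplications
    (Strassen, 1969) instead of the naive algorithm's 8."""
    a, b, c, d = A[0, 0], A[0, 1], A[1, 0], A[1, 1]
    e, f, g, h = B[0, 0], B[0, 1], B[1, 0], B[1, 1]
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return np.array([[m1 + m4 - m5 + m7, m3 + m5],
                     [m2 + m4, m1 - m2 + m3 + m6]])
```

Applied recursively to matrix blocks, the seven-product scheme yields the O(n^2.807) bound; the descendants mentioned above refine this exponent further.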
Current approaches to matrix algorithm design rely on mathematical insight to identify algebraic decompositions that reduce operation counts, followed by implementation optimization for specific hardware. This is fundamentally a generation problem — producing novel algebraic structures from a specification of correctness and complexity constraints — yet existing tools treat it as an analysis problem, verifying proposed algorithms rather than constructing new ones. Automated search methods exist but are limited to small matrix sizes and specific decomposition structures.
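The verification side of the problem can be sketched concretely: a bilinear matrix multiplication algorithm given by coefficient matrices U, V, W is correct exactly when it reconstructs the matrix multiplication tensor. The sketch below checks Strassen's known coefficients against the 2x2 tensor (this is an illustration of the verification step, not a MatterSpace API):

```python
import numpy as np

# Matrix multiplication tensor for 2x2 matrices (row-major flattening):
# T[i, j, k] = 1 iff vec(A)[i] * vec(B)[j] contributes to vec(C)[k].
T = np.zeros((4, 4, 4))
for r in range(2):
    for c in range(2):
        for m in range(2):
            T[2 * r + m, 2 * m + c, 2 * r + c] = 1

# Strassen's algorithm as a rank-7 bilinear decomposition: product p
# computes (U[p] . vec(A)) * (V[p] . vec(B)), and output entry k is
# sum over p of W[p, k] * product_p.
U = np.array([[ 1, 0, 0, 1], [0, 0, 1,  1], [1, 0, 0,  0], [ 0, 0, 0, 1],
              [ 1, 1, 0, 0], [-1, 0, 1, 0], [0, 1, 0, -1]])
V = np.array([[ 1, 0, 0, 1], [1, 0, 0,  0], [0, 1, 0, -1], [-1, 0, 1, 0],
              [ 0, 0, 0, 1], [1, 1, 0,  0], [0, 0, 1,  1]])
W = np.array([[ 1, 0, 0, 1], [0, 0, 1, -1], [0, 1, 0,  1], [ 1, 0, 1, 0],
              [-1, 1, 0, 0], [0, 0, 0,  1], [1, 0, 0,  0]])

# The decomposition is a valid algorithm iff it reconstructs T exactly.
assert np.array_equal(np.einsum('pi,pj,pk->ijk', U, V, W), T)
```

Generation inverts this check: rather than verifying a given (U, V, W), the task is to construct low-rank triples satisfying the tensor identity, which is the search AlphaTensor and related methods automate.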
The MatterSpace Approach
MatterSpace Algo generates novel matrix algorithms by searching the space of valid algebraic decompositions under constraints on operation count, numerical stability, and hardware utilization. Specify the matrix operation, size range, precision requirements, and target hardware architecture, and Algo generates decompositions that are provably correct and minimize computational cost.
The Matrix Algorithm domain pack encodes algebraic decomposition theory, numerical stability analysis, and hardware performance models. Users define the computational target and Algo generates novel algorithms with correctness proofs, operation count analyses, and predicted speedups on specified hardware platforms.
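The operation-count analyses mentioned above can be sketched with a simple recurrence: a Strassen-style scheme replaces one size-n multiplication with seven of size n/2. A minimal sketch, assuming power-of-two sizes and counting only scalar multiplications (the function names here are illustrative, not part of any product API):

```python
def naive_mults(n):
    # Naive matrix multiplication uses n^3 scalar multiplications.
    return n ** 3

def strassen_mults(n, cutoff=1):
    # Strassen recursion: 7 half-size subproblems per level,
    # switching to the naive count at the cutoff size.
    if n <= cutoff:
        return n ** 3
    return 7 * strassen_mults(n // 2, cutoff)

# Multiplication-count speedup grows with n, tracking n^(3 - log2 7).
for n in (64, 256, 1024):
    print(n, naive_mults(n) / strassen_mults(n))
```

In practice the cutoff is tuned per hardware platform, since the recursion's extra additions and memory traffic dominate at small sizes; this is the kind of trade-off a hardware performance model must capture.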
Specify what the output must satisfy. MatterSpace constructs candidates that meet all constraints simultaneously.
Every output satisfies algebraic correctness, stability criteria, and domain constraints; no post-hoc filtering needed.
Powered by a domain-specific generation engine with physics-aware priors and adaptive dynamics control.
Key Differentiators
MatterSpace Algo generates matrix algorithms that are provably correct by construction, with formal verification of equivalence to the reference operation. The system discovers decompositions in regions of algebraic space that human mathematicians and existing automated search methods have not explored, producing algorithms tailored to specific matrix structures and hardware constraints that generic approaches cannot exploit.
Same sector
Generate novel optimization algorithms and solver architectures tailored to specific problem structures and constraint landscapes.
Generate novel neural network architectures with target accuracy, latency, and parameter budgets through constraint-based construction beyond established architecture families.
Generate novel loss functions and training objectives tailored to specific learning problems and data characteristics, moving beyond generic cross-entropy and MSE formulations.
Get started
Whether you are exploring matrix algorithm search for the first time or scaling an existing research programme, MatterSpace generates novel candidates that satisfy your constraints by construction.
Contact us