The determinant and the permanent of a matrix, though deceptively similar in their definitions, behave very differently with respect to how efficiently they can be computed. The determinant of a matrix over a field can be computed easily via Gaussian elimination, while computing the permanent, as shown by Valiant, is at least as hard as counting the number of satisfying assignments of a Boolean formula. Given this, it is natural to ask what makes the determinant easy and the permanent hard to compute. Is commutativity the key to efficient determinant computation?
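The easy direction mentioned above can be made concrete. The following is a minimal sketch (not from the paper) of computing a determinant over a field via Gaussian elimination, using exact rational arithmetic; row swaps flip the sign and row eliminations leave the determinant unchanged, so the product of the pivots gives the answer in polynomial time.

```python
from fractions import Fraction

def det_gauss(M):
    """Determinant via Gaussian elimination over the rationals.

    This works because the entries commute; each row operation changes
    the determinant in a known, controllable way.
    """
    A = [[Fraction(x) for x in row] for row in M]
    n = len(A)
    det = Fraction(1)
    for i in range(n):
        # Find a nonzero pivot; a row swap flips the determinant's sign.
        pivot = next((r for r in range(i, n) if A[r][i] != 0), None)
        if pivot is None:
            return Fraction(0)  # column of zeros: singular matrix
        if pivot != i:
            A[i], A[pivot] = A[pivot], A[i]
            det = -det
        det *= A[i][i]
        # Eliminate below the pivot; this leaves the determinant unchanged.
        for r in range(i + 1, n):
            factor = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= factor * A[i][c]
    return det

print(det_gauss([[2, 1], [1, 3]]))  # -> 5
```

The same elimination-based approach breaks down once the entries no longer commute, which is exactly the setting studied below.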
We show that computing the determinant of an n × n matrix whose entries are themselves 2 × 2 matrices over a field is as hard as computing the permanent over the field, extending a recent result of Arvind and Srinivasan [STOC 2010]. On the other hand, surprisingly, if one restricts the entries to be d × d upper triangular matrices, then the determinant can be computed in poly(n^d) time. Combining this with the decomposition theorem for finite-dimensional algebras, we get the following dichotomy result: if A is a finite-dimensional algebra over a finite field, then the determinant of matrices over A can be computed efficiently if and only if A/R(A) is commutative, where R(A) denotes the radical of A.
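For concreteness, the object in the hardness result can be sketched as follows. This is an illustrative (exponential-time) evaluation of the noncommutative determinant via its permutation expansion, with the factors of each term multiplied in row order since the 2 × 2 matrix entries need not commute; the helper names are ours, not the paper's.

```python
from itertools import permutations

def mat_mul(X, Y):
    # Product of two 2x2 matrices over the base field (here: integers).
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def sign(perm):
    # Parity of a permutation via its inversion count.
    inv = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm))
              if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def nc_det(A):
    """Determinant of an n x n matrix A whose entries are 2x2 matrices:
    sum over permutations p of sign(p) * A[0][p(0)] * ... * A[n-1][p(n-1)],
    multiplied in row order. Brute force over all n! permutations --
    for illustration only, not an efficient algorithm.
    """
    n = len(A)
    total = [[0, 0], [0, 0]]
    for perm in permutations(range(n)):
        term = [[1, 0], [0, 1]]  # 2x2 identity
        for i in range(n):
            term = mat_mul(term, A[i][perm[i]])
        s = sign(perm)
        total = mat_add(total, [[s * x for x in row] for row in term])
    return total

# Sanity check: when every entry is a scalar multiple of the identity,
# the entries commute and the scalar determinant is recovered.
I = lambda c: [[c, 0], [0, c]]
print(nc_det([[I(2), I(1)], [I(1), I(3)]]))  # -> [[5, 0], [0, 5]]
```

When the entries are genuinely noncommuting 2 × 2 matrices, no elimination-style shortcut is known, and the paper's result shows that evaluating this expression is as hard as the permanent.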