This paper discusses rowwise matrix correlation, based on the weighted sum of correlations between corresponding rows of two proximity matrices, which may both be square (symmetric or asymmetric) or rectangular. Using the correlation coefficients usually associated with Pearson, Spearman, and Kendall, three different rowwise test statistics and their normalized coefficients are discussed, and subsequently compared with nonrowwise alternatives such as Mantel's Z. It is shown that the rowwise matrix correlation coefficient between two matrices X and Y is the partial correlation between the entries of X and Y, controlling for the nominal variable that has the row objects as categories. Given this fact, partial rowwise correlations (as well as multiple regression extensions in the case of Pearson's approach) can easily be developed.
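As an informal illustration of the rowwise idea only, and not of the paper's exact test statistics, the sketch below computes a Pearson correlation between each pair of corresponding rows of X and Y and combines them as a weighted average. The function name, the equal-weight default, and the option to drop the diagonal entries of square proximity matrices are assumptions made for this example; the paper defines its own weighting and normalization.

```python
import numpy as np

def rowwise_pearson(X, Y, weights=None, exclude_diagonal=True):
    """Weighted average of Pearson correlations between corresponding rows.

    Simplified illustration: weights default to equal, whereas the paper's
    rowwise statistics use a specific weighting and normalization.
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    n_rows, n_cols = X.shape
    if weights is None:
        weights = np.ones(n_rows)          # assumed equal weights
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()

    corrs = np.empty(n_rows)
    for i in range(n_rows):
        x, y = X[i], Y[i]
        if exclude_diagonal and n_rows == n_cols:
            mask = np.arange(n_cols) != i  # drop the self-proximity entry
            x, y = x[mask], y[mask]
        corrs[i] = np.corrcoef(x, y)[0, 1]  # Pearson correlation of row i
    return float(np.sum(weights * corrs))
```

For Spearman or Kendall variants, the per-row Pearson correlation would simply be replaced by the corresponding rank-based coefficient before averaging.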