Hi, I tried looking this up and got into a weird recursive search through engineering literature.

If you want to explain it to me with an easy example, I can try to translate the language.

In the meantime I can conjecture what I think it will end up being: if V, W are vector spaces or free modules, then an element of V* \otimes W can be described by a matrix once I choose bases for V and W (this is not true if V and W are only locally free modules or coherent sheaves, etc.). Even in general, the 'contraction' or 'trace' map from V* \otimes V to the trivial (free of rank one) module or vector space or sheaf, together with multilinearity, gives rise to many contraction maps. For instance there is a contraction map V \otimes V* \otimes W -> W, denoted trace \otimes 1, where 1 is the identity endomorphism of W. And if X is yet another object (vector space, module, etc.) then there is also the contraction V \otimes V* \otimes W \otimes X -> W \otimes X, which is sometimes denoted trace \otimes 1 \otimes 1.

Note that an element of V \otimes X together with an element of V* \otimes W can be thought of as a pair consisting of a tensor and a matrix. Putting them together gives an element of the domain V \otimes V* \otimes W \otimes X, and applying the contraction yields an element of W \otimes X. So this does constitute 'something you can do with a matrix and a tensor'.
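In coordinates this contraction is just a sum over the shared index. Here is a minimal sketch in plain Python (the dimensions and entries are made up for illustration): an element of V \otimes X is an array t[i][x], an element of V* \otimes W is a matrix m[i][w], and (trace \otimes 1 \otimes 1) pairs the V index of the tensor with the V* index of the matrix.

```python
# Toy dimensions, chosen arbitrarily for illustration
dimV, dimW, dimX = 2, 3, 2

# t represents an element of V ⊗ X: t[i][x], with i the V index
t = [[1, 2],
     [3, 4]]

# m represents an element of V* ⊗ W as a matrix: m[i][w]
m = [[1, 0, 2],
     [0, 1, 1]]

# The contraction (trace ⊗ 1 ⊗ 1): sum over the shared index i,
# leaving an element of W ⊗ X, indexed as result[w][x]
result = [[sum(t[i][x] * m[i][w] for i in range(dimV))
           for x in range(dimX)]
          for w in range(dimW)]

print(result)  # → [[1, 2], [3, 4], [5, 8]]
```

This is exactly 'multiplying a tensor by a matrix along one mode': the V factor of the tensor is consumed against the V* factor of the matrix, and every other factor rides along untouched.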

It is my guess that you are looking at one or another contraction map, but it is frustrating to go through the SIAM literature and figure out the notation.

Here is roughly where the notation diverges.

In engineering language (also sometimes in physics) people use context to speak of 'a tensor' as if there were some big universal 'set of all tensors' in whatever context they happen to be working in. In physics one may say "a tensor is, for example, a differential form or a vector field", and in engineering "a tensor is just a tuple A\^I\_J with subscripts and superscripts indexing its entries."

By contrast, in math language there isn't a definition of 'a tensor', just as there is no definition of 'an element of the inverse image' without specifying: the inverse image of what function?

In the world of rings and modules, there is an adjunction Hom\_R(A \otimes\_S B, C) \cong Hom\_S(B, Hom\_R(A, C)) when A is an (R,S)-bimodule and C is an R-module, and this characterises the tensor product categorically: A \otimes B is 'the module whose maps to C can be identified with maps from B to Hom(A, C)'. In the case of free modules, e.g. if B is a copy of S\^n, then A \otimes B is just a copy of A\^n.
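The adjunction is module-theoretic currying: a linear map out of a tensor product corresponds, by fixing one argument, to a map from one factor into Hom(other factor, C). Here is a toy sketch in plain Python over the integers, with free modules of small rank and made-up matrix entries:

```python
# Toy setting: R = S = the integers, A = Z^2, B = Z^3, C = Z.
# A linear map f : A ⊗ B -> Z is determined by its values on basis
# tensors e_i ⊗ f_j, i.e. by a 2x3 integer matrix F (entries made up).
F = [[1, 2, 3],
     [4, 5, 6]]

def f(a, b):
    """The bilinear map A x B -> Z encoded by the matrix F."""
    return sum(a[i] * F[i][j] * b[j] for i in range(2) for j in range(3))

def curried(b):
    """Currying: each b in B yields a linear functional on A."""
    return lambda a: f(a, b)

# The two sides of the adjunction agree pointwise:
a, b = [1, -1], [0, 2, 1]
print(f(a, b), curried(b)(a))  # → -9 -9
```

The free case mentioned above is visible here too: a map out of A \otimes B with B of rank 3 is the same data as a triple of maps out of A, namely the three columns of F.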

TL;DR It might be (trace \otimes 1 \otimes 1), but if you give me a real-world example I'll know for sure.