As multiple-input multiple-output (MIMO) radar gains popularity, increasingly efficient and better-performing detection algorithms are being developed to exploit the benefits of having more transmitters and receivers. Many of these algorithms rest on the assumption that the multiple waveforms used for target scanning are mutually orthogonal in fast time. It has been shown that this assumption can limit practical detector performance, because enforcing it reduces the sidelobe-free area of the MIMO radar ambiguity function.
In this work it is shown that transmitting the same waveform with a different carrier frequency and/or delay at each transmitter ensures relative waveform orthogonality while alleviating the negative effects on the ambiguity function. This is demonstrated in a practical scenario where the probing waveforms consist of Gaussian pulse trains (GPTs) separated in frequency. An approximate theoretical model of the ambiguity function is proposed, and it is shown that the cross-ambiguity effects in the MIMO system are negligible compared to the waveform auto-ambiguities.
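The claim that frequency separation suppresses cross-ambiguity can be illustrated numerically. The sketch below (not taken from the paper; all parameter values are illustrative assumptions) compares the zero-Doppler cut of the auto-ambiguity of a single Gaussian pulse against its cross-ambiguity with the same pulse shifted by a carrier offset `df`. When the offset is large relative to the pulse bandwidth (roughly `df * sigma >> 1`), the cross-ambiguity peak is far below the auto-ambiguity peak:

```python
import numpy as np

# Hedged sketch: zero-Doppler (cross-)ambiguity of Gaussian pulses
# separated in carrier frequency. Parameters are illustrative only.
fs = 1e6          # sampling rate [Hz]
T = 1e-3          # observation window [s]
t = np.arange(-T / 2, T / 2, 1 / fs)
sigma = 20e-6     # Gaussian pulse width [s]
df = 200e3        # assumed carrier separation between transmitters [Hz]

def gaussian_pulse(t, f0):
    """Complex Gaussian pulse modulated to carrier offset f0."""
    return np.exp(-t**2 / (2 * sigma**2)) * np.exp(2j * np.pi * f0 * t)

def zero_doppler_cut(u, v):
    """|cross-ambiguity| along the delay axis at zero Doppler,
    i.e. the magnitude of the cross-correlation of u and v."""
    return np.abs(np.correlate(u, v, mode="full")) / fs

s1 = gaussian_pulse(t, 0.0)   # transmitter 1 at baseband
s2 = gaussian_pulse(t, df)    # transmitter 2 offset by df

auto = zero_doppler_cut(s1, s1).max()
cross = zero_doppler_cut(s1, s2).max()
print(cross / auto)  # far below 1 when df * sigma >> 1
```

This captures only the zero-Doppler slice of the ambiguity function; the full model in the paper also accounts for delay separation, Doppler, and pulse-train structure.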