I don't think that's correct. Imagine a "worst case" digital test signal that is all zeroes (or randomize the LSB for dither, it doesn't matter). After about 15 seconds of this, set one sample to full scale / maximum amplitude, then immediately go back to zeroes. That 15 seconds of silence isn't really necessary, but I include it to illustrate your point: consider it "the music has been playing for a long time, everything is in steady state", with plenty of time to accommodate filter delay and fill all the taps.

A linear phase filter will produce an analog wave that looks like sinc(t), which is symmetric in time around the pulse, so it has a bit of ripple both before and after it. To be clear, this ripple is not caused by the linear phase filter suddenly starting up, with the pulse arriving before the filter has seen enough samples to satisfy its latency requirement. That couldn't happen anyway, because the D/A converter delays its analog output until it has read far enough ahead to cover that latency.

A finite-bandwidth analog signal representing an impulse will ALWAYS have some ripple, somewhere. It's mathematically required. One can argue that this ripple is inaudible, especially when oversampling the digital signal to widen the transition band and reduce the required filter slope, and I'm inclined to agree. But the ripple always exists.
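If you want to see it concretely, here's a minimal sketch of the experiment (my own illustration, not from anyone's product: numpy/scipy, an arbitrary 255-tap windowed-sinc low-pass at 0.45×Nyquist standing in for the reconstruction filter). It filters a single full-scale sample surrounded by zeros and checks that the ringing before the pulse mirrors the ringing after it:

```python
# Sketch: one full-scale sample in a stream of zeros, passed through a
# linear-phase FIR low-pass. Filter length, cutoff, and sample rate are
# arbitrary illustrative choices.
import numpy as np
from scipy.signal import firwin

fs = 48_000                 # sample rate (arbitrary)
n_taps = 255                # odd length -> symmetric, exactly linear phase
cutoff = 0.45               # cutoff as a fraction of Nyquist (leaves a transition band)

# Linear-phase low-pass: a windowed-sinc design (Hamming window by default).
h = firwin(n_taps, cutoff)

# "Music has been playing": a long run of zeros, then a single full-scale sample.
x = np.zeros(fs)            # 1 second of digital silence
x[fs // 2] = 1.0            # one full-scale sample, then back to zeroes

# Filter it. The convolution shifts everything by the filter's fixed group
# delay of (n_taps - 1) / 2 samples; compensate so the peak lines up with
# the input pulse (this is the "read ahead" the converter does).
y = np.convolve(x, h)
delay = (n_taps - 1) // 2
y = y[delay : delay + len(x)]

peak = fs // 2
pre  = y[peak - 40 : peak]        # 40 samples before the pulse
post = y[peak + 1 : peak + 41]    # 40 samples after the pulse

# The ripple is symmetric in time around the pulse: the samples before the
# peak mirror the samples after it.
print("max |pre-ringing| :", np.max(np.abs(pre)))
print("max |post-ringing|:", np.max(np.abs(post)))
print("symmetric?        :", np.allclose(pre, post[::-1]))
```

Widening the transition band or shortening the filter (which is effectively what oversampling buys you) makes the ripple smaller and shorter, but the printed maxima never reach zero for any finite-bandwidth reconstruction.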