Using mostly-static frames of grayscale snow to represent signal loss is an interesting decision. Normally a noise pattern like this would be rendered at a high frame rate, but Blue Iris tends to generate just one, or a few, frames for the entire signal-loss period, which looks odd to me. The noise pattern is extremely low resolution, so the CPU cost of generating it should be negligible even with a poorly optimized algorithm. Where efficiency may matter more is in encoding: random noise has high entropy and compresses poorly, so an H.264- or JPEG-encoded noise frame probably eats a larger chunk of data/bandwidth than the old color bars pattern did.
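To get a rough feel for that encoding difference, here's a minimal sketch comparing the JPEG-compressed size of a random noise frame against a flat color bars frame. The resolution, JPEG quality, and bar layout are all assumptions for illustration; this is not Blue Iris's actual implementation.

```python
import io
import numpy as np
from PIL import Image

W, H = 320, 240  # hypothetical low resolution; Blue Iris's actual size is unknown

# Grayscale snow: one random byte per pixel.
snow = np.random.randint(0, 256, (H, W), dtype=np.uint8)

# A crude stand-in for the old color bars: eight flat vertical stripes.
bar_colors = [
    (192, 192, 192), (192, 192, 0), (0, 192, 192), (0, 192, 0),
    (192, 0, 192), (192, 0, 0), (0, 0, 192), (0, 0, 0),
]
bars = np.zeros((H, W, 3), dtype=np.uint8)
for i, color in enumerate(bar_colors):
    bars[:, i * W // 8 : (i + 1) * W // 8] = color

def jpeg_size(arr) -> int:
    """JPEG-encode an image array and return the compressed size in bytes."""
    buf = io.BytesIO()
    Image.fromarray(arr).save(buf, format="JPEG", quality=75)
    return buf.getbuffer().nbytes

print("snow:", jpeg_size(snow), "bytes")  # high-entropy noise compresses poorly
print("bars:", jpeg_size(bars), "bytes")  # large flat regions compress very well
```

Running something like this shows the noise frame coming out many times larger than the bars frame at the same resolution, which is the bandwidth concern in a nutshell.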
The only other thing I notice in this update is that the camera title bars have changed from a gradient to a flat color, and interestingly, that flat color is actually quite far from the color the user assigned to the camera.