A digital measuring instrument employs a sampling rate of 100 samples/second. The sampled input x(n) is averaged using the difference equation: y(n) = [x(n) + x(n-1) + x(n-2) + x(n-4)]/4. For a step input, the maximum time taken for the output to reach the final value after the input transition is

Correct Answer: 40 ms
The output y(n) depends on the input with delays of 0, 1, 2, and 4 samples. For a step input, the output reaches its final value only after the most delayed term, x(n-4), reflects the new input, i.e., after 4 sampling periods (the maximum delay). With a sampling rate of 100 samples/second, Ts = 1/100 = 10 ms, so the maximum settling time is 4 × Ts = 40 ms.
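A minimal sketch (assuming NumPy; the signal length and helper name xd are illustrative, not from the question) that simulates the averager's step response and confirms the output first equals its final value at n = 4, i.e. 40 ms:

```python
# Verify the settling time of y(n) = [x(n) + x(n-1) + x(n-2) + x(n-4)] / 4
# for a unit step input sampled at 100 samples/second.
import numpy as np

fs = 100.0      # sampling rate in samples/second
Ts = 1.0 / fs   # sampling period = 10 ms

n = np.arange(20)
x = np.ones_like(n, dtype=float)  # unit step: x(n) = 1 for n >= 0

def xd(k, d):
    """Return x(k - d), treating samples before the step (k - d < 0) as 0."""
    return x[k - d] if k - d >= 0 else 0.0

y = np.array([(xd(k, 0) + xd(k, 1) + xd(k, 2) + xd(k, 4)) / 4 for k in n])

# First sample index where the output equals its final value of 1.0.
settle = int(np.argmax(np.isclose(y, 1.0)))
print(f"output settles at n = {settle}, i.e. t = {settle * Ts * 1e3:.0f} ms")
# -> output settles at n = 4, i.e. t = 40 ms
```

The intermediate values make the reasoning visible: y(0) = 0.25, y(1) = 0.5, y(2) = y(3) = 0.75 (the x(n-4) tap still holds a pre-step sample at n = 3), and y(4) = 1.0, so the final value is reached exactly 4 sampling periods after the transition.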