#StackBounty: #time-series #autoregressive How to quantify sensitivity in a time series model?

Bounty: 50

$X$ and $Y$ are time series of length $T$; $X$ is the predictor and $Y$ is the response. A linear model is fitted as follows:

$$\hat{Y}_t = \alpha + \sum_{i=1}^{N} \beta_i X_{t-i}$$

where the $\beta_i$'s and $\alpha$ are chosen to minimise the squared errors between $Y$ and $\hat{Y}$.
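A minimal sketch of fitting such a lagged model in Python, assuming NumPy and scikit-learn are available; the toy series, seed, and lag count $N$ are purely illustrative:

```python
# Sketch: fit Y_t = alpha + sum_i beta_i * X_{t-i} by least squares.
# The data below is synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
T, N = 500, 7                      # series length and number of lags
x = np.cumsum(rng.normal(size=T))  # toy autocorrelated predictor
y = 0.3 * np.roll(x, 1) + 0.1 * np.roll(x, 2) + rng.normal(size=T)

# Build the lagged design matrix: column i holds X_{t-i} for i = 1..N.
X_lagged = np.column_stack([np.roll(x, i) for i in range(1, N + 1)])
X_lagged, y_trim = X_lagged[N:], y[N:]  # drop rows with wrapped-around lags

model = LinearRegression().fit(X_lagged, y_trim)
print(model.intercept_, model.coef_)  # alpha and the beta_i's
```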

Now I want to know: "How sensitive is $\hat{Y}$ to $X$?"

In an ordinary linear regression (without the temporally lagged quantities on the right-hand side), the answer would simply be $\beta$, but here I have $N$ different $\beta$'s. Is there a way to condense the $N$ different $\beta$'s into a single scalar quantity? Or is there another method to answer "How sensitive is $\hat{Y}$ to $X$?"
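As a point of reference that follows directly from the model above (not part of the original question): the pointwise sensitivity to each lag is just its coefficient, and a sustained unit increase in $X$ shifts $\hat{Y}$ by the sum of the coefficients:

$$\frac{\partial \hat{Y}_t}{\partial X_{t-i}} = \beta_i, \qquad \hat{Y}_t\Big|_{X \to X+1} - \hat{Y}_t = \sum_{i=1}^{N} \beta_i.$$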

Potentially relevant information (ignore if not needed):

  1. The $X$ and $Y$ series are highly autocorrelated. For example, $X$ is daily temperature and $Y$ is daily ice cream sales.
  2. By "How sensitive is $\hat{Y}$ to $X$?" I mean how much $Y$ is affected by changes in $X$. For example, ice cream sales would likely be very sensitive to daily lagged temperature, whereas laptop sales would probably be insensitive to it (a sketch computing one such scalar follows this list).
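Continuing the hypothetical `model` fitted in the earlier sketch, the sustained-change sensitivity from point 2 could be read off the fitted coefficients:

```python
# Response of Y-hat to a sustained unit increase in X: the sum of the
# lag coefficients (the long-run multiplier of the fitted model).
long_run_sensitivity = model.coef_.sum()
print(f"Sensitivity of Y-hat to a sustained +1 change in X: {long_run_sensitivity:.3f}")
```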

