Understanding exponentially weighted averages, week 2

Hello mentors,
I am very confused about the topic "Understanding exponentially weighted averages". I am not able to understand how the weights decay, and also this formula: (1 - \epsilon)^{1/\epsilon}.

Hello @Shantanu7 ,
Thanks a lot for asking this question. I am a mentor and I will try to explain exponentially weighted averages.

Exponentially Weighted Averages (EWA) is a technique used to calculate the moving average of a time series, where more recent data points are given higher weights, and older data points are given lower weights. The weights decline exponentially as the data points get older, hence the name “exponentially weighted”. This method is commonly used as a smoothing technique in time series analysis and is also applied in various optimization algorithms in deep learning, such as Gradient Descent with Momentum, RMSprop, and Adam.

The EWA is calculated using the following formula:

V_t = \beta \cdot V_{t-1} + (1 - \beta) \cdot \text{NewSample}_t

Here, V_t represents the weighted average at time t, \beta is a parameter that determines the weight given to previous values, and \text{NewSample}_t is the new data point at time t. The parameter \beta is usually between 0 and 1, and it controls how much the current observation contributes to the EWA. The lower the value of \beta, the more closely the EWA tracks the original time series; the higher the value of \beta, the smoother, but more lagged, the average becomes.
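
To make the update rule concrete, here is a minimal sketch in Python (the function name, the made-up data, and the \beta values are my own illustrative choices, not from the course):

```python
import numpy as np

def ewa(samples, beta):
    """Apply V_t = beta * V_{t-1} + (1 - beta) * NewSample_t, starting from V_0 = 0."""
    v = 0.0
    averages = []
    for sample in samples:
        v = beta * v + (1 - beta) * sample
        averages.append(v)
    return averages

# Made-up noisy data around a smooth trend, purely for illustration.
rng = np.random.default_rng(0)
samples = np.sin(np.linspace(0, 3, 100)) + 0.3 * rng.standard_normal(100)

smooth = ewa(samples, beta=0.9)  # smoother, but lags behind the data
noisy  = ewa(samples, beta=0.5)  # tracks the data closely, smooths less
```

One detail worth noting: starting from V_0 = 0 biases the first few averages toward zero, which is why the lectures also introduce bias correction (dividing V_t by 1 - \beta^t).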

The EWA can be thought of as an exponential decay: the newest sample is multiplied by (1 - \beta), and the sample from k steps ago ends up multiplied by (1 - \beta) \beta^k. This ensures that more recent data points have a higher impact on the moving average, while older data points have a lower impact.
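
You can see this by unrolling the recursion (assuming V_0 = 0 and no bias correction): V_t = (1 - \beta) \sum_{k=0}^{t-1} \beta^k \, \text{NewSample}_{t-k}. A quick sketch of those weights, with \beta = 0.9 as an illustrative choice:

```python
beta = 0.9  # illustrative choice

# Weight on the sample from k steps ago: (1 - beta) * beta**k
weights = [(1 - beta) * beta**k for k in range(20)]

print(weights[:5])   # [0.1, 0.09, 0.081, 0.0729, 0.06561]
print(sum(weights))  # 0.8784..., approaching 1 as more terms are included (geometric series)
```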

In summary, Exponentially Weighted Averages is a method for calculating the moving average of a time series, where recent data points are given higher weights, and older data points are given lower weights that decay exponentially. This technique is useful for smoothing time series data and is applied in various optimization algorithms in deep learning.

I hope this explains exponentially weighted averages clearly. Please feel free to ask a follow-up question if anything is still unclear.
Regards,
Can Koz

Hello sir,
Thank you for your help. I know all of these things; Prof. Andrew Ng explains them in the lectures.
I am still a little bit confused about this formula: (1 - \epsilon)^{1/\epsilon}.
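
Hello @Shantanu7,
That expression describes how quickly the weights decay. Write \epsilon = 1 - \beta, so \beta = 1 - \epsilon. The sample from k steps ago carries weight proportional to \beta^k, so after k = 1/\epsilon steps the weight has shrunk by a factor of \beta^{1/\epsilon} = (1 - \epsilon)^{1/\epsilon}. As \epsilon gets small, this quantity approaches 1/e \approx 0.368, which is why the lectures describe the EWA as averaging over roughly the last 1/(1 - \beta) samples: about 10 samples for \beta = 0.9, and about 50 for \beta = 0.98. A quick numeric check (the \epsilon values are illustrative choices):

```python
import math

# (1 - eps)^(1/eps) approaches 1/e as eps -> 0
for eps in [0.5, 0.1, 0.02]:  # corresponds to beta = 0.5, 0.9, 0.98
    print(f"eps={eps}: {(1 - eps) ** (1 / eps):.4f}")
print(f"1/e: {1 / math.e:.4f}")

# eps=0.5: 0.2500
# eps=0.1: 0.3487
# eps=0.02: 0.3642
# 1/e: 0.3679
```

The smaller \epsilon is (i.e., the closer \beta is to 1), the closer the value gets to 1/e, which is the cutoff Prof. Ng uses to define the effective averaging window.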

This is awesome. Thank you.