Introduction - Dave Lommen

Hi!
My name is Dave, and I just started a course on Coursera. I was recommended to join this forum, so here I am. I need to learn machine learning to understand and adapt nowcasting models.
Cheers, Dave

@davelomm Hey Dave,

Welcome.

I mean if you are totally new to all this, I wouldn’t start there (rather with, say, the ML Cert)-- But by the time you get to DLS Course 5 I think you’ll find the discussion on constructing sentiment analysis interesting (if that turns out to be the path you wish to take).

Best,
-A

Hey Nevermind,

Thanks for your prompt reply! Frankly, I have no idea what you are talking about when you say DLS Course 5 or sentiment analysis. I just started with Supervised Machine Learning: Regression and Classification on Coursera. So far, it seems incredibly basic.

Cheers,
Dave

DLS = Deep Learning Specialization, and course 5 of that series focuses on what are called ‘Sequence Models’, or basically neural networks with some time-series component attached.

By sentiment analysis I mean the automatic parsing of a large body of data, most often text, to determine a general overall ‘mood’ or ‘intent’ of sorts.

Not as sophisticated as what LLMs do in terms of summarization. And though I’d never heard of ‘nowcasting’ (and had to look it up), I did study Economics, so it could potentially be useful for detecting the ‘moods’ of markets in real time (i.e. analyzing live financial news feeds, Twitter feeds, forums, etc).

Best of luck with your journey !

It looks like this Deep Learning Specialization is what I need, although I should probably start at course 1.
Where may I find it?
Is it on Coursera, or somewhere here on DeepLearning.AI?

The nowcasting I am supposed to get involved in is short-term weather prediction. Rough weather predictions can be done pretty well several days into the future. However, predicting where, when, and exactly how much it is going to rain is still very difficult, even on a scale of hours.

@davelomm Ohhh… Chaos theory. Fun. Unfortunately I don’t believe that is covered here :wink:

Deep Learning | Coursera

Sequence Models | Coursera

Also, I’m not sure (yet) if this is above my paygrade but you might find this course interesting too:

Probabilistic Graphical Models | Coursera


I strongly recommend you complete the Machine Learning Specialization before you move on to other more advanced courses.


Hi Tom,
Thanks for the tip! At the rate I am going, I should finish the Machine Learning Specialization within a week, so it will not set me back too much.
Cheers,
Dave

@davelomm Actually, this is something I’ve kind of been wondering about for a long time, but not had a chance to play around with yet.

I mean, they say NNs have been proven to be able to model any function-- Well… What if you feed it a dynamic (aka Chaotic) one ? I mean of course the system is still deterministic (like the weather)… But will it be able to form the Lorenz attractors and what not ?

It has been well over 15 years since I read it, but I know at the start of James Gleick’s Chaos: Making a New Science he points out such an equation for population.

I’ve always wondered if a NN could replicate this behavior, but, honestly, I don’t know.
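For what it’s worth, the population equation Gleick describes is the logistic map, and you can see the chaotic behavior with a few lines of plain Python (a toy sketch, not anything from the courses):

```python
# The logistic map: x_{n+1} = r * x_n * (1 - x_n).
# For r = 4 it is fully chaotic: two trajectories that start almost
# identically become completely uncorrelated after a few dozen steps.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)  # perturb the 9th decimal place

print(abs(a[1] - b[1]))                        # still tiny after one step
print(max(abs(x - y) for x, y in zip(a, b)))   # large: the orbits have diverged
```

That sensitivity to initial conditions is exactly why fitting a NN to such a system is tricky: any one-step predictor’s tiny errors compound in the same way.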

The challenge with creating a model for a chaotic system is, how do you get a suitable training set, if the behavior is chaotic?

Obviously, I am not an expert. However, if I understand correctly, chaotic systems diverge in the long term. For nowcasting (the weather), we are looking at changes and predictions in the short term, say several hours. Although this is still very challenging, there are some algorithms that do this reasonably well - at least under certain circumstances.

Still, the challenge remains. What we are really interested in is the extreme cases: what kind of a downpour will flood (part of) a city? Since these are extreme cases, they are relatively rare. If we just feed a hundred days’ worth of satellite data into a system, and there is only one such extreme case among them, the system might “learn” that they do not form. So, indeed, how do we teach the system that extreme cases do form, and how does it distinguish them from the cases where they do not? People are currently unable to properly predict these extreme cases; we hope that computers (in particular, neural networks) may be able to do so.
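The “one extreme case in a hundred days” problem can be made concrete with a toy example (hypothetical numbers, just to illustrate): a model that simply learns “extremes never happen” scores 99% accuracy while missing every single event, which is why plain accuracy is a poor training target for rare-event prediction.

```python
# Hypothetical: 100 days of labels, where 1 = extreme-rainfall day.
labels = [0] * 99 + [1]

# A "model" that has learned that extremes never form.
predictions = [0] * 100

accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
recall_on_extremes = sum(
    p == 1 for p, y in zip(predictions, labels) if y == 1
) / sum(labels)

print(accuracy)            # → 0.99
print(recall_on_extremes)  # → 0.0  (every flood missed)
```

This is why rare-event work usually optimizes something other than raw accuracy, e.g. recall on the extreme class, or a loss that weights extreme cases more heavily.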

@davelomm again, it has been ages since I read this, and of course this is a generalist, not a specialist text-- And in a certain way you may be correct in suggesting this applies mostly to long-term outcomes--

At the same time though, as you describe it (and this is the beauty of dynamical systems), even in the short term two things hold for these ‘extreme events’: 1) we are presuming these systems are, after all, deterministic (i.e. there are fixed rules that govern them, no ‘magic’ happens); 2) these extremes are not so much ‘outliers’ as fundamental parts of the system itself.

In the context of Lorenz attractors, they are yet another ‘basin’, one perhaps the system doesn’t often fall into, but will do so under just the right conditions (the scales tip in just the right way).

This sort of activity gets washed out when we see various outcomes only through the lens of ‘averages’ or standard approaches to probability.

Which makes me wonder how this would apply to neural networks, where, via gradient descent or otherwise, we are seeking to minimize our cost in search of the ‘global minimum’ of the function.

We recognize local minima may exist, but we tend to ignore them, or try to push past them. But in the context of a dynamic system, perhaps some of these local minima are important after all--
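To make the basin idea concrete, here is a minimal sketch (my own toy function, not anything from a course): plain gradient descent on f(x) = (x² − 1)², which has two equally good minima at x = ±1. Which ‘basin’ the descent settles into depends entirely on where it starts, much like a trajectory falling into one wing of an attractor.

```python
# f(x) = (x^2 - 1)^2 has two minima, at x = -1 and x = +1.
# Plain gradient descent converges to whichever minimum's basin it starts in.
def descend(x, lr=0.01, steps=5000):
    for _ in range(steps):
        grad = 4.0 * x * (x * x - 1.0)  # f'(x)
        x -= lr * grad
    return x

print(round(descend(0.5), 6))   # → 1.0
print(round(descend(-0.5), 6))  # → -1.0
```

Here both minima are equally deep, so neither is “the” global minimum to push past; in an NN loss landscape the analogous question is whether some non-global minima encode behavior we actually care about.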

The one other thing he mentioned, with regards to the weather, is that apparently the issue is not simply a matter of ‘having enough sensors’. You could have them 100 miles apart, or every 2 inches. Increasing the resolution does not, by itself, pin down the function governing the system’s behavior.

And, finally, though I greatly welcome your interest and hope you do pursue it (and if you figure something novel out please come and tell us ! :smiley:)-- They have literal supercomputers running on forecasts, something you don’t find for many tasks in this world… In part because so much economic value is tied up in the outcome… So someone must have already tried running neural networks on this type of application, and apparently they are not better yet.

My point: Maybe they are not constructing/running the right type of neural network yet.

Who knows ? Maybe you will invent something !

Happens every day.

Hi, Anthony!

It seems like you are hitting the nail on the head. Recent extreme weather events were not predicted, to some extent because we have taught our models that they do not occur. These are like the local minima you mention. We have to teach the system that these extremes do occur; we hope that the system figures out under which specific circumstances.

As for the number of sensors being irrelevant, that is not quite true. As you say, the weather is deterministic and, indeed, predictions have gotten better with increased input. There are at least two problems, though: our information will always be incomplete, regardless of the sensor density (e.g., unpredictable pandemics affect the climate), and our computers are not powerful enough to incorporate all the information we already have.

And your point about the meteorologists using neural networks on supercomputers is basically why I am doing these courses to begin with. Predictions are getting better, and a state-of-the-art model I was tasked to study is NowcastNet. However, after studying that, and references to U-Net and the like, and looking up all kinds of fancy terms, I realised that I first really have to go through the basics to be able to contribute in any useful way.

Cheers,
Dave


@davelomm stick with it ! UNets are covered in the Deep Learning Specialization course 4 (Convolutional Nets).


Deep Learning Specialization course 4… It may take me a while to get there!