# Logistic_Regression_with_a_Neural_Network_mindset

Hi,
the course is talking about classification
is there any course for regression with neural networks?

Hi and welcome!

The courses in the Deep Learning Specialization cover different topics as you proceed. You can check out the entire syllabus here. If you want to know anything else, the DLAI team and the community would be happy to assist you further.

Thanks.


Hi, @yahiaoui_asma. I encourage you to hang on, and here is why. A regression problem with a continuous-valued output can be converted into a "non-parametric" or "semi-parametric" regression (the difference is often in the eye of the beholder) using a classification model. To fully grasp this, you will need to stay on through at least Course 2, although this interpretation will not be made explicitly apparent.

The key will be in converting your continuous output variable `y` (e.g. house prices) into a discrete one by "binning" the continuous data. Think about a histogram. Each bin contains a number of observations of the output variable, and these can be expressed as a proportion of the data (i.e. the number of observations in that bin divided by the total number of observations in the sample). Each bin thus comprises a "class", and your model can then predict the probability that a particular constellation of input features in the design matrix `X` falls into one class or another.
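As a quick sketch of the binning step (the prices and bin edges below are made up purely for illustration), NumPy's `digitize` and `bincount` can turn a continuous `y` into class labels and empirical class proportions:

```python
import numpy as np

# Hypothetical continuous targets: house prices in $1000s
prices = np.array([120.0, 250.0, 310.0, 95.0, 480.0, 210.0, 330.0, 150.0])

# Bin edges chosen for illustration; each interval becomes one "class"
edges = np.array([0.0, 150.0, 300.0, 450.0, 600.0])

# np.digitize returns 1-based bin indices, so subtract 1
# to get 0-based class labels suitable for a classifier
labels = np.digitize(prices, edges) - 1

# Proportion of observations per class: a discrete distribution
counts = np.bincount(labels, minlength=len(edges) - 1)
proportions = counts / counts.sum()
```

These `labels` then replace the continuous `y` as the classifier's prediction target, exactly as described above.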

The output is then an entire discrete probability distribution, which is far more informative than a "point" prediction. From it you can compute: 1) the mean, median, and mode for the point prediction; 2) the variance and standard deviation for the confidence you place in the point prediction; and 3) higher-order statistics ("moments") such as skewness (for the asymmetry of the distribution) and kurtosis (for the fatness of the tails). And if that's not enough, you can estimate a continuous distribution from your discrete output through a technique called "kernel density estimation" (KDE).
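To make the moment calculations concrete, here is a minimal sketch. The predicted class probabilities and bin midpoints are invented for illustration; the midpoints stand in for the continuous values each bin represents:

```python
import numpy as np

# Hypothetical predicted probabilities over four price bins
probs = np.array([0.25, 0.375, 0.25, 0.125])
# Bin midpoints (in $1000s) used as representative values
mids = np.array([75.0, 225.0, 375.0, 525.0])

# Mean of the discrete distribution (the point prediction)
mean = np.sum(probs * mids)

# Variance and standard deviation (confidence in the point prediction)
var = np.sum(probs * (mids - mean) ** 2)
std = np.sqrt(var)

# Standardized higher-order moments:
# skewness (asymmetry) and kurtosis (fatness of the tails)
skew = np.sum(probs * ((mids - mean) / std) ** 3)
kurt = np.sum(probs * ((mids - mean) / std) ** 4)
```

The same weighted-sum pattern extends to any moment; for KDE on top of this, `scipy.stats.gaussian_kde` is one readily available tool.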

Obviously, this is beyond the scope of discussion in this forum, but with a little research you can fill in the gaps after you take Course 2, which covers multinomial classification (more than two classes). The key here is that the intervals, or bins, become your prediction target.

And if you are interested in time-series analysis and prediction, the "convolutional neural networks" used for image classification in Course 4 can be converted into (lower-dimensional) temporal convolutions. Course 5 is also highly relevant in this regard, as you learn techniques to map sequences to sequences.

I am doubtful that there is an online course that covers this. In fact, I do not even know where to look, because I did all of this on my own without the help of any existing literature that may, or may not, exist. My guess is that it's out there; it's a much too obvious application for me to have been the inventor.

I think that the key takeaway here is that the Specialization is foundational for whichever way you wish to go in deep learning. No disclaimer necessary: I am just a volunteer mentor who doesn't even own shares in Coursera!
