Week 1: non-technical definition of Deep Learning- Optional Recording 1

Hi,

Would be glad to receive your response on,
“How are the neuron definitions self-created (as implied in the video), given a set of inputs A and a set of outputs B? Does the system run a sort of regression analysis to find the best-fitting functions, where at each level a neuron can only take certain logical values? (Then the system must try all these possible values one by one to see which best fits the overall equation for the given inputs and outputs, and finalize the ones that offer the best fit.)”
Best
Pankaj

Not quite that fancy.

Each type of problem (i.e. predicting a real value, or predicting a classification) has a different standard model. One is “linear regression”, the other is “logistic regression”.

There is a weight value associated with each “activation unit” (i.e. ‘neuron’).

When designing the model, you can vary the number of units, and how they are interconnected.

A relatively simple mathematical process is used to adjust the weight values so that the model’s predictions match the known values (using a set of labeled data for training) as closely as possible.
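As a toy illustration of that adjustment process (all numbers here are made up), here is plain gradient descent fitting a single weight in a one-input linear model:

```python
# Illustrative sketch: fit y = w * x to labeled data by repeatedly
# nudging the weight in the direction that reduces the squared error.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]   # the true relationship is y = 2x

w = 0.0                # start with an arbitrary weight
lr = 0.05              # learning rate: how big each nudge is

for _ in range(200):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad     # adjust the weight to reduce the error

print(round(w, 3))     # converges close to 2.0
```

Real networks do the same thing with millions of weights at once, but the idea is identical: measure the error, then nudge each weight to make it smaller.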

Hey TMosh thanks for the prompt reply.

Hmmm, may not be as fancy as what I imagined it to be, nonetheless your response doesn’t read any less fascinating.

Where do I go and learn more about this?

The way you have interpreted it for me, I feel that it would render itself well in a graphical representation. Is there a graphical rendition as well for this explanation available anywhere?

Have a good day!!

Best
Pankaj

I recommend you enroll in the Machine Learning Specialization (on Coursera).

Hi Tom,

I checked machine learning on Coursera and it threw tons of options at me. Did you mean any particular course when you suggested above?

Thanks

It’s called the “Machine Learning Specialization”, from Stanford University and DeepLearning.AI.

https://www.coursera.org/specializations/machine-learning-introduction

Note that this course is now available on the DL.AI Learning Platform as well.

hi @Freeworld

Welcome to the DeepLearning.AI community. Hope you are having fun learning about deep learning and AI.

Deep learning is a subset of artificial intelligence and machine learning that, loosely inspired by the human brain, uses layers of algorithms to recognize patterns in data. It uses many layers of artificial neurons to process information hierarchically, with the architecture designed around the dataset at hand.

I will try to explain, in simple, non-technical terms, a few of the terms we use in deep learning and AI.

1. The Inputs (raw data)

Imagine teaching a child to recognize a dog. You show them pictures. In deep learning, the input is the raw data, such as the pixels of an image, words in a sentence, or rows in a spreadsheet.

How it works: Each pixel or data point is converted into a number and fed into the input layer.

Analogy: The input is like looking at the individual puzzle pieces before trying to figure out the picture.

2. Neurons (the processors)

A neuron is the fundamental unit of a neural network.

Weights & Biases: Each connection to a neuron has a “weight” (strength) that determines how much importance to give that input.

Process: A neuron calculates a weighted sum of its inputs, adds a constant (called a “bias”), and passes it to the activation function.
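The process above can be sketched in a few lines of Python (the inputs, weights, and bias here are arbitrary illustrative values):

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs, plus a bias,
    passed through an activation function (sigmoid, as an example)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-z))   # sigmoid squashes z into (0, 1)

# Example: three inputs, with made-up weights and bias
print(neuron([0.5, -1.0, 2.0], [0.8, 0.2, 0.1], bias=-0.3))   # ≈ 0.525
```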

3. Activation Units

An activation unit (or function) is a mathematical gate that decides whether a neuron should pass information to the next layer or stay silent.

Why it’s needed: Without activation functions, the network is just a simple calculator. These functions allow the network to understand complex, non-linear patterns in a data distribution. For example, when we notice an image of a cat, our brain picks out features that resemble a cat, and signals the relevant parts of the brain to assemble those puzzle pieces, one by one, into the conclusion “cat”.

Common Types:

i. ReLU (Rectified Linear Unit): If the input is negative, it outputs 0. If positive, it passes the value through. It’s the most popular, fast, and simple function.

ii. Sigmoid: Used for binary classification (e.g., Yes/No), it squeezes values between 0 and 1.

Analogy: Think of this as a “threshold” for a manager (neuron) deciding whether to ignore a report or act on it.
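A minimal sketch of these two functions:

```python
import math

def relu(z):
    # Negative inputs are silenced (output 0); positive values pass through.
    return max(0.0, z)

def sigmoid(z):
    # Squashes any real value into the range (0, 1).
    return 1 / (1 + math.exp(-z))

print(relu(-2.0), relu(3.0))    # 0.0 3.0
print(round(sigmoid(0.0), 2))   # 0.5
```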

4. The Output (the outcome or result)

The output layer is the final step: it provides the prediction or classification for the task at hand.

For example, in a detection task, where the goal is to predict what an image contains, the question is whether the neural network detects all the items in the image correctly, as labelled.

For classification, if the task is to differentiate between two types of animals, say cats and dogs in an image or video, the question is whether the model correctly classifies cats as cats and dogs as dogs. (Here one would use a sigmoid output, labelling cat as 0 and dog as 1, so the network’s output can be thresholded into one of the two classes.)

How They All Work Together

  1. Input: An image of a cat is fed into the input layer.

  2. Hidden Layers: Several hidden layers of neurons process the data. Early layers might recognize simple lines, while later layers detect shapes like ears or whiskers.

  3. Activation: Activation units inside the neurons decide which features are important, passing the information forward.

  4. Output: The final layer calculates the probability and outputs “Cat”.

  5. Learning: If the answer is wrong, the network uses a technique called “backpropagation” to adjust its weights, allowing it to improve accuracy over time.
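The steps above can be sketched as a tiny, hand-wired network (the weights here are made up for illustration; a real network would learn them via backpropagation):

```python
import math

def relu(z):
    return max(0.0, z)

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def dense(inputs, weights, biases, act):
    # One fully connected layer: every neuron sees all the inputs.
    return [act(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

x = [0.9, 0.1]                                               # input layer: two features
h = dense(x, [[1.0, -1.0], [0.5, 0.5]], [0.0, -0.2], relu)   # hidden layer (ReLU)
y = dense(h, [[1.2, -0.7]], [0.1], sigmoid)                  # output layer (sigmoid)
print(y[0])   # a probability-like score between 0 and 1
```

Training would compare `y[0]` against the true label and adjust every weight slightly, over and over, until the outputs match the labels.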

And to learn about deep learning, the right course would be the Deep Learning Specialization, which explains both the machine learning fundamentals and the dynamics behind how neural networks work.

The Machine Learning Specialization covers similar ground, with more focus on the weight-matrix calculations: how the weights are computed and adjusted to minimize the cost and produce the right output.

Don’t worry, go one step at a time. You will learn as you progress from one course to the next.

Regards

Dr. Deepti