Hi,
I audited this entire course and understood the theory, but now I need to learn how to code what I learned. Can anyone suggest any YouTube videos or similar resources?
Thanks in advance!!
Not that I know of.
Hi @Shyam99. Ask an LLM to provide code with a step-by-step explanation of the things you're learning, then run it in a Google Colab or Kaggle notebook.
Let me give you an example:
Use Gemini, which is a free LLM, and tell it what you are learning and that you need code.
Prompt:
"I am learning gradient descent, can you provide the code with a step-by-step explanation of how to implement gradient descent from scratch?"
import numpy as np
import matplotlib.pyplot as plt

# Generate sample data: y = 4 + 3x + Gaussian noise
np.random.seed(0)
X = 2 * np.random.rand(100, 1)
y = 4 + 3 * X + np.random.randn(100, 1)

def compute_cost(X, y, theta):
    m = len(y)
    predictions = X.dot(theta)
    cost = (1 / (2 * m)) * np.sum(np.square(predictions - y))
    return cost

def gradient_descent(X, y, theta, learning_rate, n_iterations):
    m = len(y)
    cost_history = np.zeros(n_iterations)
    for i in range(n_iterations):
        predictions = X.dot(theta)
        error = predictions - y
        theta = theta - (learning_rate / m) * X.T.dot(error)
        cost_history[i] = compute_cost(X, y, theta)
    return theta, cost_history

# Add a bias column of ones so theta[0] acts as the intercept
X_b = np.c_[np.ones((len(X), 1)), X]
theta = np.random.randn(2, 1)
learning_rate = 0.1
n_iterations = 1000

theta, cost_history = gradient_descent(X_b, y, theta, learning_rate, n_iterations)
print("Theta (intercept, slope):", theta)
print("Final Cost:", cost_history[-1])

# Plot the data with the fitted regression line
plt.scatter(X, y)
plt.plot(X, X_b.dot(theta), color='red')
plt.xlabel("X")
plt.ylabel("y")
plt.title("Linear Regression with Gradient Descent")
plt.show()

# Plot the cost over iterations to check convergence
plt.plot(cost_history)
plt.xlabel("Iterations")
plt.ylabel("Cost (MSE)")
plt.title("Cost History")
plt.show()
I didn't check the accuracy of the code, but it looks good.
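One quick way to sanity-check code like this is to compare the gradient-descent result against the closed-form least-squares solution (the normal equation), which for this small problem NumPy can compute directly. A minimal sketch, reusing the same synthetic data as above (variable names here are my own, not from the course):

```python
import numpy as np

# Regenerate the same synthetic data used above
np.random.seed(0)
X = 2 * np.random.rand(100, 1)
y = 4 + 3 * X + np.random.randn(100, 1)
X_b = np.c_[np.ones((len(X), 1)), X]

# Closed-form solution via the normal equation: theta = (X^T X)^-1 X^T y
theta_exact = np.linalg.inv(X_b.T.dot(X_b)).dot(X_b.T).dot(y)

# Run the same gradient-descent update for 1000 iterations
theta = np.random.randn(2, 1)
for _ in range(1000):
    error = X_b.dot(theta) - y
    theta = theta - (0.1 / len(y)) * X_b.T.dot(error)

print("Closed-form theta:   ", theta_exact.ravel())
print("Gradient descent theta:", theta.ravel())
```

If gradient descent has converged, the two vectors should agree to several decimal places, and both should be near the true parameters (4, 3) used to generate the data.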
I hope this helps!
Worth a shot!!
Thanks a LOT