I noticed some mistranscriptions in the transcript (subtitles, captions) for the course videos. Below are the errata for the 3rd week. I’ll post the errata for the other weeks in those groups.
The format for the errata is as follows: the approximate time is in brackets, followed by the mistranscribed word or phrase, a hyphen, and then the correct word or phrase. Like this:
[time] mistranscribed word or phrase - correct word or phrase
Video 2 - evaluating a model
[0:12] his performance - its performance
[1:13] weakly - wiggly
[1:15] parody - probably
[2:33] it has set - a test set
[2:46] his performance - its performance
[6:35] My love life these - might look like these
[9:24] By new classification toss - binary classification task
[9:46] Learning outcomes doing - learning algorithm is doing
Video 3 - model selection
[7:03] Trust check - cross check
Video 4 - diagnosing bias and variance
[3:48] chase in the middle - case in the middle
Video 7 - learning curves
[1:21] that has - that as
[2:16] Entries - increased
[2:33] Larger shading - larger training
[3:38] Average with high bias - algorithm with high bias
[4:46] It’s because be honest - it’s because beyond a certain point
[4:51] street now - straight line
[6:46] added - at it
[7:16] added - at it
[7:26] forefather - fourth-order
[10:37] Them although - a model
Video 8 - deciding what to do next revisited
[0:46] Algorithm mix on the set three large errors - algorithm makes unacceptably large errors
[8:04] This isn’t lost stuff - this is a lot of stuff
Video 9 - bias/variance and neural networks
[1:18] Buyers and variants - bias and variance
[1:45] Your networks - neural networks
[1:45] away all of this - a way out of this
[2:31] supplies - applies
[3:37] fall - well
[3:41] Doesn’t - does it
[3:53] Doesn’t want to - does well on
[4:13] Do you just want - does it do well
[4:18] does when the cross foundation - does well on the cross validation
[6:16] buys - bias
[6:32] Neural your network - neural network
[7:24] Launch a neural network - larger neural network
[7:52] So the last year - so the loss here
[8:07] Longer over two m - lambda over two m
[8:12] A song over always W - a sum over all weights W
[9:38] Albums performance - algorithm’s performance
[9:47] New network - neural network
[10:00] Various problems - variance problems
[10:21] God - guide
Video 11 - error analysis
[1:19] Counter by hand - count up by hand
[1:19] in this classified - that are misclassified
[4:01] is classified - misclassifies
[4:29] is classifies - misclassifies
[5:35] Standards you’re - spammers are
[5:39] Former spam - pharma spam
[6:45] Lisa - we just
[6:55] Former spam - pharma spam
[7:25] Average - algorithm
Video 12 - adding data
[2:48] Album - algorithm
[3:19] That’s why they use - that’s widely used
[3:33] Train example - training example
[3:43] [INAUDIBLE] - OCR
[5:20] War pings - warpings
[5:52] Data documentation - data augmentation
[8:08] Here - to hear
[10:55] funds - fonts
[11:28] Albums - algorithm
[11:39] Toss - tasks
[11:42] talks - tasks
Video 13 - transfer learning
[1:23] Chip - keep
[7:28] They store generic - but still generic
Video 14 - full cycle machine learning project
[1:24] Caravel - carry out
[5:47] Of the not too higher computational - hopefully not too high of computational
[8:05] Parsers - class
Video 15 - fairness, bias, and ethics
[0:04] albums - algorithms
[0:48] hiring two - hiring tool
[1:09] Yeah, that’s where the community - get better as a community
[3:34] Standards - spammers
[6:06] Diversity and carried out - diverse team carrying out
[6:29] Approve those two - approve loans to
[8:33] Likely - lightly
[8:57] Buyers - biased
[9:35] Steve says - data sets
Video 16 - error metrics for skewed datasets
[3:20] of useful - it’s useful