![Carnegie Mellon University Deep Learning](/img/default-banner.jpg)
- 731 videos
- 1,497,328 views
Carnegie Mellon University Deep Learning
United States
Joined Jan 5, 2018
“Deep Learning” systems, typified by deep neural networks, are increasingly taking over all AI tasks, ranging from language understanding, and speech and image recognition, to machine translation, planning, and even game playing and autonomous driving. As a result, expertise in deep learning is fast changing from an esoteric desirable to a mandatory prerequisite in many advanced academic settings, and a large advantage in the industrial job market.
In this course we will learn about the basics of deep neural networks, and their applications to various AI tasks. By the end of the course, it is expected that students will have significant familiarity with the subject, and be able to apply Deep Learning to a variety of tasks. They will also be positioned to understand much of the current literature on the topic and extend their knowledge through further study.
Instructor: Bhiksha Raj
Videos
S24 Recitation 15
459 views · 3 months ago
In this recitation, we will be covering Graph Neural Networks
IDL Spring 2024: Lecture 27
474 views · 3 months ago
This marks the twenty-seventh and final lecture of CMU's 11785 Introduction to Deep Learning course for the Spring 2024 semester, focusing on Hopfield Networks. We hope those of you who have followed along have had an enriching learning experience. Wishing you the best of luck and continued success in all of your future deep-learning adventures!
IDL Spring 2024: Lecture 26
543 views · 3 months ago
This is the twenty-sixth lecture of the 11785 Introduction to Deep Learning course at CMU on Graph Neural Networks.
IDL Spring 2024: Lecture 25
518 views · 3 months ago
This is the twenty-fifth lecture of the 11785 Introduction to Deep Learning course at CMU on GANs.
IDL Spring 2024: Lecture 24
586 views · 3 months ago
This is the twenty-fourth lecture of the 11785 Introduction to Deep Learning course at CMU on Diffusion Models.
S24 Recitation 13
208 views · 3 months ago
In this recitation, we will be covering Diffusion Models.
IDL Spring 2024: Lecture 23
624 views · 3 months ago
This is the twenty-third lecture of the 11785 Introduction to Deep Learning course at CMU, on Variational Autoencoders.
S24 Recitation 12
197 views · 3 months ago
In this recitation, we will be covering Variational Autoencoders (VAEs).
IDL Spring 2024: Lecture 22
683 views · 3 months ago
This is the twenty-second lecture of the 11785 Introduction to Deep Learning course at CMU, on Variational Autoencoders I.
IDL Spring 2024: Lecture 21
581 views · 3 months ago
This is the twenty-first lecture of the 11785 Introduction to Deep Learning course at CMU on Representation and Autoencoders.
S24 Bootcamp 4 Part 1
322 views · 4 months ago
Presentation for the bootcamp of Homework 4 Part 1.
S24 Bootcamp 4 Part 2
327 views · 4 months ago
Presentation for the bootcamp of Homework 4 Part 2.
Great lecture :)
59:46 "Sometimes these formulae may not make sense, but then if you look at them just right, they begin telling their own story, right? Every single mathematical term in life tells you a story if you know how to read it" 🤓✨
wow, everything falling into place!!!!!!!!
Wow, breathtaking quality. This series might be the most comprehensive explanation available for deep neural nets; somehow the professor is able to wear the students' hat and ask the most critical questions every time! Big thanks to everyone involved in making these available.
This was absolutely brilliant. A masterclass in lecture content design. Very well pieced together -> great flow -> Wow moment towards the end -> evokes a lot of curiosity
A great and informative lecture!! Very much appreciated!
best course about deep learning. now 2024 and happy I found it back. well done!
nice lecture!
Lecture begins at 6:02
Best explanation, can't thank you enough for uploading these lectures.
Great lecture, thank you!
I struggled with grasping how the dimensions of the filters and data change with the convolutions and pooling, and this video made it clear. Thank you!
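For anyone else tracking the same bookkeeping, here is a small sketch (illustrative numbers, not from the course materials) of the standard output-size formula for convolution and pooling layers:

```python
# Standard spatial output-size rule for conv and pooling layers:
# out = (in - kernel + 2*pad) // stride + 1

def out_size(in_size: int, kernel: int, stride: int = 1, pad: int = 0) -> int:
    """Spatial output size of a convolution or pooling layer."""
    return (in_size - kernel + 2 * pad) // stride + 1

# Example: a 32x32 input through a 5x5 conv (stride 1, no padding),
# then a 2x2 max-pool with stride 2.
after_conv = out_size(32, kernel=5)                    # 28
after_pool = out_size(after_conv, kernel=2, stride=2)  # 14
print(after_conv, after_pool)
```

The same rule applies per spatial dimension; with padding of (kernel - 1) // 2 and stride 1, the size is preserved.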
Wonderful lecture! Thank you.
Lecture starts @ 2:26
Lecture starts @ 3:38
Start at 12:42
thanks!
Lecture starts at 1:15
great, carefully thought-out original course; was watching this leisurely and didn't realise an hour went by
This is a very good teacher. He knows how to explain things to students very well
Your teaching unravels the exact concept that is missed by most tutors. Thanks for the great lecture ❤
Thank you for great lecture! ps/ The stick you're holding is impressive.
These lectures are some of the best on the 'net along with Andrew Ng's lectures on Deep Learning. Mad props to the instructor who takes the time to go through the concepts. I wish I had access to the quizzes and group discussions.
What does "We have the id hiyore" mean?
Thank you again to Carnegie Mellon University & Bhiksha Raj. I find these lectures fascinating.
Couldn't help but think of 3B1B videos on Hamming codes watching this.
Loving this series! Such a talented lecturer.
thanks for sharing! :) how can I find the rest of the lectures of the bootcamp? thanks again for such a nice job!
30:12 question: When he says h1, h2 and h3 are k1, k2 and k3, he means that h1, h2 and h3 are hidden layers of a neural network, right?
Amazing lecture as usual, thank you! 2 Cents from a German: Nouns (apple, name) start with a capital letter, so you would write "Apfel" and "Name"...but very happy you have chosen German in this example ;-)
oh wow
Lecture starts at 5:42
Thanks.
Excellent derivation
this lecture should come after lecture 23 - i.e. the videos labeled "18" should come before "17"
17:28 why do three rows mean three in_channels? I would expect that to just be the height of the input
This is when treating the data as 1D with 3 channels. There isn't a notion of height in 1D.
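A minimal NumPy sketch of the point in this reply (illustrative shapes, not code from the course): in 1-D convolution each filter spans every input channel, so three rows act as three channels rather than a height dimension.

```python
import numpy as np

# 1-D input with 3 channels ("rows"), length 8: shape (in_channels, length).
x = np.arange(24, dtype=float).reshape(3, 8)

# A single 1-D filter covers ALL input channels: shape (in_channels, kernel_size).
w = np.ones((3, 3))

kernel = w.shape[1]
out_len = x.shape[1] - kernel + 1  # "valid" convolution, stride 1
y = np.array([(x[:, t:t + kernel] * w).sum() for t in range(out_len)])
print(y.shape)  # (6,) - one output channel, no height dimension
```

Each output position is a sum over all three channels and the kernel width at once, which is why the row count is the channel count in the 1-D view.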
Thanks for posting the recitations too, you are truly transforming people's lives like mine ❤️
Thanks.
This is a gem. Made my fundamentals solid. Thanks :)
Thank you for putting this lecture online! This lecture should be number 18 😊
Is the l*d matrix for one word or the complete sentence? Can you explain?
These recitation videos and the gigantic amount of Python content that goes with them are much appreciated by self-learners, thanks TAs
CMU (like other top US colleges) seems to be cashing in on its brand and admitting tons of below average students into its Master's programs
The stock predictor network slide 14 has the output Y(t+6) for the state t+7. I guess it's a typo. ruclips.net/video/2-c1kaxUnmk/видео.html
Best lectures on DL
Can we get the assignment questions (both Part 1 and Part 2)?
Hi, I was wondering if the Part 1 of each homework will be open to students outside CMU this term? As a self-learner, I think implementing NNs from scratch using PyTorch would be very interesting, but there are very few tutorials on this.