Based on 10 Users
TOP TAGS
- Has Group Projects (2)
Grade distributions are collected using data from the UCLA Registrar’s Office.
Sorry, no enrollment data is available.
I came into the class excited but somewhat unsure of what I was getting myself into. It turns out to be tough but rewarding.
The class starts with a linear algebra refresher, which can be a bit hard to follow if you don't have prior experience with it. That said, it's just there to get everyone on the same mathematical footing before diving into neural networks. The lectures gradually build from linear associators and perceptrons to backpropagation, all framed through a very math-focused, linear-algebra lens. While the rigor is great, I do think the early lectures could benefit from more visualizations or diagrams (sometimes staring at equations makes the intuition a bit harder to grasp). However, the sections on biologically inspired models (e.g. lateral inhibition, Hopfield networks, some of the history behind associative memory models) were especially interesting, since they connected classical ideas to neuroscience, modern applications, and ongoing research.
Take the programming prerequisite seriously. All six coding assignments assume you can independently debug and write working code. The professor makes it very clear that he cannot help troubleshoot code, and assignments that don’t compile do receive zeroes. The syllabus says Python or C++ are both allowed, but realistically, given the timeline, I think Python is the way to go unless you’re very comfortable in C++ already.
The grading is based on 1 mini quiz, 6 homework assignments, a final exam, and a final project. The quiz and homeworks were straightforward, but the final exam was tougher than expected, partly because we weren’t given much indication of what content would be emphasized, and it included material from the assigned readings.
The class is quite doable if you come in with linear algebra and some basic machine learning experience. Things start getting difficult once the final project begins. You get a lot of creative freedom: you can explore any topic you're interested in and build something, whether that's a research idea you want to develop or something simpler just to pass the class. But there's a noticeable gap between what we learned in lecture (up to feedforward networks, implemented via vector/matrix operations) and what was expected in the project (using and improving open-source models, e.g. pre-trained CNNs, LSTMs). We were never directly taught how to use tools like PyTorch, Keras, scikit-learn, or data-processing libraries, so some background in data science definitely helps. Despite the difficulty, the project experience was very fulfilling. The professor conducts a Q&A after each final presentation that is intentionally intimidating: he tries to poke holes in your work in a way that simulates a real academic defense. I found that part especially valuable. Our TA (Zoe) was also very supportive, often staying after class to help answer questions.
Overall, this is a challenging but meaningful class. You will get out of it what you put in. Both the professor and the TA are obviously knowledgeable and happy to help. If you’re willing to learn independently, wrestle with new tools, and push yourself, you’ll walk away with a deeper appreciation for classical/modern neural networks and the satisfaction of building something you didn’t know you could build.
Only take it if you're really interested in NNs or need it as a requirement. Basically, you'll be on your own; the TA can be helpful with programming questions, and the professor can help with some conceptual questions (probably far detached from what you actually should know about the subject). Read homework specs carefully, as you'll be graded on things that weren't even in them (but somehow are on the grading rubric). Overall, grading was arbitrary, especially for the final project and the final grade. I seriously believe the professor graded based on the impressions he randomly got from us during final presentations and/or office hours. You'll receive no legitimate feedback about your final grade, even if you ask.