   ### 11-785 Spring 2023 Recitation 0B: Fundamentals of NumPy (Part 3/8)

We’re going to be looking at pivoting data and more specifically reshaping numpy arrays. So what is a reshape operation? A reshape operation is used to change the shape of a numpy array without altering the values or the number of elements in the array. So we’re first going to look at reshaping within the […]
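The reshape operation described above can be sketched as follows (a minimal illustration, not taken from the recitation itself):

```python
import numpy as np

# A reshape changes the array's shape without altering its values
# or its total number of elements (12 elements throughout here).
a = np.arange(12)          # shape (12,)
b = a.reshape(3, 4)        # shape (3, 4)
c = a.reshape(2, 2, 3)     # shape (2, 2, 3)
print(b.shape, c.shape)
```

Because the element count is preserved, `a.reshape(5, 3)` would raise an error: 12 elements cannot fill a 5-by-3 array.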

### 11-785 Spring 2023 Recitation 0G: Debugging and Visualisation (Part 2/3)

Hi everyone, this is part 2 of Recitation 0G, debugging of deep neural networks. In this video, we’ll go through three types of basic coding errors: syntax errors, logic errors, and runtime errors. After following the tips in this section, you should be able to get your code running smoothly. Firstly, the most common beginner […]
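The three error types mentioned above can be illustrated with small, hypothetical snippets (these examples are ours, not from the recitation):

```python
# 1. Syntax error: caught before the code runs at all, e.g.
#        def f(x:
#    fails with SyntaxError (unclosed parenthesis).

# 2. Runtime error: the code is valid but raises while executing.
try:
    _ = [1, 2, 3][10]                # IndexError at runtime
except IndexError as e:
    runtime_error = type(e).__name__

# 3. Logic error: the code runs without complaint but gives a wrong answer.
def mean(xs):
    return sum(xs) / (len(xs) - 1)   # bug: should divide by len(xs)

wrong = mean([2.0, 4.0, 6.0])        # 6.0 instead of the correct 4.0
print(runtime_error, wrong)
```

Logic errors are the hardest of the three to find, since nothing crashes; tests and assertions are the usual defense.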

### #23 Machine Learning Specialization [Course 1, Week 2, Lesson 1]

I remember when I first learned about vectorization, I spent many hours on my computer taking an unvectorized version of an algorithm, running it, seeing how long it took, and then running a vectorized version of the code and seeing how much faster that ran. And I just spent hours playing with that, and it […]
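The experiment described above can be reproduced in a few lines (a sketch under our own assumptions; the variable names and sizes are illustrative, not from the course):

```python
import time
import numpy as np

n = 1_000_000
w = np.random.rand(n)
x = np.random.rand(n)

# Unvectorized dot product: one multiply-add per Python loop step.
start = time.time()
total = 0.0
for i in range(n):
    total += w[i] * x[i]
loop_time = time.time() - start

# Vectorized dot product: a single optimized library call.
start = time.time()
total_vec = np.dot(w, x)
vec_time = time.time() - start

assert np.isclose(total, total_vec)   # same result either way
print(f"loop: {loop_time:.3f}s  vectorized: {vec_time:.4f}s")
```

On typical hardware the vectorized version is orders of magnitude faster, which is exactly the gap the lecture goes on to explain.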

### 6 Tips to write BETTER For Loops in Python

Hi everyone, I’m Patrick, and in this video I show you six tips on how you can write better for loops in Python. These tips include some refactorings that you can apply right away to improve your code. So we start with a few very beginner friendly tips and then also move on to some advanced […]
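One common for-loop refactoring of the kind such videos cover is replacing index-based iteration with `zip` (this example is ours, not taken from the video):

```python
names = ["ada", "grace", "alan"]
scores = [90, 95, 88]

# Index-based style: works, but noisy and easy to get wrong.
pairs_indexed = []
for i in range(len(names)):
    pairs_indexed.append((names[i], scores[i]))

# Preferred: iterate over both sequences directly with zip.
pairs_zipped = [(name, score) for name, score in zip(names, scores)]

assert pairs_indexed == pairs_zipped
print(pairs_zipped)
```

When the index itself is needed, `enumerate(names)` plays the same role without a manual counter.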

### Encoding a Feature Vector for PyTorch Deep Learning (4.1)

Welcome to applications of deep neural networks and PyTorch with Washington University. In this part, we’re going to take a look at how to encode tabular data for PyTorch. Tabular data is data that easily fits in something like Microsoft Excel. It’s not the slickest application of deep neural networks that are often doing computer […]
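A minimal sketch of turning a categorical column of tabular data into a numeric feature matrix is shown below (the column values are made up for illustration; the resulting array could then be handed to PyTorch via `torch.from_numpy`):

```python
import numpy as np

# One-hot encode a categorical column into float32 features.
colors = ["red", "green", "blue", "green"]
categories = sorted(set(colors))                 # ['blue', 'green', 'red']
index = {c: i for i, c in enumerate(categories)}

one_hot = np.zeros((len(colors), len(categories)), dtype=np.float32)
for row, c in enumerate(colors):
    one_hot[row, index[c]] = 1.0                 # exactly one 1 per row

print(one_hot)
```

Each row of `one_hot` represents one record, with a single 1 marking its category, which is a standard way to feed non-numeric tabular columns to a neural network.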

### MaDL – Eigenvalue and Singular Value Decomposition

Many mathematical objects can be better understood by breaking them into parts. An eigen decomposition decomposes a matrix into so-called eigenvectors and eigenvalues.
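The decomposition described above, A = Q diag(λ) Q⁻¹, can be verified numerically (the 2×2 matrix here is our own example, not from the lecture):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of Q are the eigenvectors; lam holds the eigenvalues.
lam, Q = np.linalg.eig(A)

# Reassembling the parts recovers the original matrix.
A_rebuilt = Q @ np.diag(lam) @ np.linalg.inv(Q)
assert np.allclose(A, A_rebuilt)
print(sorted(lam))   # eigenvalues of A: 1 and 3
```

Breaking A into eigenvectors and eigenvalues like this exposes properties, such as definiteness, that are hard to read off the raw entries.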

### 11-785, Fall 22 Lecture 23: Generative Adversarial Networks (Part 1)

This first lecture on GANs is also the first lecture of the semester on generative models. We have seen discriminative models, which model the conditional distribution: a discriminative model aims to find a decision boundary that separates one set of data from another. In generative models, by contrast, the aim is to find the distribution of the data itself, not just the boundary.

In this unit, we will see how we can add and multiply matrices and vectors. We add or subtract vectors or matrices element-wise. Here on the left we can see an example for a vector, where the input vectors A and B are summed element-wise to yield each element […]
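Element-wise addition as described above looks like this in code (the values are illustrative):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([10.0, 20.0, 30.0])

# Each element of c is the sum of the corresponding elements of a and b.
c = a + b
print(c)   # [11. 22. 33.]
```

Subtraction works the same way with `a - b`, and both require the operands to have compatible shapes.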