Machine Learning for Earthquake Seismology with Karianne Bergen – #554

Today we’re joined by Karianne Bergen, an assistant professor at Brown University. In our conversation with Karianne, we explore her work at the intersection of earthquake seismology and machine learning, where she’s working on interpretable data classification for seismology. We discuss some of the challenges that present themselves when trying to solve this problem, and […]

A Universal Law of Robustness via Isoperimetry with Sebastien Bubeck – #551

Today we’re joined by Sebastien Bubeck, a senior principal research manager at Microsoft and author of the paper A Universal Law of Robustness via Isoperimetry, a NeurIPS 2021 Outstanding Paper Award recipient. We begin our conversation with Sebastien with a bit of a primer on convex optimization, a topic that hasn’t come up much in […]

Google's Incredible New Quantum Computing Company – Sandbox

Sandbox is the newest quantum computing company to emerge from Google, focusing on the newly discovered time crystals, which are poised to revolutionize computing in terms of efficiency and performance. Sandbox is separate from Google’s quantum computing team in Santa Barbara, and focuses on software and experimental quantum projects. The unit is currently led […]

The New DBfication of ML/AI with Arun Kumar – #553

Today we’re joined by Arun Kumar, an associate professor at UC San Diego. We had the pleasure of catching up with Arun prior to the Workshop on Databases and AI at NeurIPS 2021, where he delivered the talk “The New DBfication of ML/AI.” In our conversation, we explore this “database-ification” of machine learning, a concept […]

Applications of Deep Neural Networks Course Overview (1.1, Spring 2022)

Spring 2022 Version. Applications of deep neural networks is a course offered in a hybrid format by Washington University in St. Louis. This course introduces Keras deep neural networks and highlights applications that neural networks are particularly adept at handling compared to previous machine learning models. Deep learning is a group of exciting new technologies […]

Building Public Interest Technology with Meredith Broussard – #552

Today we’re joined by Meredith Broussard, an associate professor at NYU & research director at the NYU Alliance for Public Interest Technology. Meredith was a keynote speaker at the recent NeurIPS conference, and we had the pleasure of speaking with her to discuss her talk from the event, and her upcoming book, tentatively titled More […]

TWiML & AI x Fast.ai Machine Learning Study Group – Session 3 – October 21, 2018

**SUBSCRIBE AND TURN ON NOTIFICATIONS** **twimlai.com** This video is a recap of our Fast.ai x TWiML Online Machine Learning Study Group. In this session, we review Lesson 3, Performance, Validation and Model Interpretation. It’s not too late to join the study group. Just follow these simple steps: 1. Sign up for the TWiML Online Meetup, […]

DeepMind: The Podcast with Hannah Fry – Season 2 coming soon!

The chart-topping podcast which uncovers the extraordinary ways artificial intelligence (AI) is transforming our world is back for a second season. Join mathematician and broadcaster Professor Hannah Fry behind the scenes of world-leading AI research lab DeepMind to get the inside story of how AI is being created – and how it can benefit our […]

AI for Ecology and Ecosystem Preservation with Bryan Carstens – #449

Today we’re joined by Bryan Carstens, a professor in the Department of Evolution, Ecology, and Organismal Biology & Head of the Tetrapod Division in the Museum of Biological Diversity at The Ohio State University. In our conversation with Bryan, who comes from a traditional biology background, we cover a ton of ground, including a foundational […]

Trends in NLP with John Bohannon – #550

Today we’re joined by friend of the show John Bohannon, the director of science at Primer AI, to help us showcase all of the great achievements and accomplishments in NLP in 2021! In our conversation, John shares his two major takeaways from last year, 1) NLP as we know it has changed, and we’re back […]

TWiML x CS224n Study Group – Lesson 3

This is a recording of the CS224n Study Group on Lecture 2, presented by Avinash Kappa. It’s not too late to join the study group. Just follow these simple steps: 1. Head over to twimlai.com/meetup, and sign up for the programs you’re interested in, including either the CS224n study group or our Monthly Meetup groups. […]

Trends in Computer Vision with Georgia Gkioxari – #549

Happy New Year! We’re excited to kick off 2022 joined by Georgia Gkioxari, a research scientist at Meta AI, to showcase the best advances in the field of computer vision over the past 12 months, and what the future holds for this domain. Welcome back to AI Rewind! In our conversation Georgia highlights the emergence […]

MirroredStrategy demo for distributed training

Google Cloud Developer Advocate Nikita Namjoshi demonstrates how to get started with distributed training on Google Cloud. Learn how to distribute training across multiple GPUs within a managed Jupyter Lab environment. Intro to distributed training → https://goo.gle/3FolcIz Creating and managing GCP projects → https://goo.gle/3mDZ4m7 Vertex AI → https://goo.gle/3FzfmUU Cloud Storage Quickstart → https://goo.gle/3qsNWcN Notebook Executor […]
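The core idea behind a mirrored strategy can be sketched in plain Python: each replica computes gradients on its own shard of the batch, and the gradients are then averaged (an all-reduce) so every replica applies the identical update. This is a conceptual illustration only, not the TensorFlow API; the toy loss, learning rate, and replica count are assumptions for the sketch:

```python
# Conceptual sketch of synchronous data-parallel training, as performed by
# strategies like tf.distribute.MirroredStrategy: shard the batch across
# replicas, compute per-replica gradients, then all-reduce (average) them.

def grad(w, x, y):
    # Gradient of the squared error (w*x - y)^2 with respect to w.
    return 2 * (w * x - y) * x

def mirrored_step(w, batch, lr=0.1, num_replicas=2):
    # Shard the global batch evenly across replicas.
    shard = len(batch) // num_replicas
    shards = [batch[i * shard:(i + 1) * shard] for i in range(num_replicas)]
    # Each replica averages gradients over its local shard.
    replica_grads = [sum(grad(w, x, y) for x, y in s) / len(s) for s in shards]
    # All-reduce: average the per-replica gradients so every replica
    # applies the same weight update.
    g = sum(replica_grads) / num_replicas
    return w - lr * g

# Toy data with true slope 2.0; repeated steps converge w toward it.
batch = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
w = 0.0
for _ in range(100):
    w = mirrored_step(w, batch)
```

Because the gradients are averaged before any replica updates, all replicas stay in lockstep, which is what makes this scheme "mirrored."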

061: Interpolation, Extrapolation and Linearisation (Prof. Yann LeCun, Dr. Randall Balestriero)

We are now sponsored by Weights and Biases! Please visit our sponsor link: http://wandb.me/MLST Yann LeCun thinks that it’s specious to say neural network models are interpolating, because in high dimensions everything is extrapolation. Recently Dr. Randall Balestriero, Dr. Jerome Pesenti, and Prof. Yann LeCun released their paper Learning in High Dimension Always Amounts to […]

A friendly introduction to distributed training (ML Tech Talks)

Google Cloud Developer Advocate Nikita Namjoshi introduces how distributed training models can dramatically reduce machine learning training times, explains how to make use of multiple GPUs with Data Parallelism vs Model Parallelism, and explores Synchronous vs Asynchronous Data Parallelism. Mesh TensorFlow → https://goo.gle/3sFPrHw Distributed Training with Keras tutorial → https://goo.gle/3FE6QEa GCP Reduction Server Blog → […]
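The Data Parallelism vs Model Parallelism contrast from the talk can be illustrated with a single matrix multiply in NumPy: data parallelism slices the batch while every worker keeps the full weights, whereas model parallelism slices the weights while every worker sees the full batch. The sizes and the single linear "layer" here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))   # global batch: 8 examples, 4 features
W = rng.normal(size=(4, 6))   # one layer's weights: 4 -> 6

# Data parallelism: each worker holds a full copy of W but only a slice
# of the batch; partial outputs are concatenated along the batch axis.
out_data_parallel = np.concatenate(
    [shard @ W for shard in np.split(X, 2, axis=0)], axis=0
)

# Model parallelism: each worker holds the full batch but only a slice
# of W's output columns; partial outputs are concatenated along the
# feature axis.
out_model_parallel = np.concatenate(
    [X @ w_shard for w_shard in np.split(W, 2, axis=1)], axis=1
)

# Both partitionings reproduce the single-device result X @ W.
assert np.allclose(out_data_parallel, X @ W)
assert np.allclose(out_model_parallel, X @ W)
```

In practice the choice hinges on what doesn't fit: data parallelism helps when the batch is large, model parallelism when the model itself exceeds one device's memory.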

Hypergraphs, Simplicial Complexes and Graph Representations of Complex Systems – #547

Today we continue our NeurIPS coverage joined by Tina Eliassi-Rad, a professor at Northeastern University, and an invited speaker at the I Still Can’t Believe It’s Not Better! Workshop. In our conversation with Tina, we explore her research at the intersection of network science, complex networks, and machine learning, how graphs are used in her […]

[ML News] AI learns to search the Internet | Drawings come to life | New ML journal launches

#webgpt #aiart #mlnews The latest and greatest from the Machine Learning world. OUTLINE: 0:00 – Intro 0:20 – Sponsor: Weights & Biases 2:40 – WebGPT: When GPT-3 can search the Internet 15:45 – MetaAI brings children’s drawings to life 17:15 – OpenAI lets anyone fine-tune GPT-3 18:15 – New Journal: Transactions on Machine Learning Research […]

Two Minute Papers: What is Optimization? + Learning Gradient Descent | Two Minute Papers #82

Let’s talk about what mathematical optimization is, how gradient descent can solve simpler optimization problems, and Google DeepMind’s proposed algorithm that automatically learns optimization algorithms. The paper “Learning to learn by gradient descent by gradient descent” is available here: http://arxiv.org/pdf/1606.04474v1.pdf Source code: https://github.com/deepmind/learning-to-learn ______________________________ Recommended for you: Gradients, Poisson’s Equation and Light Transport – https://www.youtube.com/watch?v=sSnDTPjfBYU […]
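The gradient descent the video describes can be sketched in a few lines on a one-dimensional quadratic; the function, step size, and iteration count here are illustrative choices, not anything from the paper:

```python
# Minimal gradient descent on f(w) = (w - 3)^2, whose gradient is
# f'(w) = 2 * (w - 3). The unique minimum sits at w = 3.
def gradient_descent(w0, lr=0.1, steps=200):
    w = w0
    for _ in range(steps):
        w -= lr * 2 * (w - 3)   # step against the gradient direction
    return w

w_star = gradient_descent(w0=0.0)
# w_star converges toward 3.0, the minimizer of f.
```

The "learning to learn" idea in the paper replaces the fixed update rule above with one produced by a learned model, but the inner loop has this same shape.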

Google Colab features you may have missed

Nate from the Google Colab team shares lesser known, powerful features in Google Colaboratory that will allow you to dynamically explore Pandas DataFrames, view your history to see past commands you ran in your notebook, and boost your productivity. Any features you want to highlight? Let us know in the comments! 00:00 Introduction 00:08 Overview […]

[ML News] DeepMind builds Gopher | Google builds GLaM | Suicide capsule uses AI to check access

#mlnews #gopher #glam Your updates on everything going on in the Machine Learning world. Sponsor: Weights & Biases https://wandb.me/yannic OUTLINE: 0:00 – Intro & Overview 0:20 – Sponsor: Weights & Biases 3:05 – DeepMind releases 3 papers on large language models 11:45 – Hugging Face Blog: Training CodeParrot from scratch 14:25 – Paper: Pre-Training vision […]