OpenAI Codex (Bonus Episode)

Yannic Lightspeed Kilcher and Tim Scarfe play around with OpenAI Codex. We then invite Connor Leahy and Walid Saba in to get their take.

OpenAI Codex is a descendant of GPT-3; its training data contains both natural language and billions of lines of source code from publicly available sources, including code in public GitHub repositories. OpenAI Codex is most capable in Python, but it is also proficient in over a dozen languages, including JavaScript, Go, Perl, PHP, Ruby, Swift, TypeScript, and even Shell. It has a memory of 14 KB for Python code, compared with only 4 KB for GPT-3, so it can take into account over 3x as much contextual information while performing a task.
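To illustrate the kind of task Codex handles, it is typically prompted with a natural-language comment or docstring and completes the implementation. The example below is hypothetical (not from the episode): the docstring is the sort of prompt a user might write, and the body is the sort of completion Codex produces.

```python
def count_vowels(s: str) -> int:
    """Return the number of vowels in s (a, e, i, o, u, case-insensitive)."""
    # A completion of the kind Codex generates from the docstring above:
    return sum(1 for ch in s.lower() if ch in "aeiou")

print(count_vowels("OpenAI Codex"))
```

The point of models like Codex is that only the docstring needs to be written by hand; the function body is synthesized from the natural-language description.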

