Checking out GPT-J, a 6-billion-parameter GPT model from EleutherAI

GPT-J is an actually-open AI model whose weights you can download and tinker with. It’s a 6-billion-parameter large language model that can handle long-form natural language generation, write code, act as a chatbot, translate between languages, and much more.

GPT-J Text examples: https://pythonprogramming.net/GPT-J-6b-parm-transformer/
GPT-J GitHub: https://github.com/kingoflolz/mesh-transformer-jax
The Pile paper: https://arxiv.org/pdf/2101.00027.pdf
GPT-J Web demo: https://6b.eleuther.ai/
EleutherAI: https://www.eleuther.ai/
Multimodal Few-Shot Learning with Frozen Language Models: https://arxiv.org/pdf/2106.13884.pdf

Neural Networks from Scratch book: https://nnfs.io
Channel membership: https://www.youtube.com/channel/UCfzlCWGWYyIQ0aLC5w48gBQ/join
Discord: https://discord.gg/sentdex
Reddit: https://www.reddit.com/r/sentdex/
Support the content: https://pythonprogramming.net/support-donate/
Twitter: https://twitter.com/sentdex
Instagram: https://instagram.com/sentdex
Facebook: https://www.facebook.com/pythonprogramming.net/
Twitch: https://www.twitch.tv/sentdex
