Hi Harley, I agree with you on everything you said!

I'm not sure whether GPT-6 or GPT-7 will ever exist, but we'll see AIs getting closer and closer to passing the Turing test for long enough to actually fool people.

And it's also true that the datasets GPT models are trained on affect the degree of proficiency the system develops for specific tasks. For example, GPT-J, an open-source version of GPT-3, codes better despite being smaller because it was also trained on GitHub and StackExchange. (I'm writing an article about it right now.)
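
If you want to poke at GPT-J's code generation yourself, here's a minimal sketch using the Hugging Face transformers library (the prompt and generation settings are just illustrative, not anything from the article):

```python
# Minimal sketch: load the public GPT-J checkpoint and complete a code prompt.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"  # GPT-J checkpoint on the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "def fibonacci(n):"  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```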

About DeepMind's paper, I had it in my "to read" pile, but when I saw your comment I decided to read it right away, and I found it very interesting. Although it's a hypothesis paper (they don't aim to solve the problem they present), it's very appealing.

Cheers :)
