Future of AI brain dump #1
just occurred to me that current chatbot llms are the equivalent of discovering pen and paper, whereas the endgame is ephemeral ai: an ai that generates the ideal user interface for you in realtime
but to get there i see a few intermediate trends playing out:
1. llms start to naturally incorporate images, video and audio into their responses. google has already started to do this with Gemini 3 + Nano Banana.
2. llms start speaking to each of us in our preferred “style”, e.g. if i’m trying to learn about something it might respond in the form of a Stratechery blog post, inclusive of diagrams formatted in the same style. conversely, if you’re into tiktoks, the ai will spit out a 30-second genAI video explaining the concept. whatever floats your boat. (a rough code sketch of this follows the list.)
3. action-based. the llm responds with things-to-do based on what i’ve asked it, e.g. it could create a custom agent app that automatically pops up on my phone or device and manages my investment portfolio based on the specific trade i was just exploring with it. it just plugs into my brokerage account. (also sketched below.)
4. predictive-based. this is the ephemeral part, where all of the above happens before i even ask for it. whether it’s an app, a video, a blog post or something completely new, the ai dreams it up and spits it out in microseconds. a machine finely attuned to your every need and thought. (loop sketched below.)
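to make trend 2 concrete, here’s a minimal python sketch of style-conditioned generation. everything in it is invented for illustration: the STYLE_PROFILES table and the llm() helper are stand-ins i made up, not any real api.

```python
# trend 2 sketch: the same answer, rendered in each user's preferred style.
# `llm()` is a placeholder for whatever text/video model you'd actually call.

STYLE_PROFILES = {
    "analyst": "respond as a long-form analytical blog post with section "
               "headings and simple ascii diagrams",
    "short-video": "respond as a script for a 30-second vertical video: "
                   "hook, three quick visuals, one-line takeaway",
}

def llm(prompt: str) -> str:
    # placeholder: swap in a real model call. echoes the prompt so the
    # sketch runs end to end.
    return f"[model output for: {prompt[:60]}...]"

def answer_in_my_style(question: str, user_style: str) -> str:
    # wrap the question in a style directive before generation
    directive = STYLE_PROFILES.get(user_style, "respond plainly")
    return llm(f"{directive}\n\nquestion: {question}")

print(answer_in_my_style("how do index funds work?", "short-video"))
```

the real version would presumably learn the profile from your behavior rather than a hardcoded table, but the shape is the same: preference in, directive out.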
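trend 3, sketched the same way: the model returns a structured action spec instead of prose, and the device turns it into a small ephemeral app. the schema and the deploy hook are made up for illustration; no real agent framework is implied.

```python
# trend 3 sketch: model output as a structured, approvable action spec.
import json
from dataclasses import dataclass

@dataclass
class ActionSpec:
    app_name: str      # e.g. "portfolio-rebalancer"
    trigger: str       # when the mini-app should surface on the device
    steps: list[str]   # what it proposes to do once the user approves

def parse_action(llm_json: str) -> ActionSpec:
    # validate the model's structured output before anything executes
    raw = json.loads(llm_json)
    return ActionSpec(raw["app_name"], raw["trigger"], raw["steps"])

def deploy_mini_app(spec: ActionSpec) -> None:
    # hypothetical hook where the os would render and sandbox the app
    print(f"surfacing {spec.app_name} (trigger: {spec.trigger})")
    for step in spec.steps:
        print(f"  pending approval: {step}")

# what the model's reply to a trade question might look like:
reply = json.dumps({
    "app_name": "portfolio-rebalancer",
    "trigger": "market open",
    "steps": ["read positions via brokerage api",
              "propose rebalance for the trade i was exploring",
              "await confirmation"],
})
deploy_mini_app(parse_action(reply))
```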
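and trend 4 is basically trend 3 running in a background loop: guess the next need from recent context, pre-generate the artifact, and have it ready before the question lands. the prediction and generation calls below are stubs that only show the shape of the loop, not how a real system would infer intent.

```python
# trend 4 sketch: predict, pre-generate, serve instantly.
from typing import Optional

def predict_next_need(recent_context: list[str]) -> Optional[str]:
    # stub: a real system would infer intent from on-device signals
    if recent_context and "trade" in recent_context[-1]:
        return "one-screen summary of the trade's downside risk"
    return None

def pregenerate(need: str) -> str:
    # stub: render the artifact (app, video, post) for instant display
    return f"[pre-built artifact: {need}]"

cache: dict[str, str] = {}

def background_tick(recent_context: list[str]) -> None:
    need = predict_next_need(recent_context)
    if need and need not in cache:
        cache[need] = pregenerate(need)  # ready before the user asks

background_tick(["looked up index funds", "explored a trade"])
print(cache)
```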