OpenAI Dev Day reflections
I'm most excited about
- Longer context windows
- Lower prices
- Custom GPTs. I've got ideas that range from silly to serious that I can't wait to try out
- Text to speech – can't wait to try it out (quick sketch below)
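As a note to self, here's roughly what the new text-to-speech call looks like – a minimal sketch assuming the v1.x `openai` Python SDK and the `tts-1` model from the keynote; the input text and output filename are just placeholders:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Generate spoken audio from text with the tts-1 model.
speech = client.audio.speech.create(
    model="tts-1",
    voice="alloy",  # one of the built-in voices
    input="OpenAI Dev Day reflections, read aloud.",
)

# Write the MP3 bytes to disk.
speech.stream_to_file("reflections.mp3")
```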
Questions
- How should I be thinking about documents at the assistant level vs. the thread level? (A sketch of the two attachment points follows this list.)
- I assume the assistant level is more “core” knowledge… but how will this impact performance?
- Should I be managing the interplay here between “core” things and stuff that just passes through the threads?
- When would I choose the Chat Completions API instead of the Assistants API? (A stateless Chat Completions call is sketched below for contrast.)
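To make the assistant-level vs. thread-level question concrete, here's a minimal sketch assuming the beta Assistants API as shipped at Dev Day (v1.x `openai` Python SDK, `retrieval` tool); the file names and instructions are placeholders:

```python
from openai import OpenAI

client = OpenAI()

# Assistant-level file: "core" knowledge, available to every thread
# that runs against this assistant.
handbook = client.files.create(file=open("handbook.pdf", "rb"), purpose="assistants")
assistant = client.beta.assistants.create(
    model="gpt-4-1106-preview",
    instructions="Answer questions using the attached handbook.",
    tools=[{"type": "retrieval"}],
    file_ids=[handbook.id],
)

# Thread-level file: attached to a single message, so it only informs
# this one conversation as it passes through.
memo = client.files.create(file=open("memo.pdf", "rb"), purpose="assistants")
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="How does this memo square with the handbook?",
    file_ids=[memo.id],
)

run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
```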
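For contrast, the Chat Completions API is stateless: no threads, no built-in retrieval, and I manage the message history myself on every call. A minimal sketch with the same SDK (model and prompts are placeholders):

```python
from openai import OpenAI

client = OpenAI()

# Stateless: the full conversation history is sent on every call,
# and nothing is stored server-side between calls.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the key Dev Day announcements."},
    ],
)
print(response.choices[0].message.content)
```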
Observations
- Between Chat Completions / Assistants / custom GPTs, there are a lot of different levels to play in
- The value layers (to me) seem to be 1) what unique data you can bring to the model from the real world, and 2) what you can make it easy for the model to do in the real world (although this second one is less well understood so far).
- If I'm understanding custom actions correctly (and it's possible I'm not), a custom action can only be leveraged by the custom GPT that creates it. So, as an example, only developers with access to the Instacart API could make a ChefGPT that creates orders on Instacart. If this is the case, I think it's a mistake. If I have a service that can do interesting things in the real world (e.g., Instacart), I want to be a tool that can be leveraged by ~all custom GPTs, not just the custom GPT that I create. I suspect this will get changed over time.
I wish
2023-11-07