Date/time variable for system prompt #2244
I can't get mine to tell me the time. It says it needs access to the internet to do this?
I second the question here. Currently I use AutoHotKey to give me a timestamp with a simple key combination. The reason I always want a timestamp after my prompt is simply that this way I can give the LLM an idea of the current time. Another reason is that I myself keep track of when a particular chat took place. So it's not that the LLM would tell me the time, but that the information would be passed automatically with the prompt, as part of it.

[prompt] → once I press Enter, the client adds the timestamp before my prompt is passed to the LLM → [adjusted-prompt-by-client]

The timestamp would be part of my chat messages and visible in them.
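A minimal sketch of that kind of client-side injection in Python (the `add_timestamp` helper is hypothetical; any client or wrapper script that can hook the outgoing message could do the same):

```python
from datetime import datetime

def add_timestamp(prompt: str) -> str:
    """Prepend the local date/time so it travels with the prompt."""
    # astimezone() attaches the local timezone so %Z prints its name
    stamp = datetime.now().astimezone().strftime("%Y-%m-%d %H:%M %Z")
    return f"[{stamp}] {prompt}"

# [prompt] -> [adjusted-prompt-by-client]
adjusted = add_timestamp("What should I wear today?")
print(adjusted)  # e.g. "[2025-06-01 09:30 CEST] What should I wear today?"
```

This keeps the timestamp visible in the chat history, since it becomes part of the message text rather than hidden metadata.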
Hi,
I was wondering if there is some way to have a date/time pre-prompt to make LLMs aware of the user's moment in time.
LLMs can answer much more naturally when they know when and where (location) they are.
For example, if an LLM knows it is December 1 and I ask "what should I wear", it would be inclined to assume cold weather and therefore come up with realistic answers. Not knowing the date, by contrast, falls back on training bias: the answer reflects however much training data happened to exist for a given date. If I ask about food, it will go for winter things. If I ask for ideas of something to do, it would think of indoor or warm activities. It would certainly be less inclined to propose a walk in the park.
The same goes if I ask when the sun will come up, or what time it is in New York. All of these could be answered and taken into account.
As for location, that would help the LLM provide information relevant to the user's location.
If I ask "How much does a human weigh", it would answer in pounds if the location is in the USA, or in kilograms if the location is somewhere in Europe.
This is a great example of how date and location affect how an LLM answers even when the question is not date- or location-related.
For the LLM there is a relation anyway, and it would feel much smarter because its results would be better tuned to your location and moment.
Therefore my question is: how could we best integrate a date/time/location variable? In the prompt template? In the system prompt?
Do you already have a variable that could be used to achieve this?
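Absent a built-in variable, a minimal sketch of the system-prompt approach (the `{datetime}` and `{locale}` placeholder names are assumptions here, not an existing template syntax of any particular client):

```python
import locale
from datetime import datetime

def build_system_prompt(template: str) -> str:
    """Fill date/time and locale placeholders at request time."""
    now = datetime.now().astimezone()
    return template.format(
        datetime=now.strftime("%A, %B %d %Y, %H:%M %Z"),
        # getlocale() may return (None, None) if no locale is set
        locale=locale.getlocale()[0] or "unknown",
    )

template = (
    "You are a helpful assistant. The current date and time is {datetime}. "
    "The user's locale is {locale}; prefer local units and conventions."
)
print(build_system_prompt(template))
```

Rebuilding the system prompt per request (rather than once at startup) matters for long-running chats, since the date can change mid-session.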