Green GPT
Team name / Company name: Hyeny
Team leader: Tomáš Kroupa
Challenge: no. 3: EcoGen: Unleash the Sustainable Power of AI
Problem: Improving the sustainability of large generative AI models by optimising inference costs.
Solution: Our solution is an application that automatically caches frequently occurring user prompts
and serves them from a database during later prompting. This results in lower electrical energy consumption,
faster replies, and a more pleasant UI for inexperienced users.
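As an illustration, the sketch below shows one way such a cache could work: prompts are embedded with a sentence encoder, and a new prompt is answered from the cache when it is sufficiently similar to a previously answered one. The function names, the all-MiniLM-L6-v2 encoder, and the similarity threshold are assumptions made for illustration, not the exact implementation from the repository.

```python
# Minimal sketch of semantic prompt caching (illustrative names and threshold).
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")

# In-memory stand-in for the prompt+output database.
cache_embeddings: list[np.ndarray] = []
cache_outputs: list[str] = []

SIMILARITY_THRESHOLD = 0.9  # assumed cut-off for treating two prompts as equivalent


def cached_completion(prompt: str, call_model) -> str:
    """Return a cached answer for a semantically similar prompt, or query the model."""
    query = encoder.encode(prompt, normalize_embeddings=True)
    if cache_embeddings:
        sims = np.stack(cache_embeddings) @ query  # cosine similarity (normalized vectors)
        best = int(np.argmax(sims))
        if sims[best] >= SIMILARITY_THRESHOLD:
            return cache_outputs[best]  # cache hit: the model is never called
    answer = call_model(prompt)  # cache miss: fall back to the generative model
    cache_embeddings.append(query)
    cache_outputs.append(answer)
    return answer
```

On a cache hit the generative model is never invoked, which is where the energy saving and the faster reply come from.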
Impact: Our proposed solution will save electrical energy used by generative AI models.
The amount of saved energy is expected to grow as the popularity of these tools
continues to increase. The proposed solution scales to any generative AI model.
Feasibility: The solution could be implemented as an internal component of any LLM or as a browser extension backed by a cloud-hosted database of cached prompts and model outputs.
The expected development time is 4 months with a budget of €50,000.
What you built: A browser application imitating the ChatGPT UI, with automatic prompt suggestions implemented.
https://github.com/Rigos0/LLM_inference_Greenhack
What you had before: We had an idea for the prompt+output database.
We use the OpenAI API to communicate with the model, the transformers library for context-based prompt suggestions with sentence encoders, and Flask for the web application.
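As a rough illustration of how these pieces could fit together, the sketch below shows a Flask endpoint that returns prompt suggestions by comparing the user's partial input against previously cached prompts with a sentence encoder. The route name, the example prompts, and the encoder model are assumptions for illustration only, not the exact code from the repository.

```python
# Hedged sketch of the prompt-suggestion flow (Flask + sentence encoder).
import numpy as np
from flask import Flask, jsonify, request
from sentence_transformers import SentenceTransformer

app = Flask(__name__)
encoder = SentenceTransformer("all-MiniLM-L6-v2")

# Stand-in for the cached prompt database; in the real application these
# would come from previously answered prompts.
known_prompts = [
    "Summarise this article in three sentences.",
    "Explain quantum computing to a beginner.",
    "Write a polite follow-up email to a client.",
]
known_embeddings = encoder.encode(known_prompts, normalize_embeddings=True)


@app.route("/suggest", methods=["POST"])
def suggest():
    """Return the cached prompts most similar to the user's partial input."""
    partial = request.json["text"]
    query = encoder.encode(partial, normalize_embeddings=True)
    similarities = known_embeddings @ query          # cosine similarity
    top = np.argsort(similarities)[::-1][:3]         # three closest prompts
    return jsonify({"suggestions": [known_prompts[i] for i in top]})
```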
What comes next: We would like to pitch the solution to LLM providers. It can be deployed either internally in the UI or as a browser extension. We would like to improve the user experience of writing prompts and make large generative models more eco-friendly.
Video: https://youtu.be/mC6s2D3xzDs