A Super Coder for LiteLLM

AI-Powered Function Generator

This project is an AI-powered tool that generates Python functions from natural language prompts. It calls large language models (LLMs) such as GPT-3.5 and GPT-4 through LiteLLM to interpret the user's intent and generate code that meets the specified requirements.

Introduction

The AI-Powered Function Generator is designed to streamline the process of writing Python functions. Instead of manually coding each function from scratch, developers can provide a high-level description of what they want the function to do, and the tool will automatically generate the corresponding Python code.

This project aims to improve developer productivity, reduce coding errors, and enable rapid prototyping. By leveraging the capabilities of advanced language models, it can generate functions that adhere to best practices and coding standards.
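
At its core, the generator is a loop around a single LiteLLM chat completion call. The sketch below shows that core call in isolation, using the same litellm.completion API the full script relies on; the model name and prompt here are only illustrative.

    import litellm

    # Core idea in isolation: send a natural language description to an LLM
    # through LiteLLM and get Python source code back. The model and prompt
    # are illustrative; the full script adds guidance, testing, and retries.
    prompt = "Write a Python function that returns the n-th Fibonacci number."

    response = litellm.completion(
        model="gpt-4-turbo-preview",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.7,
    )

    print(response.choices[0].message.content)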

Benefits

  • Increased Productivity: The AI-Powered Function Generator saves developers time and effort by automatically generating Python functions based on natural language descriptions. This allows developers to focus on higher-level tasks and problem-solving.

  • Reduced Errors: Because each function is generated by a large language model and then executed as a test before it is accepted, it is less likely to contain syntax errors or simple logical mistakes. The generation prompt also steers the model toward established best practices and coding standards.

  • Rapid Prototyping: With the ability to quickly generate functions from high-level descriptions, developers can rapidly prototype and iterate on their ideas. This accelerates the development process and enables faster feedback loops.

  • Improved Code Quality: The generation prompt enforces established coding conventions (descriptive naming, docstrings, PEP 8), so the produced code tends to be clean, readable, and easier to maintain.

  • Continuous Learning: The tool keeps a conversation history and feeds previous code, errors, and feedback back into its prompts, so successive generations align more closely with the specific needs and preferences of the development team.

Usage Examples

  1. Generating a Function to Calculate the Factorial of a Number:

    • Prompt: "Generate a Python function that calculates the factorial of a given positive integer."
    • Generated Function:
      def factorial(n):
          if n < 0:
              raise ValueError("Factorial is not defined for negative numbers.")
          if n == 0:
              return 1
          return n * factorial(n - 1)
  2. Generating a Function to Reverse a String:

    • Prompt: "Create a Python function that takes a string as input and returns the reversed string."
    • Generated Function:
      def reverse_string(string):
          return string[::-1]
  3. Generating a Function to Find the Maximum Element in a List:

    • Prompt: "Write a Python function that finds and returns the maximum element in a given list of numbers."
    • Generated Function:
      def find_max(numbers):
          if not numbers:
              raise ValueError("The list is empty.")
          max_num = numbers[0]
          for num in numbers:
              if num > max_num:
                  max_num = num
          return max_num
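
After pasting generated functions like the three above into a file, a quick manual check is often enough to confirm they behave as described (assuming the definitions above are in scope):

    print(factorial(5))            # 120
    print(reverse_string("hello")) # "olleh"
    print(find_max([3, 7, 1]))     # 7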

Customization and Usage

To customize and use the AI-Powered Function Generator, follow these steps:

  1. Clone the repository:

    git clone https://github.com/your-username/ai-function-generator.git
    
  2. Install the required dependencies:

    pip install -r requirements.txt
    
  3. Set up the necessary API keys and configurations:

    • Obtain API keys for the language models you want to use (e.g., GPT-3.5, GPT-4).
    • Update the config.py file with your API keys and other relevant settings (a sketch of a possible config.py follows these steps).
  4. Run the tool:

    python main.py
    
  5. Follow the prompts in the text-based user interface:

    • Enter the natural language description of the function you want to generate.
    • Specify any additional requirements or constraints.
    • Review the generated function and provide feedback if needed.
  6. Customize the code generation process (optional):

    • Modify the generate_function() function in generator.py to fine-tune the code generation process.
    • Adjust the prompts, parameters, and post-processing steps to suit your specific needs.
  7. Integrate the generated functions into your project:

    • Copy the generated functions from the tool's output and paste them into your Python codebase.
    • Ensure that the generated functions are properly integrated and tested within your project.
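
The gist itself ships as a single script and does not include a config.py, so its exact contents are up to you. A minimal sketch, assuming LiteLLM reads provider keys from environment variables and mirroring the model names used in the script, might look like this:

    # config.py -- illustrative sketch only; not part of the gist.
    import os

    # LiteLLM picks up provider credentials from the environment,
    # so setting OPENAI_API_KEY is usually enough for OpenAI models.
    os.environ.setdefault("OPENAI_API_KEY", "sk-...")

    # Model identifiers mirroring the defaults used in the script.
    GUIDANCE_MODEL = "gpt-3.5-turbo-0125"
    GENERATION_MODEL = "gpt-4-turbo-preview"
    TEMPERATURE = 0.7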

Contributing

Contributions to the AI-Powered Function Generator are welcome! If you have any ideas, suggestions, or bug reports, please open an issue on the GitHub repository. If you'd like to contribute code, please fork the repository and submit a pull request with your changes.

When contributing, please adhere to the following guidelines:

  • Follow the existing code style and conventions.
  • Write clear and concise commit messages.
  • Provide thorough documentation for any new features or changes.
  • Test your changes thoroughly before submitting a pull request.

License

This project is licensed under the MIT License. Feel free to use, modify, and distribute the code as per the terms of the license.

Acknowledgements

The AI-Powered Function Generator was inspired by the advancements in natural language processing and the potential of large language models to assist in software development. We would like to acknowledge the contributions of the open-source community and the developers of the language models used in this project.

Happy coding with the AI-Powered Function Generator! πŸš€

import litellm
import os
import random
from tenacity import retry, stop_after_attempt, wait_fixed, retry_if_exception_type
from interpreter import interpreter

comments = [
    "Generating function... πŸš€",
    "Testing function... πŸ§ͺ",
    "Oops, something went wrong! πŸ˜…",
    "Function passed the test! πŸŽ‰",
    "Getting everything together... πŸ’ͺ",
    "Debugging in progress... πŸ›",
    "Unleashing the power of LLMs! 🧠",
    "Crafting the perfect function... πŸ› οΈ",
    "Generating brilliant ideas, weaving them into elegant code... The journey begins! πŸš€",
    "Rigorously testing our creation, refining it in the crucible of experimentation πŸ§ͺ",
    "Embracing the challenges that arise, for through them we innovate πŸ’‘",
    "Eureka! Our code emerges victorious, passing every trial! Perseverance prevails πŸŽ‰",
    "Assembling the pieces, watching in wonder as they form a magnificent whole 🧩",
    "Debugging with determination, each issue resolved brings us closer to perfection πŸ”",
    "Harnessing the awe-inspiring potential of language models to push boundaries ✨",
    "Pouring heart and soul into crafting a function that is a work of art 🎨",
]

conversation_history = []


@retry(stop=stop_after_attempt(3), wait=wait_fixed(2), retry=retry_if_exception_type(litellm.exceptions.AuthenticationError))
def get_llm_response(prompt, model="gpt-4-turbo-preview"):
    print(random.choice(comments))
    try:
        response = litellm.completion(
            model=model,
            messages=[{"role": "user", "content": prompt}],
            temperature=0.7
        )
        return response.choices[0].message.content
    except litellm.exceptions.AuthenticationError as e:
        print(f"Authentication Error: {str(e)}")
        raise e


def test_function(function_code):
    try:
        print("Executing the generated function... πŸƒ")
        interpreter.auto_run = True
        output = interpreter.chat(function_code)
        print(f"Function output: {output}")
        print("Function passed the test! βœ…")
        return True, None
    except Exception as e:
        print(f"Error occurred: {str(e)} ❌")
        return False, str(e)


def generate_and_test_function(prompt, previous_code=None, previous_error=None, iteration=1):
    print(f"Generating function for prompt (Iteration {iteration}): {prompt}")

    # Append previous code and error to the prompt for context
    if previous_code and previous_error:
        prompt += f"\nPrevious code:\n{previous_code}\n\nPrevious error:\n{previous_error}\n\n"
        prompt += "Please analyze the previous code and error, and provide suggestions and insights to fix the issue."

    # Use GPT-3.5 for internal guidance
    guidance_prompt = f"Provide guidance and suggestions for generating a function based on the following prompt and conversation history:\n{prompt}\n\nConversation History:\n{conversation_history}"
    # guidance_response = get_llm_response(guidance_prompt, model="gpt-3.5-turbo")
    guidance_response = get_llm_response(guidance_prompt, model="gpt-3.5-turbo-0125")

    # Use GPT-4 for final guidance to Open Interpreter
    generation_prompt = f"""
{prompt}
Guidance from super intelligent code bot:
{guidance_response}
Please generate a Python function that satisfies the prompt and follows the provided guidance, while adhering to these coding standards:
- Use descriptive and meaningful names for variables, functions, and classes.
- Follow the naming conventions: lowercase with underscores for functions and variables, CamelCase for classes.
- Keep functions small and focused, doing one thing well.
- Use 4 spaces for indentation, and avoid mixing spaces and tabs.
- Limit line length to 79 characters for better readability.
- Use docstrings to document functions, classes, and modules, describing their purpose, parameters, and return values.
- Use comments sparingly, and prefer descriptive names and clear code structure over comments.
- Handle exceptions appropriately and raise exceptions with clear error messages.
- Use blank lines to separate logical sections of code, but avoid excessive blank lines.
- Import modules in a specific order: standard library, third-party, and local imports, separated by blank lines.
- Use consistent quotes (single or double) for strings throughout the codebase.
- Follow the PEP 8 style guide for more detailed coding standards and best practices.
"""
    generated_function = get_llm_response(generation_prompt, model="gpt-4-turbo-preview")

    print("Testing the generated function...")
    success, error = test_function(generated_function)

    # Append the generated function to the conversation history
    conversation_history.append({"role": "assistant", "content": generated_function})

    return success, error, generated_function


def save_function_to_file(generated_function, file_name):
    with open(file_name, "w") as file:
        file.write(generated_function)
    print(f"Function saved to {file_name}")


def handle_post_success_actions(generated_function):
    while True:
        print("\nOptions:")
        print("1. Modify the function further")
        print("2. Save the function to a file")
        print("3. Return to main menu")
        option = input("Enter your choice (1-3): ")
        if option == "1":
            modification_prompt = input("Enter the modification prompt: ")
            success, error, modified_function = generate_and_test_function(modification_prompt, generated_function)
            if success:
                generated_function = modified_function
            else:
                print("Modification failed. Keeping the original function.")
        elif option == "2":
            file_name = input("Enter the file name to save the function (e.g., hello_world.py): ")
            save_function_to_file(generated_function, file_name)
        elif option == "3":
            return generated_function


def main():
    initial_prompt = input("Enter the initial prompt for the development process: ")
    while True:
        print("\nMenu:")
        print("1. Generate and test a function 🎨")
        print("2. Exit πŸ‘‹")
        choice = input("Enter your choice (1-2): ")
        if choice == "1":
            run_mode = input("Select run mode:\n1. Single run\n2. Multiple runs\n3. Continuous mode\nEnter your choice (1-3): ")
            if run_mode == "1":
                success, error, generated_function = generate_and_test_function(initial_prompt)
                if success:
                    generated_function = handle_post_success_actions(generated_function)
                    initial_prompt = f"Continue developing the function:\n{generated_function}"
                else:
                    print("Function test failed. 😞")
            elif run_mode == "2":
                num_runs = int(input("Enter the number of runs: "))
                for i in range(num_runs):
                    print(f"\nRun {i+1}:")
                    success, error, generated_function = generate_and_test_function(initial_prompt)
                    if success:
                        generated_function = handle_post_success_actions(generated_function)
                        initial_prompt = f"Continue developing the function:\n{generated_function}"
                    else:
                        print("Function test failed. 😞")
            elif run_mode == "3":
                while True:
                    success, error, generated_function = generate_and_test_function(initial_prompt)
                    if success:
                        generated_function = handle_post_success_actions(generated_function)
                        initial_prompt = f"Continue developing the function:\n{generated_function}"
                    else:
                        print("Function test failed. Retrying...")
        elif choice == "2":
            print("Exiting... Goodbye! πŸ‘‹")
            break
        else:
            print("Invalid choice. Please try again. πŸ˜…")


if __name__ == "__main__":
    main()
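
The script can also be driven from another module instead of the interactive menu. A sketch, assuming the code above is saved as main.py, the dependencies (litellm, tenacity, open-interpreter) are installed, and an API key is set in the environment:

    # drive_generator.py -- illustrative sketch; assumes the gist above is
    # saved as main.py in the same directory.
    from main import generate_and_test_function, save_function_to_file

    success, error, code = generate_and_test_function(
        "Write a Python function that checks whether a string is a palindrome."
    )

    if success:
        save_function_to_file(code, "palindrome.py")
    else:
        print(f"Generation failed: {error}")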