
@mberman84
Created March 30, 2024 16:23
OpenDevin Installation
git clone https://github.com/OpenDevin/OpenDevin.git
cd OpenDevin
conda create -n od python=3.10
conda activate od
docker ps
(optional) Install Docker if it is not already installed
docker pull ghcr.io/opendevin/sandbox
export OPENAI_API_KEY={your key}
(optional; I had to install Rust) curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
(optional) restart terminal
python -m pip install -r requirements.txt
(optional) If you hit the orjson issue on macOS:
- pip uninstall orjson
- pip install --no-cache-dir --only-binary :all: orjson
uvicorn opendevin.server.listen:app --port 3000
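Note: the export line above is the bash/zsh form. If you're on Windows, a rough equivalent (my addition, not part of the original gist; adjust to your shell) would be:

set OPENAI_API_KEY=sk-your-key        (cmd.exe, current session only)
$env:OPENAI_API_KEY="sk-your-key"     (PowerShell, current session only)
setx OPENAI_API_KEY sk-your-key       (persists, but only for newly opened terminals)

Whichever shell you use, the variable has to be set in the same terminal session that later launches uvicorn.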
@tellesus

tellesus commented Apr 1, 2024

Has anyone gotten this working on Windows? Sorry, Matt, love your work, but getting this to run on Windows seems harder than on Mac.

Not yet. I made it most of the way through, but I had to set the environment variable for the OpenAI API key manually, since export isn't a thing on Windows. Then I got hung up on a bunch of modules not being installed, so I just installed them all, but now it's stuck on a module called llama_index.vector_stores, which doesn't seem to exist.
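A hedged diagnostic (not a confirmed fix): recent llama-index releases split the library into several sub-packages, so a missing llama_index.vector_stores module often just means one of those sub-packages didn't install. Listing what is actually present can narrow it down:

pip list | findstr llama        (Windows)
pip list | grep llama           (macOS/Linux)
python -m pip install -r requirements.txt --force-reinstall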

@newjxmaster

I got some issues when running the backend:
uvicorn opendevin.server.listen:app --port 3000

I tried to install requirements.txt using the tips given in the comments:

  1. python -m pipenv requirements > requirements.txt && python -m pip install -r requirements.txt
  2. pip freeze > requirements.txt

Then I used "which python", copied the path, and ran:
<the copied path> -m pip install -r requirements.txt
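On macOS/Linux the same idea fits on one line, assuming a POSIX shell, so you don't have to copy the path by hand:

"$(which python)" -m pip install -r requirements.txt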

I tried to run uvicorn opendevin.server.listen:app --port 3000 and got some errors, so I used "pipenv install" to install the dependencies I needed, since "pip install" didn't solve them. In my case they were:

pipenv install json-repair
pipenv install docker
pipenv install chromadb

Creating the lockfile took a while, so I opened one terminal per package and installed them in parallel. After it's done you will see this:
Installing dependencies from Pipfile.lock (26f40f)...
To activate this project's virtualenv, run pipenv shell.

I ran pipenv shell, which gives you this:

Loading .env environment variables...
Loading .env environment variables...
Launching subshell in virtual environment...
. /Users/saua/.local/share/virtualenvs/OpenDevin-_AgX3R2K/bin/activate
(base) saua@Nap-MacBook-Pro OpenDevin % . /Users/saua/.local/share/virtualenvs/OpenDevin-_AgX3R2K/bin/activate

Then run again: uvicorn opendevin.server.listen:app --port 3000

It should work.

@newjxmaster

(odd) ➜ OpenDevin git:(main) ✗ python --version
Python 3.10.14
(odd) ➜ OpenDevin git:(main) ✗ uvicorn opendevin.server.listen:app --port 3000
Traceback (most recent call last):
File "/usr/local/bin/uvicorn", line 8, in <module>
sys.exit(main())
^^^^^^
File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
return self.main(*args, **kwargs)

The error you're encountering when trying to run uvicorn suggests there's a mismatch between the Python version you're using in your environment (Python 3.10.14) and the Python version where uvicorn (and possibly other packages) is installed (Python 3.11). This kind of mismatch can occur if you have multiple Python versions installed on your system and the environment paths are not correctly aligned with the version you intend to use.

Here are a few steps to troubleshoot and resolve this issue:

1. Check the Active Python Environment

First, ensure that the Python environment you're currently using is the one you intend to. If you're using a virtual environment, make sure it's activated. You can activate a virtual environment with:

source /path/to/venv/bin/activate  # On Unix/macOS
.\path\to\venv\Scripts\activate    # On Windows

Replace /path/to/venv with the actual path to your virtual environment.

2. Verify uvicorn Installation

Check if uvicorn is installed in the current environment by running:

which uvicorn

or, on Windows:

where uvicorn

This command tells you the path of the uvicorn executable that's being called. It should be within your active environment's bin or Scripts directory. If it's not, you might be running uvicorn from a global installation or a different environment.
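For illustration (the paths are hypothetical), inside a correctly activated environment both executables should resolve to the same directory:

which python       # e.g. /path/to/venv/bin/python
which uvicorn      # e.g. /path/to/venv/bin/uvicorn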

3. Install uvicorn in the Correct Environment

If uvicorn is not installed in your current environment or is installed in a different Python version, you should install it in the environment you're currently using. First, ensure your environment is activated, then run:

pip install uvicorn

This ensures that uvicorn and its dependencies are aligned with your project's Python version.

4. Align Python Versions

If your project is meant to run with Python 3.10 but uvicorn is installed under Python 3.11, you might want to align the Python versions. You can either:

  • Update Your Project Environment to Use Python 3.11: Create a new virtual environment with Python 3.11, or adjust your current environment to use Python 3.11, if possible.

    # Example of creating a new virtual environment with Python 3.11
    python3.11 -m venv /path/to/new/venv
  • Install uvicorn Under Python 3.10: Ensure that you're working in a Python 3.10 environment and install uvicorn there, as shown in the previous step.

5. Use the Correct Python Executable

When running your application, ensure you're using the correct Python executable. If you have multiple versions, you might need to specify the version explicitly, for example, python3.10 or python3.11, depending on your setup.
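One way to sidestep the PATH question entirely is to launch uvicorn through the interpreter you intend to use, via uvicorn's module form (a sketch; substitute your own Python version):

python3.10 -m uvicorn opendevin.server.listen:app --port 3000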

6. Check for Environment Conflicts

If you're using tools like pyenv or conda to manage multiple Python versions, ensure they're correctly configured to use the intended Python version for your project.

By following these steps, you should be able to resolve the version mismatch and run uvicorn successfully with your project.

@FetchFast

FetchFast commented Apr 1, 2024

According to ActiveState, the dependencies are unresolvable??:
https://platform.activestate.com/Undated1946-org/OpenDevin

What I don't understand is why any of this is needed. I was expecting it all to be set up in Docker already.

@psychovelcro

Hi! Everything looks fine until I use it. Error: Oops. Something went wrong: 'NoneType' object has no attribute 'request'

@m2web

m2web commented Apr 1, 2024

I followed the steps from @dfsm, and am getting the following error:

 AGENT ERROR:
        HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/embeddings (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f2d27d0c310>: Failed to establish a new connection: [Errno 111] Connection refused'))

when issuing instructions to the UI. Thoughts?

Maybe try updating and restarting ollama? I tried to recreate your error by messing with my config.toml, but I couldn't reproduce.

Similar error: ollama/ollama#1579

Thanks @dfsm. When I issue:

curl --location 'http://localhost:11434/api/chat' \
--header 'Content-Type: application/json' \
--data '{
    "model": "llama2:latest",
    "messages": [
        {
            "role": "user",
            "content": "Why is the sky blue?"
        }
    ]
}'

I get a response.

Here is my config.toml content:

LLM_API_KEY="na"
WORKSPACE_DIR="./workspace"
LLM_BASE_URL="http://localhost:11434"
LLM_MODEL="ollama/llama2" # have also tried ollama/codellama, ollama/openhermes
LLM_EMBEDDING_MODEL="llama2" # can be "llama2", "openai", "azureopenai", or "local"
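Since the AGENT ERROR above fails on /api/embeddings rather than /api/chat, it may also be worth hitting that endpoint directly; a hedged check, assuming the stock ollama API:

curl http://localhost:11434/api/embeddings \
  -H 'Content-Type: application/json' \
  -d '{"model": "llama2", "prompt": "hello"}'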

@AgatheBauer

I managed to get it running on Windows. Here is a summary of the process, written up by GPT-4:

Comprehensive Setup Guide for OpenDevin Project on Windows 11

Prerequisites

  • Windows 11: Ensure your system is running Windows 11, for the latest WSL support.
  • WSL Installed: WSL must be installed on your Windows 11 system. Follow Microsoft's guide on installing WSL.
  • Ubuntu on WSL: Install Ubuntu from the Microsoft Store post-WSL setup. This Linux distribution is where the OpenDevin project setup occurs.
  • Node.js: Required for frontend development.
  • Python 3.11: Necessary for backend development.
  • Pipenv: For managing Python packages and environments.
  • Git: To clone the project repository.

Step-by-Step Guide

1. Setting Up WSL and Ubuntu

  • Install WSL on Windows 11 by following the official instructions.
  • Install Ubuntu from the Microsoft Store and set up your UNIX username and password upon launch.

2. Installing Node.js

  • Install NVM (Node Version Manager) in Ubuntu to manage Node.js versions:
    curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash
  • Install Node.js version 16 (or higher) using NVM:
    nvm install 16
    nvm use 16
  • Verify the Node.js installation: node --version.

3. Setting Up Python and Pipenv

  • Ensure Python 3.11 is installed: python3 --version.
  • Install Pipenv with pip:
    pip install pipenv

4. Cloning the OpenDevin Project

  • Clone the OpenDevin repository into your desired directory:
    git clone https://github.com/OpenDevin/OpenDevin.git
    cd OpenDevin

5. Backend Setup with Pipenv

  • Set up the backend environment within the OpenDevin directory:
    pipenv --python 3.11
    pipenv install
  • Activate the environment: pipenv shell.

6. Frontend Setup

  • Navigate to the frontend directory and install dependencies:
    npm install
  • Start the frontend server: npm start.

7. Running the Project

  • Follow the project's README.md for instructions on running both frontend and backend.

Additional Notes

  • Docker: If required by the project, Docker usage will be detailed in the project's documentation.
  • Troubleshooting: Refer to the project's README.md or issues section for any setup issues or compatibility concerns.

This guide provides an overview of setting up the OpenDevin project on Windows 11 using WSL with Ubuntu. Always refer to the project's official documentation for the most accurate and updated information.

@HasibTheNeWb

The term 'conda' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is
correct and try again.

@tellesus

tellesus commented Apr 1, 2024 via email

@m2web

m2web commented Apr 1, 2024

I followed the steps from @dfsm, and am getting the following error:

 AGENT ERROR:
        HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/embeddings (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f2d27d0c310>: Failed to establish a new connection: [Errno 111] Connection refused'))

when issuing instructions to the UI. Thoughts?

Maybe try updating and restarting ollama? I tried to recreate your error by messing with my config.toml, but I couldn't reproduce.

Similar error: ollama/ollama#1579

Quick question @dfsm, where should ollama be running? In WSL2 or the Windows host? Thanks!
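One hedged note (an assumption about the setup, not something confirmed in this thread): if ollama runs on the Windows host while OpenDevin runs inside WSL2, localhost inside WSL2 will not reach it under the default NAT networking; you would need to point LLM_BASE_URL at the host's IP as seen from WSL2:

grep nameserver /etc/resolv.conf        # host IP as seen from inside WSL2 (default NAT mode)
LLM_BASE_URL="http://<that-ip>:11434"   # hypothetical value for config.toml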

@chutezin

chutezin commented Apr 1, 2024

Mine won't run npm; it always returns that the npm command was not found, or sudo was not found, etc.
I'm using WSL and not sure why that is. Any idea how I can debug this?

$ /bin/bash: line 1: npm: command not found
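A hedged note: npm is not present in a fresh WSL Ubuntu; installing Node inside the WSL shell (for example via nvm, as in the Windows guide above) should make the command available:

curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash
nvm install 16 && nvm use 16
node --version && npm --version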

@HeritageGuardian

The term 'conda' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

try .conda in the top search bar

@kamkode

kamkode commented Apr 2, 2024

[screenshot]

Why am I facing this issue?

@mberman84
Author

[screenshot]

Why am I facing this issue?

I had this issue also. Are you sure the backend server is running?

@tellesus

tellesus commented Apr 2, 2024 via email

@HeritageGuardian

Keep getting this

"Oops. Something went wrong: Error condensing thoughts: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: <YOUR OP*********KEY>. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"

@HeritageGuardian

[screenshot]
Why am I facing this issue?

I had this issue also. Are you sure the backend server is running?

I had this issue as well. Just make sure your frontend is fully running.

@kamkode

kamkode commented Apr 2, 2024

[screenshots]

Even after trying every command suggested by people on YouTube and in blog posts, I still get this error.

@KamleshAlp

Keep getting this

"Oops. Something went wrong: Error condensing thoughts: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: <YOUR OP*********KEY>. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"

Yes, I provided it.

@KamleshAlp

Try to pip install npm?

On Mon, Apr 1, 2024, 16:44 Chutes wrote: "Mine won't run npm; it always returns that the npm command was not found, or sudo was not found, etc. I'm using WSL and not sure why. Any idea how I can debug this?"

Same problem here.

@KamleshAlp

[screenshot]
Why am I facing this issue?

I had this issue also. Are you sure the backend server is running?

Please help me fix this issue.

@tellesus

tellesus commented Apr 2, 2024

Everything worked but I got this error. I guess I'm setting up the API key incorrectly. I did use "set OPENAI_API_KEY={your key}" instead of "export OPENAI_API_KEY={your key}" since I'm on Windows.

Oops. Something went wrong: OpenAIException - Traceback (most recent call last): File "C:\Users\USER\anaconda3\envs\vscode\Lib\site-packages\litellm\llms\openai.py", line 376, in completion raise e File "C:\Users\USER\anaconda3\envs\vscode\Lib\site-packages\litellm\llms\openai.py", line 312, in completion openai_client = OpenAI( File "C:\Users\USER\anaconda3\envs\vscode\Lib\site-packages\openai\_client.py", line 98, in __init__ raise OpenAIError( openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

I have basically the same thing, have you solved it? This is what I'm getting:

Oops. Something went wrong: Error condensing thoughts: OpenAIException - Traceback (most recent call last): File "C:\Users\Michael\anaconda3\Lib\site-packages\litellm\llms\openai.py", line 376, in completion raise e File "C:\Users\Michael\anaconda3\Lib\site-packages\litellm\llms\openai.py", line 312, in completion openai_client = OpenAI( File "C:\Users\Michael\anaconda3\Lib\site-packages\openai\_client.py", line 98, in __init__ raise OpenAIError( openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

I've tried setting the environment variable via ENV in Windows and also using the set command in both the frontend and backend terminal instances, and haven't had any luck.
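One Windows detail worth checking (an assumption, not a confirmed diagnosis): set only affects the current cmd session, so the key has to be set in the same terminal window that then starts the backend, for example:

set OPENAI_API_KEY=sk-your-real-key
uvicorn opendevin.server.listen:app --port 3000

or, in PowerShell:

$env:OPENAI_API_KEY="sk-your-real-key"
uvicorn opendevin.server.listen:app --port 3000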

@kamkode

kamkode commented Apr 2, 2024

Is there anyone here who can connect with me over email to help resolve my issue? Please share your email address with me at:

kamleshji.kk@gmail.com

@Jaitwo

Jaitwo commented Apr 2, 2024

(op) C:\Users\pc\Desktop\OpenDevin> uvicorn opendevin.server.listen:app --port 3000
Traceback (most recent call last):
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\asyncio\windows_events.py", line 434, in select
self._poll(timeout)
RuntimeError: <_overlapped.Overlapped object at 0x000002497669D740> still has pending operation at deallocation, the process may crash
Traceback (most recent call last):
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\asyncio\windows_events.py", line 434, in select
self._poll(timeout)
RuntimeError: <_overlapped.Overlapped object at 0x000002497669D740> still has pending operation at deallocation, the process may crash
Traceback (most recent call last):
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 86, in run_code
exec(code, run_globals)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\Scripts\uvicorn.exe_main
.py", line 7, in
sys.exit(main())
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\click\core.py", line 1157, in call
return self.main(*args, **kwargs)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\click\core.py", line 1078, in main
rv = self.invoke(ctx)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\click\core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\click\core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\main.py", line 409, in main
run(
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\main.py", line 575, in run
server.run()
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\server.py", line 65, in run
return asyncio.run(self.serve(sockets=sockets))
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\asyncio\runners.py", line 44, in run
return loop.run_until_complete(main)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\asyncio\base_events.py", line 641, in run_until_complete
return future.result()
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\server.py", line 69, in serve
await self._serve(sockets)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\server.py", line 76, in serve
config.load()
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\config.py", line 433, in load
self.loaded_app = import_from_string(self.app)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\importer.py", line 19, in import_from_string
module = importlib.import_module(module_str)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\importlib_init
.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "", line 1050, in _gcd_import
File "", line 1027, in _find_and_load
File "", line 1006, in _find_and_load_unlocked
File "", line 688, in load_unlocked
File "", line 883, in exec_module
File "", line 241, in call_with_frames_removed
File "C:\Users\pc\Desktop\OpenDevin\opendevin\server\listen.py", line 4, in
import agenthub # noqa F401 (we import this to get the agents registered)
File "C:\Users\pc\Desktop\OpenDevin\agenthub_init
.py", line 5, in
from . import monologue_agent # noqa: E402
File "C:\Users\pc\Desktop\OpenDevin\agenthub\monologue_agent_init
.py", line 2, in
from .agent import MonologueAgent
File "C:\Users\pc\Desktop\OpenDevin\agenthub\monologue_agent\agent.py", line 28, in
from agenthub.monologue_agent.utils.memory import LongTermMemory
File "C:\Users\pc\Desktop\OpenDevin\agenthub\monologue_agent\utils\memory.py", line 37, in
embed_model = HuggingFaceEmbedding(
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\llama_index\embeddings\huggingface\base.py", line 86, in init
self._model = SentenceTransformer(
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\sentence_transformers\SentenceTransformer.py", line 191, in init
modules = self._load_sbert_model(
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\sentence_transformers\SentenceTransformer.py", line 1246, in _load_sbert_model
module = module_class.load(module_path)
File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\site-packages\sentence_transformers\models\Pooling.py", line 227, in load
with open(os.path.join(input_path, "config.json")) as fIn:
FileNotFoundError: [Errno 2] No such file or directory: 'C:\Users\pc\AppData\Local\llama_index\models--BAAI--bge-small-en-v1.5\snapshots\5c38ec7c405ec4b44b94cc5a9bb96e735b38267a\1_Pooling\config.json'

@SlippyDong

The term 'conda' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

You need to have conda installed in order to run it. If you use venv for virtual environments you can run these instead in the directory where you have the project files:

python -m venv od        (or python3.10 -m venv od to pin the Python version)
od\Scripts\activate

@mgilank

mgilank commented Apr 2, 2024

(quoting @Jaitwo's traceback above, ending in: FileNotFoundError: [Errno 2] No such file or directory: 'C:\Users\pc\AppData\Local\llama_index\models--BAAI--bge-small-en-v1.5\snapshots\5c38ec7c405ec4b44b94cc5a9bb96e735b38267a\1_Pooling\config.json')

got this kinda same error in ubuntu
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/5c38ec7c405ec4b44b94cc5a9bb96e735b38267a/1_Pooling/config.json'
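A hedged workaround (the missing 1_Pooling/config.json usually means the model snapshot download was interrupted): delete the cached BAAI/bge-small-en-v1.5 folder and restart the backend so it re-downloads; the Linux form would be roughly:

rm -rf /tmp/llama_index/models--BAAI--bge-small-en-v1.5
uvicorn opendevin.server.listen:app --port 3000
ls /tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/*/1_Pooling/config.json   # should exist after re-download

On Windows, the equivalent folder is the one under %LOCALAPPDATA%\llama_index shown in the traceback above.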

@Jaitwo

Jaitwo commented Apr 2, 2024

The term 'conda' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

You need to have conda installed in order to run it. If you use venv for virtual environments you can run these instead in the directory where you have the project files:

python -m venv od        (or python3.10 -m venv od)
od\Scripts\activate

I did it and it still shows this:

(OpenDevin-PwRe2zua) (base) C:\Users\pc\OpenDevin>uvicorn opendevin.server.listen:app --port 3000
Traceback (most recent call last):
File "C:\Users\pc\Documents\Anaconda\Lib\asyncio\windows_events.py", line 444, in select
self._poll(timeout)
RuntimeError: <_overlapped.Overlapped object at 0x00000201325CF630> still has pending operation at deallocation, the process may crash
Traceback (most recent call last):
File "C:\Users\pc\Documents\Anaconda\Lib\asyncio\windows_events.py", line 444, in select
self._poll(timeout)
RuntimeError: <_overlapped.Overlapped object at 0x00000201325CF630> still has pending operation at deallocation, the process may crash
Traceback (most recent call last):
File "", line 198, in _run_module_as_main
File "", line 88, in run_code
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Scripts\uvicorn.exe_main
.py", line 7, in
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\click\core.py", line 1157, in call
return self.main(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\click\core.py", line 1078, in main
rv = self.invoke(ctx)
^^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\click\core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\click\core.py", line 783, in invoke
return __callback(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\uvicorn\main.py", line 409, in main
run(
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\uvicorn\main.py", line 575, in run
server.run()
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\uvicorn\server.py", line 65, in run
return asyncio.run(self.serve(sockets=sockets))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc\Documents\Anaconda\Lib\asyncio\runners.py", line 190, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "C:\Users\pc\Documents\Anaconda\Lib\asyncio\runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc\Documents\Anaconda\Lib\asyncio\base_events.py", line 653, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\uvicorn\server.py", line 69, in serve
await self._serve(sockets)
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\uvicorn\server.py", line 76, in serve
config.load()
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\uvicorn\config.py", line 433, in load
self.loaded_app = import_from_string(self.app)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\uvicorn\importer.py", line 19, in import_from_string
module = importlib.import_module(module_str)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc\Documents\Anaconda\Lib\importlib_init
.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "", line 1204, in _gcd_import
File "", line 1176, in _find_and_load
File "", line 1147, in _find_and_load_unlocked
File "", line 690, in load_unlocked
File "", line 940, in exec_module
File "", line 241, in call_with_frames_removed
File "C:\Users\pc\OpenDevin\opendevin\server\listen.py", line 4, in
import agenthub # noqa F401 (we import this to get the agents registered)
^^^^^^^^^^^^^^^
File "C:\Users\pc\OpenDevin\agenthub_init
.py", line 5, in
from . import monologue_agent # noqa: E402
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc\OpenDevin\agenthub\monologue_agent_init
.py", line 2, in
from .agent import MonologueAgent
File "C:\Users\pc\OpenDevin\agenthub\monologue_agent\agent.py", line 28, in
from agenthub.monologue_agent.utils.memory import LongTermMemory
File "C:\Users\pc\OpenDevin\agenthub\monologue_agent\utils\memory.py", line 37, in
embed_model = HuggingFaceEmbedding(
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\llama_index\embeddings\huggingface\base.py", line 86, in init
self._model = SentenceTransformer(
^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\sentence_transformers\SentenceTransformer.py", line 191, in init
modules = self._load_sbert_model(
^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\sentence_transformers\SentenceTransformer.py", line 1246, in _load_sbert_model
module = module_class.load(module_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\pc.virtualenvs\OpenDevin-PwRe2zua\Lib\site-packages\sentence_transformers\models\Pooling.py", line 227, in load
with open(os.path.join(input_path, "config.json")) as fIn:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: 'C:\Users\pc\AppData\Local\llama_index\models--BAAI--bge-small-en-v1.5\snapshots\5c38ec7c405ec4b44b94cc5a9bb96e735b38267a\1_Pooling\config.json'

@tellesus

tellesus commented Apr 2, 2024 via email

@orangejam72

Hi! Everything looks fine until I use it. Error: Oops. Something went wrong: 'NoneType' object has no attribute 'request'

I got most of the errors mentioned here yesterday. The new version looks to have fixed all of those issues, but now I'm getting "Oops. Something went wrong."
I got this error on Mac after running the latest version using make.

@efi20232

efi20232 commented Apr 3, 2024

I am getting:

"/opt/homebrew/Caskroom/miniconda/base/envs/myenv/bin/python" -m pip install -r requirements.txt

ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'

It cannot find requirements.txt.
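A hedged note: make sure the command is run from inside the cloned OpenDevin directory; and if the checkout has no requirements.txt at all, it can be generated from the Pipfile, as suggested earlier in this thread:

cd OpenDevin
python -m pipenv requirements > requirements.txt
python -m pip install -r requirements.txt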
