Erik Jacobs (thoraxe)

.error_wrappers.ValidationError: 1 validation error for ConversationChain
__root__
Got unexpected prompt input variables. The prompt expects ['input', 'original_query'], but got ['history'] as inputs from memory, and input as the normal input key. (type=value_error)
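This ValidationError is LangChain enforcing that a chain's prompt variables must be exactly covered by the memory's variables plus the chain's normal input key. A library-free sketch of that consistency check (`check_prompt_inputs` is a hypothetical helper written to mirror the error message, not LangChain's actual validator):

```python
def check_prompt_inputs(prompt_vars, memory_vars, input_key):
    # The prompt's variables must equal the memory variables
    # plus the chain's single normal input key.
    provided = set(memory_vars) | {input_key}
    if set(prompt_vars) != provided:
        raise ValueError(
            "Got unexpected prompt input variables. The prompt expects "
            f"{sorted(prompt_vars)}, but got {sorted(memory_vars)} as inputs "
            f"from memory, and {input_key} as the normal input key."
        )

# Reproduces the failure above: the prompt wants 'original_query',
# but neither the memory nor the input key supplies it.
try:
    check_prompt_inputs(["input", "original_query"], ["history"], "input")
except ValueError as exc:
    print(exc)

# Supplying 'history' instead (or renaming the prompt variable)
# makes the two sets line up, and the check passes silently.
check_prompt_inputs(["input", "history"], ["history"], "input")
```

The fix is therefore on either side of the equation: change the prompt's `input_variables`, or configure the memory so its `memory_key` matches what the prompt expects.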
[thoraxe:~/Red_Hat/ … /fastapi-lightspeed-service] [fastapi-ols-39] main(+20/-11)* 1 ± python routertest.py
> Entering new MultiPromptChain chain...
Traceback (most recent call last):
  File "/home/thoraxe/.pyenv/versions/3.9.16/envs/fastapi-ols-39/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/thoraxe/.pyenv/versions/3.9.16/envs/fastapi-ols-39/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/home/thoraxe/.pyenv/versions/3.9.16/envs/fastapi-ols-39/lib/python3.9/site-packages/fastapi/applications.py", line 292, in __call__
    await super().__call__(scope, receive, send)
  File "/home/thoraxe/.pyenv/versions/3.9.16/envs/fastapi-ols-39/lib/python3.9/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/thoraxe/.pyenv/versions/3.9.16/envs/fastapi-ols-39/lib/python3.9/site-packages/starlette/middleware/errors.py", line 184, in __call__
thoraxe / error (Last active October 5, 2023 16:37)
  File "/home/thoraxe/Red_Hat/openshift/llamaindex-experiments/fastapi-lightspeed-service/ols.py", line 4, in <module>
    from model_context import get_watsonx_predictor
  File "/home/thoraxe/Red_Hat/openshift/llamaindex-experiments/fastapi-lightspeed-service/model_context.py", line 3, in <module>
    from watsonx_langchain_wrapper import WatsonxLLM
  File "/home/thoraxe/Red_Hat/openshift/llamaindex-experiments/fastapi-lightspeed-service/watsonx_langchain_wrapper.py", line 17, in <module>
    class WatsonxLLM(LLM, BaseModel):
TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
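The TypeError here is Python's generic metaclass-conflict error: when two base classes are built on different, unrelated metaclasses, Python cannot derive a single metaclass for the subclass. A library-free reproduction of the same failure (in the gist's context, the likely fix is to inherit from `LLM` alone, since LangChain's `LLM` already derives from a pydantic model in the versions involved, which is an assumption about the installed versions):

```python
class MetaA(type):
    pass

class MetaB(type):
    pass

class A(metaclass=MetaA):
    pass

class B(metaclass=MetaB):
    pass

try:
    # Python must pick one metaclass that subclasses both MetaA and
    # MetaB; none exists, so class creation raises TypeError.
    class C(A, B):
        pass
except TypeError as exc:
    print(exc)
```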
import llama_index
from llama_index import (
    SimpleDirectoryReader,
    VectorStoreIndex,
    StorageContext,
    load_index_from_storage,
)
from llama_index.vector_stores import RedisVectorStore
from llama_index.tools import QueryEngineTool, ToolMetadata
from model_context import get_falcon_tgis_context
from llama_index import Prompt  # assumed import; the gist preview starts mid-file

def get_falcon_tgis_context(temperature, repetition_penalty):
    system_prompt = """
- You are a helpful AI assistant and provide the answer for the question based on the given context.
- You answer the question as truthfully as possible using the provided text, and if the answer is not contained within the text below, you say "I don't know".
"""

    ## This will wrap the default prompts that are internal to llama-index
    # query_wrapper_prompt = SimpleInputPrompt(">>QUESTION<<{query_str}\n>>ANSWER<<")
    query_wrapper_prompt = Prompt("[INST] {query_str} [/INST]")
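The active template uses Llama-2-style `[INST] … [/INST]` instruction markers, while the commented-out alternative appears to use Falcon-instruct-style `>>QUESTION<<` / `>>ANSWER<<` markers. Plain `str.format` shows what the wrapper produces before the text reaches the model (the sample question is invented):

```python
template = "[INST] {query_str} [/INST]"

# A hypothetical user query, formatted the way the query wrapper
# prompt would present it to the instruction-tuned model.
wrapped = template.format(query_str="How do I scale a deployment?")
print(wrapped)  # [INST] How do I scale a deployment? [/INST]
```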
thoraxe / deploy.sh (Created September 14, 2023 12:22)
ClearML on OpenShift Container Platform
#!/bin/bash
# create a project for the clearml solution
oc new-project clearml
# set the policies for the various service accounts
oc adm policy add-scc-to-user anyuid -z clearml-mongodb
oc adm policy add-scc-to-user anyuid -z clearml-redis
oc adm policy add-scc-to-user anyuid -z clearml-core
oc adm policy add-scc-to-user privileged -z clearml-elastic
Traceback (most recent call last):
  File "/home/thoraxe/.pyenv/versions/llamaindex-39/lib/python3.9/site-packages/gradio/routes.py", line 488, in run_predict
    output = await app.get_blocks().process_api(
  File "/home/thoraxe/.pyenv/versions/llamaindex-39/lib/python3.9/site-packages/gradio/blocks.py", line 1431, in process_api
    result = await self.call_function(
  File "/home/thoraxe/.pyenv/versions/llamaindex-39/lib/python3.9/site-packages/gradio/blocks.py", line 1103, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/home/thoraxe/.pyenv/versions/llamaindex-39/lib/python3.9/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/home/thoraxe/.pyenv/versions/llamaindex-39/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
version: "3.1"
intents:
- greet
- goodbye
- affirm
- deny
- mood_great
- mood_unhappy
- bot_challenge
SettingsRegistryMergeUtils: Optional config file "/home/thoraxe/O3DE/Projects/P230720-01/Cache/linux/user.cfg" not found.
System: Network layer initialized
System: AudioSystem created!
RHISystem: Initializing RHI...
RHISystem: Enumerated physical device: llvmpipe (LLVM 16.0.5, 256 bits)
RHISystem: Using physical device: llvmpipe (LLVM 16.0.5, 256 bits)
System:
==================================================================
System: Trace::Assert
/builddir/build/BUILD/o3de-2305.0/Gems/Atom/RHI/Vulkan/Code/Source/RHI/Vulkan.h(96): (139675541223552) 'void AZ::Vulkan::AssertSuccess(VkResult)'
Clicked system: THX-1138
Planet:OnMouseDown() (at Assets/Scripts/Planets/Planet.cs:34)
UnityEngine.SendMouseEvents:DoSendMouseEvents(Int32)
Object clicked: Planet1 (Planet)
UIManager:OnClickedPlanet(ClickedPlanetEvent) (at Assets/Scripts/UIManager.cs:50)
EventBus.EventBus`1:Raise(ClickedPlanetEvent) (at Assets/Plugins/EventBus/EventBus.cs:227)
Planet:OnMouseDown() (at Assets/Scripts/Planets/Planet.cs:36)
UnityEngine.SendMouseEvents:DoSendMouseEvents(Int32)
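The Unity log above traces an event-bus dispatch: the clicked `Planet` raises a `ClickedPlanetEvent`, which the `EventBus` delivers to `UIManager.OnClickedPlanet`. A minimal Python sketch of the same publish/subscribe shape (the type and event names are taken from the log; the implementation itself is a generic sketch, not the plugin's code):

```python
class EventBus:
    """Maps event types to lists of handler callables."""
    _subscribers = {}

    @classmethod
    def subscribe(cls, event_type, handler):
        cls._subscribers.setdefault(event_type, []).append(handler)

    @classmethod
    def raise_event(cls, event):
        # Dispatch to every handler registered for this event's type.
        for handler in cls._subscribers.get(type(event), []):
            handler(event)

class ClickedPlanetEvent:
    def __init__(self, planet_name):
        self.planet_name = planet_name

def on_clicked_planet(event):
    # Stands in for UIManager.OnClickedPlanet in the log.
    print(f"Object clicked: {event.planet_name} (Planet)")

EventBus.subscribe(ClickedPlanetEvent, on_clicked_planet)
EventBus.raise_event(ClickedPlanetEvent("Planet1"))
```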