Erik Jacobs (thoraxe)

thoraxe / lightspeed-catalogsource.yaml
Created June 24, 2024 18:27
OpenShift Lightspeed CatalogSource object
apiVersion: operators.coreos.com/v1alpha1
kind: CatalogSource
metadata:
  name: openshift-lightspeed-operator-catalog
  namespace: openshift-marketplace
spec:
  displayName: OpenShift Lightspeed
  sourceType: grpc
  image: quay.io/openshift-lightspeed/lightspeed-catalog:alpha-0.0.2
  updateStrategy:
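Creating this CatalogSource in the openshift-marketplace namespace is what makes the Lightspeed operator installable from OperatorHub. Below is a minimal sketch of applying it programmatically with the Python kubernetes client; the kubeconfig-based authentication and the manifest filename are assumptions for illustration.

import yaml
import kubernetes.client
import kubernetes.config

# Assumes a local kubeconfig with permission to create CatalogSources.
kubernetes.config.load_kube_config()

with open("lightspeed-catalogsource.yaml") as f:
    manifest = yaml.safe_load(f)

# CatalogSource is a custom resource, so it goes through CustomObjectsApi
# rather than one of the typed core APIs.
custom_api = kubernetes.client.CustomObjectsApi()
custom_api.create_namespaced_custom_object(
    group="operators.coreos.com",
    version="v1alpha1",
    namespace="openshift-marketplace",
    plural="catalogsources",
    body=manifest,
)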
TRACE LEVEL = 2
0.12s TraceLog.cpp 229: Command line: "+trace=2 +traceFlush"
0.14s main.cpp 480: RF2Log UTC=1da976ddcea9890 SteamUser=110000103d6fa75 LogGUID={AFC7D843-D62C-4C56-ACFB-78E0EBBD445E} Machine=ec94a74983d3dba0
0.14s DebugUtiliti 58: DebugUtilities::SetCurrentThreadName rFactor2 Main Thread : thread ID : 33860
0.14s main.cpp 863: Version=1.1134 Build=1134
0.14s main.cpp 864: FPU=0x0008001f
0.37s game.cpp 1350: Setting Mod Path "Packages" 1 0
1.59s game.cpp 1753: Entered Game::Enter()
1.59s osman.cpp 1037: Entered OSMan::Enter()
# https://docs.llamaindex.ai/en/stable/examples/vector_stores/MilvusIndexDemo.html
import os
import textwrap
# document indexing and embedding
from llama_index.core import Settings
from llama_index.core import StorageContext
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
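These imports follow the Milvus demo linked in the first comment. A minimal sketch of how they are typically wired together is below; the data directory, Milvus URI, collection name, and embedding dimension are assumptions, and MilvusVectorStore lives in the separate llama-index-vector-stores-milvus package.

from llama_index.vector_stores.milvus import MilvusVectorStore

# Assumes an embedding model is configured (e.g. OPENAI_API_KEY or Settings.embed_model).
documents = SimpleDirectoryReader("data/").load_data()

vector_store = MilvusVectorStore(
    uri="http://localhost:19530",  # assumed local Milvus endpoint
    collection_name="docs",        # assumed collection name
    dim=1536,                      # must match the embedding model's output dimension
)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

query_engine = index.as_query_engine()
response = query_engine.query("What does this document describe?")
print(textwrap.fill(str(response), 100))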
import os
from dotenv import load_dotenv
load_dotenv()
import kubernetes.client
import kubernetes.utils
configuration = kubernetes.client.Configuration()
# Configure API key authorization: BearerToken
configuration.api_key['authorization'] = os.getenv("K8S_TOKEN")
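A minimal sketch of how this Configuration is typically put to use; the K8S_HOST environment variable, the fallback API server URL, and the namespace queried are assumptions for illustration.

configuration.host = os.getenv("K8S_HOST", "https://api.cluster.example.com:6443")
configuration.api_key_prefix['authorization'] = 'Bearer'

with kubernetes.client.ApiClient(configuration) as api_client:
    core_v1 = kubernetes.client.CoreV1Api(api_client)
    # List pods in an assumed namespace to confirm the bearer token works.
    for pod in core_v1.list_namespaced_pod("openshift-lightspeed").items:
        print(pod.metadata.name)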
import os
# Import things that are needed generically
from langchain.agents import AgentType, initialize_agent
from langchain.tools import tool
import langchain
langchain.debug = True
from dotenv import load_dotenv
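A minimal sketch of how these pieces usually fit together: a @tool-decorated function handed to initialize_agent. The stubbed tool body, the ChatOpenAI model, and the OPENAI_API_KEY loaded via dotenv are assumptions, not part of the original snippet.

from langchain.chat_models import ChatOpenAI

load_dotenv()  # assumes OPENAI_API_KEY is set in .env

@tool
def get_pod_count(namespace: str) -> str:
    """Return the number of pods in the given namespace (stubbed for illustration)."""
    return f"There are 3 pods in {namespace}."

llm = ChatOpenAI(temperature=0)
agent = initialize_agent(
    [get_pod_count],
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)
agent.run("How many pods are running in the openshift-lightspeed namespace?")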
==================================== ERRORS ====================================
___________ ERROR collecting query_helpers/test_yes_no_classifier.py ___________
/opt/app-root/lib64/python3.11/site-packages/_pytest/runner.py:341: in from_call
    result: Optional[TResult] = func()
/opt/app-root/lib64/python3.11/site-packages/_pytest/runner.py:372: in <lambda>
    call = CallInfo.from_call(lambda: list(collector.collect()), "collect")
/opt/app-root/lib64/python3.11/site-packages/_pytest/python.py:531: in collect
    self._inject_setup_module_fixture()
/opt/app-root/lib64/python3.11/site-packages/_pytest/python.py:545: in _inject_setup_module_fixture
    self.obj, ("setUpModule", "setup_module")
========================================================================================= FAILURES =========================================================================================
__________________________________________________________________________________ test_yes_no_classifier __________________________________________________________________________________
yes_no_classifier = <src.query_helpers.yes_no_classifier.YesNoClassifier object at 0x7faa85532d90>, monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7faa860917d0>
def test_yes_no_classifier(yes_no_classifier, monkeypatch):
    monkeypatch.setattr(
        langchain.chains, "LLMChain", MockLLMChain(desired_response="9")
    )
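The test monkeypatches langchain.chains.LLMChain with a MockLLMChain helper that is not shown in this excerpt. A hedged sketch of what such a stand-in could look like follows, assuming the classifier constructs the chain and then calls .run() on it; the real helper in the repository may differ.

class MockLLMChain:
    """Hypothetical test double that replaces langchain.chains.LLMChain."""

    def __init__(self, desired_response: str):
        self.desired_response = desired_response

    def __call__(self, *args, **kwargs):
        # Invoked where the real LLMChain class would be constructed;
        # returning self lets this instance act as the chain object.
        return self

    def run(self, *args, **kwargs):
        # Whatever the prompt, hand back the canned classifier answer.
        return self.desired_response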
@app.post("/ols/raw_prompt")
@app.post("/base_llm_completion")
def base_llm_completion(llm_request: LLMRequest):
    """
    Raw pass through to backend LLM
    """
    conversation = get_suid()
    llm_response = LLMRequest(query=llm_request.query)
    llm_response.conversation_id = conversation
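A quick way to exercise this endpoint is FastAPI's TestClient; the module the app object is imported from and the request payload are assumptions here.

from fastapi.testclient import TestClient
from main import app  # assumed module path; adjust to wherever the FastAPI app is defined

client = TestClient(app)
resp = client.post("/base_llm_completion", json={"query": "How do I expose a service?"})
print(resp.status_code, resp.json())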
using Godot;
using System;
using System.CodeDom.Compiler;
using System.Collections.Generic;
using System.Linq;
[Tool]
public partial class dun_gen : Node3D
{
    // dungeon generator
Traceback (most recent call last):
  File "/home/thoraxe/.pyenv/versions/3.9.16/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/thoraxe/.pyenv/versions/3.9.16/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/thoraxe/Red_Hat/openshift/llamaindex-experiments/fastapi-lightspeed-service/modules/docs_summarizer.py", line 141, in <module>
    docs_summarizer.summarize(
  File "/home/thoraxe/Red_Hat/openshift/llamaindex-experiments/fastapi-lightspeed-service/modules/docs_summarizer.py", line 93, in summarize
    summary = query_engine.query(query)
  File "/home/thoraxe/.pyenv/versions/fastapi-ols-39/lib/python3.9/site-packages/llama_index/indices/query/base.py", line 31, in query