Meng Zhang (wsxiaoys) · 🏠 Proudly fully remote

wsxiaoys / default.md
Created July 9, 2025 15:38 — forked from cablej/default.md
Cluely System prompt

<core_identity> You are an assistant called Cluely, developed and created by Cluely, whose sole purpose is to analyze and solve problems asked by the user or shown on the screen. Your responses must be specific, accurate, and actionable. </core_identity>

<general_guidelines>

  • NEVER use meta-phrases (e.g., "let me help you", "I can see that").
  • NEVER summarize unless explicitly requested.
  • NEVER provide unsolicited advice.
  • NEVER refer to "screenshot" or "image" - refer to it as "the screen" if needed.
  • ALWAYS be specific, detailed, and accurate.
[
  {
    "role": "user",
    "content": "<workflow id=\"create-pr\" path=\".pochi/workflows/create-pr.md\">Please help create a PR for the current staging changes.\n\nHere are some guidelines to follow:\n- If there are no staging changes but there are uncommitted changes, please stage them first.\n- Do not create a PR in the main branch; always create a PR in the staging branch. Create a branch name based on the current git diff status.\n- Write a meaningful commit message/PR title.\n- Use the gh CLI to create a PR.\n- When running the push operation, it might be aborted due to a husky pre-push hook. In that case, amend the changes and run `git push` again.\n</workflow> ",
    "id": "iuFlJCA2JExbaGZb",
    "createdAt": "2025-05-26T06:19:54.384Z",
    "parts": [
      {
        "type": "text",
        "text": "<workflow id=\"create-pr\" path=\".pochi/workflows/create-pr.md\">Please help create a PR for the current staging changes.\n\nHere are some guidelines to follow:\n- If there are no staging changes but t
[
  {
    "role": "user",
    "content": [
      {
        "type": "text",
        "text": "Please recreate the UI shown in the attached screenshot as accurately as possible."
      },
      {
        "type": "image",

Use Bun over npm.

Use the following commands to initialize the project. They set up Next.js 15 + React 19, Tailwind v4, shadcn/ui, and other dependencies.

npx create-next-app@latest . --typescript --tailwind --eslint --ts --import-alias --turbopack --use-bun --src-dir --app --yes
npx shadcn@latest init --base-color zinc

Prefer shadcn/ui. If using shadcn/ui, note that the shadcn CLI has changed: the correct command to add a new component is `npx shadcn@latest add -y -o`, so make sure to use this command. When installing components, prefer installing them all at once rather than one by one.
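Once components are added this way, they are imported from the generated `@/components/ui` path. A minimal page sketch assuming Button and Card were installed via the command above (file path and markup are illustrative):

```tsx
// src/app/page.tsx: a small page built from shadcn/ui components
// (assumes `npx shadcn@latest add -y -o button card` has been run)
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";

export default function Home() {
  return (
    <Card className="m-8 max-w-sm">
      <CardHeader>
        <CardTitle>Hello</CardTitle>
      </CardHeader>
      <CardContent>
        <Button>Click me</Button>
      </CardContent>
    </Card>
  );
}
```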

// React Router v7 route module: `meta` supplies the document title and description tags.
import type { Route } from "./+types/home";
import { Welcome } from "../welcome/welcome"; // rendered by the route component (not shown in this preview)

export function meta({}: Route.MetaArgs) {
  return [
    { title: "New React Router App" },
    { name: "description", content: "Welcome to React Router!" },
  ];
}
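The snippet above ends before the route's component. In a React Router v7 route module, a default export typically follows `meta` and renders the imported `Welcome` component; a sketch of that remainder (assumed, not part of the original gist):

```tsx
// Assumed continuation of the route module shown above.
export default function Home() {
  return <Welcome />;
}
```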
[attempt_completion] Result:
The user has provided feedback on the results. Consider their input to continue the task, and then attempt completion again.
<feedback>
can you change the style to use emerald as main color?
</feedback>
<environment_details>
# VSCode Visible Files
../../../../../src/routes/integrations.tsx,../../../../../src/routes/integrations.tsx,src/routes/__root.tsx
# VSCode Open Tabs
```diff label=changes
diff --git a/crates/http-api-bindings/src/completion/llama.rs b/crates/http-api-bindings/src/completion/llama.rs
index 45858d9fb..ba9a2e6d3 100644
--- a/crates/http-api-bindings/src/completion/llama.rs
+++ b/crates/http-api-bindings/src/completion/llama.rs
@@ -44,17 +44,6 @@ struct CompletionResponseChunk {
 #[async_trait]
 impl CompletionStream for LlamaCppEngine {
-    async fn generate(&self, prompt: &str, options: CompletionOptions) -> BoxStream<String> {
```
wsxiaoys / prompt.md
Last active February 17, 2025 07:45
DevOps Engineer System Prompt

You are Roo, a highly skilled software engineer with extensive knowledge in many programming languages, frameworks, design patterns, and best practices.

====

TOOL USE

You have access to a set of tools that are executed upon the user's approval. You can use one tool per message, and will receive the result of that tool use in the user's response. You use tools step-by-step to accomplish a given task, with each tool use informed by the result of the previous tool use.
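As a rough illustration of this loop (one tool per assistant turn, each result fed back as the next user message), here is a schematic TypeScript sketch; `callModel` and `runTool` are hypothetical placeholders, not Roo's actual API:

```ts
// Schematic agent loop: the model proposes at most one tool use per turn,
// and the tool result becomes the next user message.
// `callModel` and `runTool` are hypothetical helpers, declared only for illustration.
type Message = { role: "user" | "assistant"; content: string };
type ToolCall = { name: string; params: Record<string, string> };

declare function callModel(history: Message[]): Promise<{ text: string; toolCall?: ToolCall }>;
declare function runTool(call: ToolCall): Promise<string>;

async function agentLoop(task: string): Promise<void> {
  const history: Message[] = [{ role: "user", content: task }];
  while (true) {
    const reply = await callModel(history);
    history.push({ role: "assistant", content: reply.text });
    if (!reply.toolCall) break;                       // no tool requested: task complete
    const result = await runTool(reply.toolCall);     // executed only after user approval
    history.push({ role: "user", content: result });  // result informs the next step
  }
}
```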

Tool Use Formatting

{
  "model": "CodeLlama-7B",
  "stream": true,
  "messages": [
    {
      "role": "user",
      "content": "Write a sample Typescript line code."
    }
  ]
}
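This is an OpenAI-style chat completions payload with streaming enabled. A minimal sketch of sending it from TypeScript; the endpoint URL is an assumption and should be replaced with your own server's address:

```ts
// Post the streaming chat request and print chunks as they arrive.
// The URL below is illustrative; point it at your OpenAI-compatible endpoint.
const response = await fetch("http://localhost:8080/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "CodeLlama-7B",
    stream: true,
    messages: [{ role: "user", content: "Write a sample Typescript line code." }],
  }),
});

// With `stream: true`, the body arrives as a stream of server-sent-event chunks.
const reader = response.body!.getReader();
const decoder = new TextDecoder();
while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  process.stdout.write(decoder.decode(value));
}
```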
wsxiaoys / config.toml
Created November 14, 2024 19:15
~/.tabby/config.toml
# Add the txt extension to code indexing; docs: https://tabby.tabbyml.com/docs/administration/code-completion/#additional-language
[[additional_languages]]
languages = ["txt"]
exts = ["txt"]
chunk_size = 1536