@andrewyu0
Last active February 26, 2024 17:29
obsidian-copilot

obsidian copilot

your notes = your lifelong ai companion + intelligence augmentation

COMMENTS VERY WELCOME! this is a first pass to put these ideas in one place

tldr - combine obsidian + openinterpreter to create a bespoke pkm copilot experience. if you follow the "file over app" philosophy, this combination can be your lifetime AI companion


my current workflow

  • tldr: navigate to the repo your vault is in, then 1) run interpreter, which is an llm in the cli, 2) interact with your md files from there, 3) e.g. organize them in bulk, 4) have a chat and save the output as md directly into the vault. i guess obsidian copilot, kinda? tweet. a rough sketch of this is below
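
a minimal sketch of that workflow (assumes `pip install open-interpreter` and the ~0.2.x python api; the cli equivalent is just running `interpreter` from inside the vault folder, and exact names/flags may differ by version):

```python
# sketch only: drive open-interpreter against the markdown files in an
# obsidian vault. assumes the ~0.2.x python api (`from interpreter import
# interpreter`); check the docs for your installed version.
from interpreter import interpreter

# 1) start from inside the folder/repo that holds your vault, so the model
#    can see and edit the .md files in the working directory
interpreter.chat(
    "list the .md files in this folder, group the untagged ones by topic, "
    "and suggest a subfolder for each"
)

# 2) save a chat result straight back into the vault as markdown
interpreter.chat(
    "write a short markdown summary of what we just discussed and save it "
    "as inbox/chat-summary.md"
)
```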

sources

meta point on pulling the sources together

  • why is it difficult to pull all these disparate sources together?
  • most of these thoughts / workflow ideas came from passive interactions on X
  • scattered across the X/twitter feature set (dms, threads, long posts), so the constraint on surfacing good ideas is the actual button clicking etc. better ideas / collaborations to come once X figures this out

next

will keep experimenting with workflows to see how far the 'file over app' philosophy can go with a terminal-based llm

brainstorming with chatgpt about workflows + experiment ideas

To effectively pressure test the Obsidian Copilot concept, let’s break down its potential strengths and limitations based on the integration of Obsidian with OpenInterpreter (OI) for executing LLM commands over markdown files within an Obsidian repository.

Where It Works:

Compatibility with Markdown: Given Obsidian's native support for markdown files, integrating OI to parse and interact with these files should be straightforward, enabling complex textual analysis and manipulation directly from the terminal.

Automation and Scripting: Users can automate repetitive tasks or complex workflows by scripting LLM commands through OI, enhancing productivity and enabling advanced data processing within Obsidian.
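
As a concrete illustration of the kind of batch operation this enables (plain Python, no OI required; the vault path and tag below are hypothetical):

```python
# illustrative bulk operation on a vault: append a #review tag to every
# note modified in the last week. vault path and tag are hypothetical.
import time
from pathlib import Path

VAULT = Path("~/notes").expanduser()   # assumed vault location
ONE_WEEK = 7 * 24 * 3600

for note in VAULT.rglob("*.md"):
    if time.time() - note.stat().st_mtime < ONE_WEEK:
        text = note.read_text(encoding="utf-8")
        if "#review" not in text:
            note.write_text(text + "\n#review\n", encoding="utf-8")
            print(f"tagged {note.relative_to(VAULT)}")
```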

Extensibility: The open nature of both platforms supports custom plugins or scripts, potentially allowing for highly customized integrations tailored to specific user needs.

Knowledge Management: Leveraging LLMs to summarize, categorize, and link ideas across a network of markdown files could significantly enhance the utility of Obsidian as a knowledge base.
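
One possible sketch of that knowledge-management loop, assuming open-interpreter's Python API (`interpreter.chat`); the note path and prompt wording are illustrative, not a prescribed workflow:

```python
# sketch: ask the LLM to summarize one note and propose [[wikilinks]] to
# related notes in the vault. file name and prompt wording are examples.
from pathlib import Path
from interpreter import interpreter

note = Path("projects/obsidian-copilot.md")  # hypothetical note

interpreter.chat(
    "Read the note below. Write a three-bullet summary, then list up to "
    "five existing notes in this vault it should link to as [[wikilinks]].\n\n"
    + note.read_text(encoding="utf-8")
)
```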

Where It Might Not Work:

Complex Setup: Users might find the setup process challenging, especially if they are not familiar with terminal commands or scripting, limiting accessibility to technically savvy users.

Performance Concerns: Running intensive LLM operations on large repositories might lead to performance issues, depending on the user's hardware and the efficiency of the integration.

Privacy and Security: Executing LLM commands over personal or sensitive data stored in Obsidian files could raise privacy concerns, especially if OI communicates with external servers to process data.

Limited by LLM Capabilities: The effectiveness of this integration heavily depends on the capabilities and limitations of the underlying LLM. Inaccuracies or biases in the model could impact the utility of the generated outputs.

Skeleton for Article Section

Introduction

Briefly introduce Obsidian and OI, emphasizing the "file over app" philosophy. Present the concept of Obsidian Copilot as a bridge between these two, leveraging LLM capabilities within Obsidian's markdown ecosystem.

Potential Benefits

Detail how this integration enhances knowledge management, automation, and extensibility within Obsidian. Use cases that demonstrate practical applications, like automated summarization, linking ideas, or custom workflow scripts.

Challenges and Limitations

Discuss the technical barriers, such as setup complexity and potential performance issues. Address privacy concerns and limitations imposed by LLM accuracy and biases.

Conclusion

Summarize the potential impact of Obsidian Copilot on users' workflows. Highlight areas for further development or research to overcome identified limitations.

This structure aims to provide a comprehensive overview of the Obsidian Copilot concept, balancing its innovative potential against practical considerations for implementation.

User

Let's focus on specific workflows. There are two in particular, and then I want you to generate a couple more. The first is general file management and familiarity with the terminal; a lot of people aren't familiar with the terminal, so the point is ramping up quickly. For number two, as you learn with OI, you can actually output Markdown directly into Obsidian, which aligns with the lifelong learning concept. Pressure test these two and do it technically. Don't make anything up. Only use what's actually in the feature set.

ChatGPT

Evaluating the specific workflows within the context of Obsidian Copilot, focusing on technical feasibility and the feature set of both Obsidian and OpenInterpreter (OI), provides a basis for understanding potential advantages and challenges.

Workflow 1: General File Management and Terminal Familiarity

Advantages:

Direct File Manipulation: Leveraging terminal commands allows for efficient file operations (e.g., renaming, moving, deletion) directly on markdown files, aligning with Obsidian's file-centric approach.

Scripting and Automation: Users can script common tasks, enhancing productivity. This includes batch operations on files, such as bulk tagging or linking.

Technical Challenges:

Learning Curve: The terminal's command-line interface (CLI) can be intimidating, requiring users to learn specific commands and scripting basics.

Integration Friction: Seamless integration requires custom scripts or OI commands to manipulate files in ways that align with Obsidian's linking and tagging features without causing data loss or corruption.
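
One way to reduce that integration friction is to make bulk file operations link-aware. A minimal sketch (plain Python; the vault path and note names are hypothetical):

```python
# sketch of a link-aware rename: before renaming a note, find every note
# that references it so [[wikilinks]] don't silently break.
from pathlib import Path

VAULT = Path("~/notes").expanduser()   # assumed vault location
old, new = "Daily Log", "Journal"      # hypothetical note names

backlinks = [
    p for p in VAULT.rglob("*.md")
    if f"[[{old}" in p.read_text(encoding="utf-8")
]
print(f"{len(backlinks)} notes link to [[{old}]]; update them before renaming:")
for p in backlinks:
    print(" -", p.relative_to(VAULT))

# only after updating those links would you actually rename:
# (VAULT / f"{old}.md").rename(VAULT / f"{new}.md")
```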

Workflow 2: Markdown Output for Lifelong Learning

Advantages:

Content Generation: Using OI to directly generate markdown content (e.g., summaries, notes) that can be saved into Obsidian enhances the note-taking process, making it more dynamic and rich.

Contextual Learning: Executing LLM queries to understand concepts, then outputting explanations or additional resources directly into Obsidian supports a continuous learning loop.

Technical Challenges:

Formatting Consistency: Ensuring the markdown output from OI is consistently formatted for Obsidian's parser can be tricky, especially for complex notes with links, images, or code blocks.

Content Management: Automatically generated content needs to be correctly filed and linked within the existing Obsidian vault structure to maintain an organized knowledge base.
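
One way to keep formatting consistent is to wrap whatever text the model returns in a fixed note template before writing it into the vault. A sketch (the frontmatter fields, folder, and function name are illustrative choices, not part of Obsidian or OI):

```python
# sketch: write LLM output into the vault as a well-formed note with YAML
# frontmatter so obsidian can index it. all names here are examples.
from datetime import date
from pathlib import Path

def save_note(vault: Path, title: str, body: str, tags: list[str]) -> Path:
    frontmatter = (
        "---\n"
        f"title: {title}\n"
        f"date: {date.today().isoformat()}\n"
        f"tags: [{', '.join(tags)}]\n"
        "---\n\n"
    )
    path = vault / "inbox" / f"{title}.md"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(frontmatter + body.strip() + "\n", encoding="utf-8")
    return path

# body would be whatever text the LLM call returned
save_note(Path("~/notes").expanduser(), "transformers-primer",
          "- attention lets the model weigh context...\n- ...",
          ["learning", "ml"])
```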

Additional Workflow Ideas:

Automated Knowledge Synthesis:

Advantage: Use OI to automatically summarize and interlink related notes within the Obsidian vault, creating a synthesized view of topics or projects.

Challenge: Developing an algorithm that accurately determines note relevance and creates meaningful links without manual oversight.

Intelligent Research Assistant:

Advantage: Implementing a workflow where users can ask OI to research topics, pulling information from predefined sources or datasets, and then formatting this research into markdown notes.

Challenge: Limiting the scope of research to ensure accuracy and relevance; managing the data sources OI can access without internet connectivity, given privacy concerns.

Technical Pressure Test Summary: The success of implementing these workflows hinges on the user's technical proficiency, especially concerning CLI use and scripting. Integration between Obsidian and OI must ensure data integrity and enhance the note-taking process without introducing additional management overhead. Addressing the technical challenges requires a careful balance of leveraging OI's capabilities while maintaining Obsidian's user-friendly and flexible environment.
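
As a rough sketch of the "automated knowledge synthesis" idea, here is one way to score note similarity and suggest candidate links; TF-IDF via scikit-learn is an arbitrary choice for illustration, not something Obsidian or OI ships with:

```python
# sketch: suggest [[links]] between notes by TF-IDF cosine similarity.
# vault path and the 0.3 threshold are arbitrary illustrative choices.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

VAULT = Path("~/notes").expanduser()
paths = sorted(VAULT.rglob("*.md"))
texts = [p.read_text(encoding="utf-8") for p in paths]

sim = cosine_similarity(TfidfVectorizer(stop_words="english").fit_transform(texts))

for i, p in enumerate(paths):
    others = [k for k in range(len(paths)) if k != i]
    if not others:
        break
    j = max(others, key=lambda k: sim[i, k])       # closest other note
    if sim[i, j] > 0.3:
        print(f"{p.stem} -> consider linking [[{paths[j].stem}]] ({sim[i, j]:.2f})")
```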

@andrewyu0

adding context

@granawkins

I wasn't familiar with 'file over app' so here's what I found for anyone else:

File over app is a philosophy: if you want to create digital artifacts that last, they must be files you can control, in formats that are easy to retrieve and read. Use tools that give you this freedom.

File over app is an appeal to tool makers: accept that all software is ephemeral, and give people ownership over their data.

https://stephango.com/file-over-app

@threefoldo

I'm thinking about a similar workflow on Logseq, which is open source and more flexible. My idea is to extract all my notes from the last ten years, turn them into markdown files, add tags with an LLM, and encode embeddings. Then I can both explore the network of tags and find similar text by semantic meaning. I tried this idea with Mem; it's good, but they don't provide developer support for customization, and the built-in functionality is too limited, only showing a few similar notes for the selected note.

@andrewyu0

thank you @granawkins @threefoldo 🙏

@andrewyu0 commented Feb 14, 2024

yea im team "unlinked refs" at the moment (still tbd, testing hypotheses) - after years of manual tagging etc, i started to just leave things unstructured, and with new llm tools it's becoming more straightforward to either interact with the raw text semantically, or to very quickly (1-2 operations) get to the needed data as if it had been structured/tagged

leads me to believe that for our personal-use tools / pkm (since we dont have to think too much about scalability) we can be even more relaxed and unstructured, and focus more on the content vs. the tooling, now that the pkm/llm combo takes care of that for us (similar to this 2020 idea i had with roam research, "roamie homie" - it starts feeling more like a friend, companion, thinking partner)
