In this piece, we'll outline the steps to construct a plugin system in Rust that executes asynchronous JavaScript/TypeScript tasks within a V8 environment powered by the rusty_v8
library. We'll divide the implementation into smaller manageable functions, maintaining similar levels of abstraction for readability and simplicity.
The system revolves around several struct types: `Task`, `Request`, `Plugin`, and `PluginMetaData`.

- A `Task` is a boxed closure that performs an operation in V8's isolated runtime context.
- A `Request` encapsulates a `Task` and a one-shot channel for communicating its result asynchronously.
- `Plugin` is in charge of the task-dispatch part of the system. It holds a sender for requests and a handle for the executing thread, making it central to executing tasks in the V8 environment.
- `PluginMetaData` holds information about a plugin: an `id`, `name`, `description`, an optional `repo` URL, and a list of `language_servers`.
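Before wiring in V8, the dispatch shape itself (a worker thread draining a channel of requests, where each request carries its own response channel) can be sketched with nothing but the standard library. The `spawn_worker` and `run_task` names below are simplified, hypothetical stand-ins for the article's types: tasks return plain `String`s instead of driving a V8 isolate.

```rust
use std::sync::mpsc;
use std::thread;

// Simplified stand-in for the article's `Task`: a boxed one-shot closure.
type Task = Box<dyn FnOnce() -> String + Send + 'static>;

// Simplified stand-in for the article's `Request`: a task plus a channel on
// which the worker sends the task's result back.
struct Request {
    task: Task,
    response: mpsc::Sender<String>,
}

// Spawn a worker thread that executes tasks until all senders are dropped,
// and return the sender used to dispatch requests to it.
fn spawn_worker() -> mpsc::Sender<Request> {
    let (request_sender, request_receiver) = mpsc::channel::<Request>();
    thread::spawn(move || {
        for request in request_receiver {
            let result = (request.task)();
            // If the caller has gone away, dropping the result is fine.
            let _ = request.response.send(result);
        }
    });
    request_sender
}

// Dispatch one task and block until the worker replies.
fn run_task(sender: &mpsc::Sender<Request>, task: Task) -> String {
    let (response_s, response_r) = mpsc::channel();
    sender
        .send(Request { task, response: response_s })
        .expect("worker thread is gone");
    response_r.recv().expect("worker dropped the response")
}

fn main() {
    let sender = spawn_worker();
    let result = run_task(&sender, Box::new(|| "hello from the worker".to_string()));
    assert_eq!(result, "hello from the worker");
}
```

The real implementation below replaces the blocking `mpsc` pair with `futures` channels so callers can `await` the result instead of blocking.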
First, we'll define the core types. For tasks, we'll create a boxed, dynamically dispatched function that operates on the V8 isolate. For `Plugin`, we store a handle to the thread running our main V8 loop and an unbounded sender for the tasks we want to run on V8. The `Plugin` struct also keeps a reference to the script path, an important attribute that we'll later use during plugin initialization. We'll go into the details of `Plugin`'s functions soon.
```rust
use anyhow::{Result, anyhow};
use futures::channel::{mpsc, oneshot};
use futures::FutureExt; // for `.boxed()`
use futures::{StreamExt, Future};
use serde::Deserialize; // `toml::from_str` needs a `Deserialize` impl
use smol::fs; // any async fs works here (e.g. `tokio::fs` or `async_std::fs`)
use std::collections::HashMap;
use std::path::{Path, PathBuf};
use std::pin::Pin;
use std::sync::Arc;
use std::thread::JoinHandle;
use tree_sitter::Parser;
use rusty_v8 as v8;
use toml;

// A task runs on the V8 worker thread and produces the plugin's result,
// which the worker sends back over the request's one-shot channel.
type Task = Box<dyn FnOnce(&mut v8::HandleScope) -> PluginResult + Send + 'static>;
type PluginResult = Result<Arc<str>>;
type ExecutionResult = Result<()>;

pub struct Plugin {
    thread_handle: JoinHandle<ExecutionResult>,
    request_sender: mpsc::UnboundedSender<Request>,
    script_path: Arc<Path>,
    metadata: PluginMetaData, // parsed from the TOML comment at the top of the script
}

struct Request {
    task: Task,
    response: oneshot::Sender<PluginResult>,
}

#[derive(Default, Deserialize)]
pub struct PluginMetaData {
    id: String,
    name: String,
    description: String,
    repo: Option<String>,
    language_servers: Vec<String>,
}
```
As part of the `Plugin` struct, we include an implementation of our task-processing method, `update`. The function accepts a task (a `FnOnce` closure that performs an operation on the V8 isolate), pushes it to the worker thread via a sender, and returns a `Future` that resolves once the task is complete.
```rust
impl Plugin {
    pub fn update(&self, task: Task) -> Pin<Box<dyn Future<Output = PluginResult> + Send>> {
        let (response_s, response_r) = oneshot::channel::<PluginResult>();
        let request = Request { task, response: response_s };
        // Clone the sender so the returned future is `'static` and doesn't borrow `self`.
        let request_sender = self.request_sender.clone();
        Box::pin(async move {
            match request_sender.unbounded_send(request) {
                // A cancelled `oneshot` means the worker dropped our response channel.
                Ok(_) => response_r
                    .await
                    .unwrap_or_else(|_| Err(anyhow!("Worker has been dropped"))),
                Err(_) => Err(anyhow!("Worker thread panicked")),
            }
        })
    }
}
```
In our plugin infrastructure, we'll spawn and manage a dedicated thread for our V8 `Isolate` instance. Requests arrive over the channel created in `new`, and each task is executed within the isolate's context.
```rust
impl Plugin {
    pub fn new(script_path: Arc<Path>) -> Result<Self> {
        let (request_sender, mut request_receiver) = mpsc::unbounded::<Request>();
        let thread_handle = std::thread::spawn(move || {
            // We assume the V8 platform was initialized once at process startup
            // (`v8::V8::initialize_platform` + `v8::V8::initialize`).
            let mut isolate = v8::Isolate::new(v8::CreateParams::default());
            // The isolate is only ever touched from this thread, so no locking is needed.
            let mut handle_scope = v8::HandleScope::new(&mut isolate);
            let context = v8::Context::new(&mut handle_scope);
            let mut try_catch = v8::TryCatch::new(&mut handle_scope);
            loop {
                // `next()` returns a future, so block this dedicated thread on it.
                match futures::executor::block_on(request_receiver.next()) {
                    None => break, // All plugins have been dropped and our work here is done
                    Some(request) => {
                        let mut context_scope = v8::ContextScope::new(&mut try_catch, context);
                        // Run the task and send its result back; if the caller has
                        // gone away, dropping the result is fine.
                        let result = (request.task)(&mut context_scope);
                        let _ = request.response.send(result);
                    }
                }
            }
            Ok(()) // TODO: surface V8 errors caught by `try_catch`
        });
        Ok(Self {
            thread_handle,
            request_sender,
            script_path,
            metadata: PluginMetaData::default(), // filled in by `load_from_script`
        })
    }
}
```
To parse JavaScript (or TypeScript) scripts and extract metadata (in TOML format) from a comment at the top of the script file, we will use the parsing library `tree-sitter` and the `tree-sitter-javascript` grammar.
```rust
impl Plugin {
    pub async fn load_from_script(script_path: PathBuf) -> Result<Self> {
        // Load the script
        let script_data = fs::read_to_string(&script_path).await?;
        // Parse the plugin script to an AST
        let mut parser = Parser::new();
        parser.set_language(tree_sitter_javascript::language())?;
        let tree = parser
            .parse(&script_data, None)
            .ok_or_else(|| anyhow!("Failed to parse plugin script"))?;
        // Extract the first comment, which is supposed to contain TOML
        let metainfo_comment = extract_first_comment(&tree, &script_data)
            .ok_or_else(|| anyhow!("Plugin script is missing its metadata comment"))?;
        // TODO: strip the comment delimiters before parsing
        let metadata: PluginMetaData = toml::from_str(&metainfo_comment)?;
        let mut plugin = Self::new(Arc::from(script_path))?;
        plugin.metadata = metadata;
        Ok(plugin)
    }
}
```
Here is a simple function that extracts the first comment from the tree. Tree-sitter nodes only store byte ranges into the source, so the function also takes the source text in order to recover the comment's contents.

```rust
fn extract_first_comment(tree: &tree_sitter::Tree, source: &str) -> Option<String> {
    let root_node = tree.root_node();
    let mut cursor = root_node.walk();
    // Look for the first top-level comment node.
    let first_comment_node = root_node
        .children(&mut cursor)
        .find(|node| node.kind() == "comment")?;
    first_comment_node
        .utf8_text(source.as_bytes())
        .ok()
        .map(|text| text.to_string())
}
```
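One detail worth noting: the comment text that tree-sitter hands back still includes the comment delimiters themselves, which `toml::from_str` won't accept. A small std-only helper (a hypothetical sketch, not part of the code above) can strip them first:

```rust
// Strip JavaScript comment delimiters so the remaining text is plain TOML.
// Handles both a `/* ... */` block comment and `//`-prefixed lines.
fn strip_comment_delimiters(comment: &str) -> String {
    let trimmed = comment.trim();
    // Block comment: drop the surrounding `/*` and `*/`.
    if let Some(block) = trimmed
        .strip_prefix("/*")
        .and_then(|rest| rest.strip_suffix("*/"))
    {
        return block.trim().to_string();
    }
    // Line comments: drop the leading `//` from every line.
    trimmed
        .lines()
        .map(|line| line.trim().trim_start_matches("//").trim())
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let comment = "// id = \"my-plugin\"\n// name = \"My Plugin\"";
    assert_eq!(
        strip_comment_delimiters(comment),
        "id = \"my-plugin\"\nname = \"My Plugin\""
    );
}
```

Also keep in mind that the JavaScript grammar typically treats each `//` line as a separate comment node, so multi-line metadata is most conveniently written as a single `/* ... */` block comment.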
We now have a plugin script parsed and its metadata loaded. We can now focus on calling the `cachedServerBinary` method from the plugin's JavaScript code for each language server.

First, we have a utility function that constructs an appropriate `FnOnce` task and sends it to `update`. This `FnOnce` is designed to call the V8 function `cachedServerBinary` in the plugin.
```rust
impl Plugin {
    pub fn cached_server_binary(
        &self,
        lang_server: &str,
    ) -> Pin<Box<dyn Future<Output = PluginResult> + Send>> {
        // Own the name so the task can be sent to the worker thread.
        let lang_server = lang_server.to_owned();
        let task: Task = Box::new(move |scope: &mut v8::HandleScope| -> PluginResult {
            let context = scope.get_current_context();
            let global = context.global(scope);
            let lang_server_key = v8::String::new(scope, &lang_server)
                .ok_or_else(|| anyhow!("Failed to create string for Language Server"))?;
            let lang_server_obj = global
                .get(scope, lang_server_key.into())
                .ok_or_else(|| anyhow!("{} not found", lang_server))?;
            let service_object = lang_server_obj
                .to_object(scope)
                .ok_or_else(|| anyhow!("{} does not point to an object", lang_server))?;
            let func_key = v8::String::new(scope, "cachedServerBinary")
                .ok_or_else(|| anyhow!("Failed to create string for Function Name"))?;
            let func = service_object
                .get(scope, func_key.into())
                .ok_or_else(|| anyhow!("'cachedServerBinary' not found on {}", lang_server))?;
            let function = v8::Local::<v8::Function>::try_from(func)
                .map_err(|_| anyhow!("'cachedServerBinary' is not a function on {}", lang_server))?;
            // We assume `cachedServerBinary` takes no arguments; `call` returns
            // `None` if the function threw.
            let v8_result = function
                .call(scope, service_object.into(), &[])
                .ok_or_else(|| anyhow!("'cachedServerBinary' threw an exception"))?;
            Ok(Arc::from(v8_result.to_rust_string_lossy(scope)))
        });
        self.update(task)
    }
}
```
We now have a way to call the `cachedServerBinary` method that a plugin's JavaScript code defines for a given language server.

Finally, our main entry point, `cached_server_binaries`, will execute `cached_server_binary` for all language servers listed in the plugin metadata concurrently:
```rust
impl Plugin {
    pub fn cached_server_binaries(
        &self,
    ) -> Pin<Box<dyn Future<Output = HashMap<Arc<str>, PluginResult>> + Send>> {
        let tasks = self
            .metadata
            .language_servers
            .iter()
            .map(|lang_server| {
                let lang_id = Arc::<str>::from(lang_server.as_str());
                let binary_task = self.cached_server_binary(lang_server);
                async move {
                    let binary_result = binary_task.await;
                    (lang_id, binary_result)
                }
            })
            .collect::<Vec<_>>();
        // Run all lookups concurrently; per-server failures are already
        // captured in each `PluginResult`, so the map itself can't fail.
        async move {
            futures::future::join_all(tasks)
                .await
                .into_iter()
                .collect::<HashMap<_, _>>()
        }
        .boxed()
    }
}
```
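For intuition, the fan-out/collect shape of `cached_server_binaries` can be mirrored with plain threads and the standard library. The `lookup_all` function and its `binary-for-...` results below are purely illustrative stand-ins for the per-server V8 calls:

```rust
use std::collections::HashMap;
use std::thread;

// Std-only sketch of the fan-out: start one lookup per language server,
// then collect each (id, result) pair into a map, as `join_all` does above.
fn lookup_all(language_servers: &[&str]) -> HashMap<String, String> {
    let handles: Vec<_> = language_servers
        .iter()
        .map(|&name| {
            let id = name.to_string();
            // Hypothetical stand-in for the per-server V8 call.
            thread::spawn(move || (id.clone(), format!("binary-for-{id}")))
        })
        .collect();
    handles
        .into_iter()
        .map(|handle| handle.join().expect("lookup thread panicked"))
        .collect()
}

fn main() {
    let map = lookup_all(&["rust-analyzer", "gopls"]);
    assert_eq!(map["rust-analyzer"], "binary-for-rust-analyzer");
    assert_eq!(map.len(), 2);
}
```

The futures version keeps the same structure but runs every lookup on the single V8 worker thread, serialized through the request channel rather than on separate OS threads.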
That sums up our sketch of a Rust-based plugin system that executes JavaScript/TypeScript tasks in V8. To take it toward production quality, remember to handle every error diligently and follow good Rust practices, especially around concurrent and async programming. If you have questions or need further refinements, feel free to reach out. Happy coding!