@slavafomin
Last active November 26, 2021 19:56
import { Transformer } from 'grammy/out/core/client';
import { Cache } from './cache';

export function createCachingTransformer(options?: {
  cacheTtl?: number;
}): Transformer {

  const { cacheTtl = 5 * 1000 } = (options || {});

  type Response = Promise<any>;

  const cache = new Cache<Response>({
    defaultTtl: cacheTtl,
  });

  // Removing expired responses from the cache
  setInterval(() => cache.evictExpired(), cacheTtl);

  // Transformer function
  return async (prev, method, payload) => {
    if (
      method === 'answerCallbackQuery' &&
      'callback_query_id' in payload
    ) {
      const queryId = payload.callback_query_id;
      const cacheKey = `${method}/${queryId}`;

      // Ignoring duplicate requests
      if (cache.has(cacheKey)) {
        console.debug('Repeating redundant request', { method, queryId });
        return cache.get(cacheKey);
      }

      console.debug('Allowing request', { method, queryId });

      const response = prev(method, payload);

      console.debug('Adding request to cache', { method, queryId });

      // Saving pending response to resolve
      // duplicate requests later on
      cache.add({ key: cacheKey, value: response });

      // Returning response back to the pipeline
      return response;
    }

    // Letting other requests be processed as-is
    return prev(method, payload);
  };

}
interface CacheOptions {
  defaultTtl?: number;
}

type CacheKey = string;

interface CacheItem<ValueType> {
  value: ValueType;
  freshTill: number;
}

export class Cache<ValueType> {

  private cache = new Map<
    CacheKey,
    CacheItem<ValueType>
  >();

  constructor(private readonly options: CacheOptions) {
  }

  public add(options: {
    key: CacheKey;
    value: ValueType;
    ttl?: number;
  }) {
    const { key, value } = options;
    const { defaultTtl } = this.options;
    this.cache.set(key, {
      value,
      freshTill: Date.now() + (options.ttl || defaultTtl),
    });
  }

  public has(key: CacheKey) {
    return this.cache.has(key);
  }

  public get(key: CacheKey) {
    return this.cache.get(key)?.value;
  }

  public evictExpired() {
    const now = Date.now();
    this.cache.forEach((item, key) => {
      if (item.freshTill < now) {
        this.cache.delete(key);
        console.debug(`Item evicted: ${key}`);
      }
    });
  }

}
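To wire this up, the transformer would typically be registered on the bot-wide API config. A minimal sketch (the import path, file name, and token are placeholders, assuming a grammY Bot instance):

import { Bot } from 'grammy';
// Hypothetical file name for the transformer defined above
import { createCachingTransformer } from './caching-transformer';

const bot = new Bot('<bot-token>'); // placeholder token

// Install the caching transformer so that duplicate answerCallbackQuery
// calls made within the TTL window reuse the cached (pending) response.
bot.api.config.use(createCachingTransformer({ cacheTtl: 5_000 }));

With the transformer installed, repeated calls resolve to the same response: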
const response1 = await context.answerCallbackQuery({
  text: 'Hello!',
});

const response2 = await context.answerCallbackQuery();

if (response1 !== response2) {
  throw new Error(`Response must be cached!`);
}
KnorpelSenf commented Nov 26, 2021

I don't think you need to roll an entire cache implementation just for this simple task. I would reduce all of this to:

// coded here, so not tested
bot.on('callback_query', (ctx, next) => {
  let answered = false
  ctx.api.config.use((prev, method, payload, signal) => {
    if (method === 'answerCallbackQuery' &&
          'callback_query_id' in payload &&
          payload.callback_query_id === ctx.callbackQuery.id) {
      if (answered) return { ok: true, result: true }
      else answered = true
    }
    return prev(method, payload, signal)
  })
  return next()
})

That does the same thing, and it is a little more memory-efficient under high load because the garbage collector can free the memory as soon as the update is done processing.

The disadvantage is obviously that my implementation doesn't cache across updates, but I consider it a highly unlikely case that anyone would answer the callback query delivered in one update while handling an unrelated one.

@wojpawlik

This middleware could even call ctx.answerCallbackQuery() if it wasn't called before.

Also, the 'callback_query_id' in payload check is redundant.

KnorpelSenf commented Nov 26, 2021

I like the idea of answering the callback query automatically after next() resolves, if it was not done by downstream middleware.

The in check is not required at runtime, but the code does not compile without it; see microsoft/TypeScript#1260.
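For illustration, here is a minimal sketch (with hypothetical payload types, not grammY's actual ones) of why the guard is needed: payload is a union of per-method payload types, and the in check is what lets TypeScript narrow it before callback_query_id is accessed.

type AnswerCallbackQueryPayload = { callback_query_id: string; text?: string };
type SendMessagePayload = { chat_id: number; text: string };
type AnyPayload = AnswerCallbackQueryPayload | SendMessagePayload;

function inspect(method: string, payload: AnyPayload) {
  // const id = payload.callback_query_id; // does not compile: property missing on the union
  if (method === 'answerCallbackQuery' && 'callback_query_id' in payload) {
    const id = payload.callback_query_id; // narrowed to AnswerCallbackQueryPayload
    console.debug(id);
  }
}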

@slavafomin (Author)

Hmm, @KnorpelSenf wouldn't bot.on('callback_query', ctx => ctx.api.config.use()) introduce a new transformer function with each request, leading to a memory leak? I would expect it to be config.once() or something.

@slavafomin (Author)

I'm withdrawing my last question; I should read the documentation better. However, I still believe that this method's name (ctx.api.config.use) should reflect the fact that this transformer is update-bound (temporary), for better readability and clarity.

slavafomin commented Nov 26, 2021

By the way, this part should be:

if (answered) return { ok: true, result: true }

@KnorpelSenf

The renaming is not possible because ctx.api is an instance of exactly the same class as bot.api. It is important that it stays that way. We do not want to have two separate classes that both do the same thing as Api, just with different names.

Thanks for the correction; I updated the comment.

@KnorpelSenf

Including the automatic callback query answering suggested by @wojpawlik, this would look like the following:

bot.on('callback_query', async (ctx, next) => {
  let answered = false
  ctx.api.config.use((prev, method, payload, signal) => {
    if (method === 'answerCallbackQuery' &&
          'callback_query_id' in payload &&
          payload.callback_query_id === ctx.callbackQuery.id) {
      if (answered) return { ok: true, result: true }
      else answered = true
    }
    return prev(method, payload, signal)
  })
  await next()
  if (!answered) await ctx.answerCallbackQuery()
})
