
@johnfkraus
Last active February 8, 2025 22:48
Falcon LLM is susceptible to the conjunction fallacy.

Here are a couple of screenshots of AI hallucinations I induced in AWS SageMaker using this model: "huggingface-llm-falcon-7b-instruct-bf16".
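For context, here is a minimal sketch of how such a prompt could be sent to the deployed model. The endpoint name is a placeholder (an assumption, not from the original gist); the payload shape follows the JSON format used by Hugging Face LLM containers on SageMaker.

```python
import json

# Hypothetical endpoint name -- substitute the endpoint created when deploying
# "huggingface-llm-falcon-7b-instruct-bf16" via SageMaker JumpStart.
ENDPOINT_NAME = "falcon-7b-instruct-endpoint"

def build_payload(prompt: str, max_new_tokens: int = 128) -> str:
    """Build the JSON payload for a Hugging Face LLM inference container."""
    return json.dumps({
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens, "do_sample": False},
    })

# Invoking the endpoint would look roughly like this (requires AWS credentials
# and a live endpoint, so it is left commented out):
# import boto3
# runtime = boto3.client("sagemaker-runtime")
# response = runtime.invoke_endpoint(
#     EndpointName=ENDPOINT_NAME,
#     ContentType="application/json",
#     Body=build_payload("Linda is 31, single, outspoken, ..."),
# )
# print(json.loads(response["Body"].read()))
```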

The prompts are adapted from Kahneman and Tversky's research on the conjunction fallacy: the error of judging the conjunction of two events to be more probable than one of those events alone.
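Why it is a fallacy can be shown directly: for any events A and B, P(A and B) ≤ P(A) and P(A and B) ≤ P(B), since the joint outcome is a subset of each. A small simulation (with arbitrary illustrative probabilities, not taken from the gist) makes this concrete:

```python
import random

random.seed(0)

def simulate(trials: int = 100_000):
    """Estimate P(A), P(B), and P(A and B) for two independent events."""
    a = b = both = 0
    for _ in range(trials):
        event_a = random.random() < 0.6  # illustrative probability for A
        event_b = random.random() < 0.3  # illustrative probability for B
        a += event_a
        b += event_b
        both += event_a and event_b      # counted only when A and B co-occur
    return a / trials, b / trials, both / trials

p_a, p_b, p_ab = simulate()
# The conjunction can never be more probable than either event alone.
assert p_ab <= p_a and p_ab <= p_b
```

The assertion holds by construction: the joint count can only increment when each individual count does. People (and, per this gist, Falcon-7B-Instruct) nevertheless rank the conjunction higher when it forms a more representative story.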

These hallucinations suggest that this Falcon LLM is susceptible to the conjunction fallacy.

[Screenshot: Falcon-7B-Instruct response to a conjunction-fallacy prompt]

[Screenshot: a second Falcon-7B-Instruct response exhibiting the same error]
