for file in $(git diff --cached --name-only | grep -E '\.(js|jsx)$'); do
  # Lint only the staged contents of the file, not any unstaged edits
  git show ":$file" | node_modules/.bin/eslint --stdin --stdin-filename "$file"
  if [ $? -ne 0 ]; then
    echo "ESLint failed on staged file '$file'. Please check your code and try again. You can run ESLint manually via npm run eslint."
    exit 1 # exit with failure status
  fi
done
Has anyone figured out a workflow for using the --fix flag during this pre-commit hook?
What I've found is that when --fix modifies staged files during the pre-commit hook, those fixes land in the working tree but aren't included in the commit.
It makes sense why they wouldn't be: --fix is changing a staged file, and those new changes would themselves have to be staged.
lint-staged has a partly-working fix for this; however, as they note, it doesn't work for partially staged files, i.e. files where only some of the hunks have been staged for the commit.
The @richistron one-liner will lint the changed files in the working tree, which is not necessarily what is being committed to Git if you have not staged every hunk in a file.
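For what it's worth, the --fix workaround people usually land on (a sketch, not an endorsement; it inherits exactly the partial-staging caveat above) is to re-stage each file after fixing it:

```shell
# Sketch of the --fix + re-stage workaround.  WARNING: `git add "$file"`
# stages the WHOLE file, so any hunks you deliberately left unstaged get
# swept into the commit -- only safe when each file is fully staged.
# Assumes filenames contain no spaces.
for file in $(git diff --cached --name-only --diff-filter=ACM | grep -E '\.(js|jsx)$'); do
  node_modules/.bin/eslint --fix "$file" || exit 1
  git add "$file"   # re-stage so the auto-fixes are included in the commit
done
```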
The script above works well enough for a handful of changes, but on large projects, when something like a project-wide search and replace is performed, it takes ages because the files are piped to ESLint one at a time. Linting the whole project takes only 20 seconds, but the commit hook takes 4 minutes! Is there a way to batch the files instead of invoking ESLint once per file?
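One way to batch it (a sketch under my own assumptions, not something from this thread): materialise the staged contents of all matching files into a temporary directory with `git checkout-index`, then run ESLint once over the whole set. This keeps the "lint what is staged, not what is on disk" property of the `git show` approach while paying the ESLint startup cost only once. Whether ESLint resolves your project config and plugins correctly from a temp directory depends on your setup, so treat that part as untested.

```shell
#!/bin/sh
# Sketch: lint all staged .js/.jsx files in ONE ESLint invocation.
# Assumes filenames contain no spaces.

staged=$(git diff --cached --name-only --diff-filter=ACM | grep -E '\.(js|jsx)$')
[ -z "$staged" ] && exit 0

tmpdir=$(mktemp -d)
trap 'rm -rf "$tmpdir"' EXIT

# checkout-index writes the *staged* (index) contents, so partially staged
# files are linted exactly as they will be committed, not as on disk.
printf '%s\n' $staged | xargs git checkout-index --prefix="$tmpdir/" --

# One ESLint process for the whole batch instead of one per file.
(cd "$tmpdir" && exec "$OLDPWD/node_modules/.bin/eslint" $staged)
```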