These are some simple Bash functions and scripts for making CSV/TSV files prettier on the command line.
See http://stefaanlippens.net/pretty-csv.html for more information.
#####################################################
# Bash functions to put in .bashrc or .bash_aliases #
#####################################################

# For Debian/Ubuntu
function pretty_csv {
    column -t -s, -n "$@" | less -F -S -X -K
}

function pretty_tsv {
    column -t -s $'\t' -n "$@" | less -F -S -X -K
}

# For non-Debian systems
function pretty_csv {
    perl -pe 's/((?<=,)|(?<=^)),/ ,/g;' "$@" | column -t -s, | less -F -S -X -K
}

function pretty_tsv {
    perl -pe 's/((?<=\t)|(?<=^))\t/ \t/g;' "$@" | column -t -s $'\t' | less -F -S -X -K
}
#!/bin/bash
perl -pe 's/((?<=,)|(?<=^)),/ ,/g;' "$@" | column -t -s, | exec less -F -S -X -K
#!/bin/bash
perl -pe 's/((?<=\t)|(?<=^))\t/ \t/g;' "$@" | column -t -s $'\t' | exec less -F -S -X -K
@yangyxt: given the "line ... is too long" warnings, I suspect your file is corrupt, or at least has very wide columns.
These are fantastic, @soxofaan. You just saved me a lot of hassle in viewing some long-running query results.
Thank you so much for sharing them!
Just grabbed your script. It's very nice. Thanks for sharing.
Thanks, great stuff.
I have commas inside string fields like this:
a,b,c
4207416,"[0,9,3,3,0,0,0,0,0,0]",56.0
So I threw in an awk pass to replace the separators outside strings with semicolons (using @John1024's answer):
function pretty_csv {
perl -pe 's/((?<=,)|(?<=^)),/ ,/g;' "$@" \
| awk -F\" '{for (i=1; i<=NF; i+=2) gsub(/,/,";",$i)} 1' OFS='"' \
| column -t -s';' \
| less -F -S -X -K
}
which results in:
a b c
4207416 "[0,9,3,3,0,0,0,0,0,0]" 56.0
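To see why the awk pass works: with `-F\"` each line is split on double quotes, so odd-numbered fields lie outside the quotes and even-numbered ones inside; the `gsub` only rewrites the odd ones. A quick check on the sample row from above:

```shell
# Commas outside the quotes become semicolons; the quoted ones survive.
printf '4207416,"[0,9,3,3,0,0,0,0,0,0]",56.0\n' \
  | awk -F\" '{for (i=1; i<=NF; i+=2) gsub(/,/,";",$i)} 1' OFS='"'
# -> 4207416;"[0,9,3,3,0,0,0,0,0,0]";56.0
```

One caveat: if a field legitimately contains a semicolon, it will now be treated as a separator by the later `column -t -s';'` step.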
Hi, this is a great utility. Thank you.
When working with large files we can use the pager to scroll through all the data, but the header (i.e. the first line) scrolls out of view after the first page.
Is it possible to modify your function so that pretty_csv outputs a table with a "frozen" header, one that stays visible while you scroll through the data?
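One possibility, assuming a reasonably recent pager: `less` version 600 and newer has a `--header=N` option that pins the first N lines on screen while scrolling. A sketch of the Debian/Ubuntu variant with that flag (name `pretty_csv_frozen` is mine, and this is untested on older `less` versions, which will reject the option):

```shell
# Requires less >= 600 for --header; older versions error out on the flag.
function pretty_csv_frozen {
    column -t -s, -n "$@" | less -F -S -X -K --header=1
}
```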