@soxofaan
Last active January 19, 2024 17:48
Simple pretty CSV and TSV file viewer.
#####################################################
# Bash functions to put in .bashrc or .bash_aliases #
#####################################################

# For Debian/Ubuntu: column's -n option disables merging of adjacent
# delimiters, so empty fields keep their own column.
function pretty_csv {
    column -t -s, -n "$@" | less -F -S -X -K
}

function pretty_tsv {
    column -t -s $'\t' -n "$@" | less -F -S -X -K
}

# For non-Debian systems: column lacks -n there, so pad empty fields
# first. The perl step inserts a space before any delimiter that
# follows another delimiter or the start of the line.
function pretty_csv {
    perl -pe 's/((?<=,)|(?<=^)),/ ,/g;' "$@" | column -t -s, | less -F -S -X -K
}

function pretty_tsv {
    perl -pe 's/((?<=\t)|(?<=^))\t/ \t/g;' "$@" | column -t -s $'\t' | less -F -S -X -K
}
#!/bin/bash
# pretty_csv.sh: standalone script version of the pretty_csv function
perl -pe 's/((?<=,)|(?<=^)),/ ,/g;' "$@" | column -t -s, | exec less -F -S -X -K
#!/bin/bash
# pretty_tsv.sh: standalone script version of the pretty_tsv function
perl -pe 's/((?<=\t)|(?<=^))\t/ \t/g;' "$@" | column -t -s $'\t' | exec less -F -S -X -K
@heshriti

Hi, this is a great utility. Thank you.

When working with large files, we can use the pager to scroll through all the data. However, the header (i.e. the first line) is lost as soon as you scroll past the first page.

Would it be possible to modify your function so that pretty_csv outputs a table with a "frozen" header, i.e. one that stays visible while you scroll through the data?
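One possible approach (a sketch, not part of the gist): less gained a --header option in version 600, which keeps the first N lines of the input pinned while scrolling. A hypothetical variant of pretty_csv using it could look like:

```shell
# Sketch: keep the first line visible while scrolling.
# Requires less >= 600 for --header (check with `less --version`).
function pretty_csv_hdr {
    perl -pe 's/((?<=,)|(?<=^)),/ ,/g;' "$@" \
        | column -t -s, \
        | less -F -S -X -K --header=1
}
```

On systems with an older less, the header line simply has to be re-read from the top of the file.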

@yangyxt

yangyxt commented Jun 17, 2020

Thank you for providing this! But I couldn't get my TSV to display correctly with the pretty_tsv.sh script.
Here is what I get:
(screenshot)
Here is what the table originally looks like:
(screenshot)

It seems the table only shows the first column with pretty_tsv.

@soxofaan
Author

@yangyxt: given the "line ... is too long" warnings, I suspect that your file is corrupt, or at least has very wide columns

@Schoonology

These are fantastic, @soxofaan. You just saved me a lot of hassle in viewing some long-running query results.

Thank you so much for sharing them!

@bellma-lilly

Just grabbed your script. It's very nice. Thanks for sharing.

@regob

regob commented Oct 5, 2023

Thanks, great stuff.
I have commas inside string fields like this:

a,b,c
4207416,"[0,9,3,3,0,0,0,0,0,0]",56.0

So I threw in an awk step to replace the separators outside quoted strings with semicolons (using @John1024's answer):

function pretty_csv {
    perl -pe 's/((?<=,)|(?<=^)),/ ,/g;' "$@" \
        | awk -F\" '{for (i=1; i<=NF; i+=2) gsub(/,/,";",$i)} 1' OFS='"' \
        | column -t -s';' \
        | less  -F -S -X -K
}

which results in:

a        b                        c
4207416  "[0,9,3,3,0,0,0,0,0,0]"  56.0
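Why the awk step works: with -F\" the line is split on double quotes, so odd-indexed fields are outside quoted strings and even-indexed fields are inside them. The loop therefore rewrites commas only in the odd fields, and OFS='"' puts the quotes back on output. For example:

```shell
# Rewrite commas outside "..." to semicolons; commas inside are untouched.
printf '4207416,"[0,9,3,3,0,0,0,0,0,0]",56.0\n' \
    | awk -F\" '{for (i=1; i<=NF; i+=2) gsub(/,/,";",$i)} 1' OFS='"'
# → 4207416;"[0,9,3,3,0,0,0,0,0,0]";56.0
```

This assumes the quotes are balanced and fields contain no escaped quotes.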
