Last active
August 29, 2015 14:04
* ls *Page.h | while read one; do mv "$one" "$(echo "$one" | perl -ne 'print join("_", split(/(?=[A-Z])/))')"; done
* for i in `ls *.t`; do perl -pi.bak -e 'print "\$ENV{TEST_NGINX_CHECK_LEAK} = 1;\n" if /HUP/' $i; done
* sudo watch -n 5 pkill -USR1 unxz  # optionally add pkill -n (newest match) / -x (exact name match)
* TERM=xterm-256color emacs -nw
* for i in `seq 1 254` ; do arping -I enp9s0 -c 1 172.16.2.$i | grep reply ; done
* sudo tcpdump -i enp3s0 -nt -s 500 port domain
* du -sh .[^.]*
* To find all files modified in the last 24 hours (last full day) in a particular directory and its sub-directories:
* -> find /directory_path -mtime -1 -ls
* find . -mtime 0
* use Ctrl+Z to put vim in the background; use fg to bring it back when needed!
* DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" # get the directory of the current script
* How to recursively find and list the latest modified files in a directory with subdirectories and times?
```
Try this one:

#!/bin/bash
find "$1" -type f -exec stat --format '%Y :%y %n' {} \; | sort -nr | cut -d: -f2- | head

Execute it with the path to the directory where it should start scanning recursively. If there are
lots of files it may take a while before it returns anything. Performance can be improved if we
don't need to support file names containing whitespace; in that case you can use:

#!/bin/bash
find "$1" -type f | xargs stat --format '%Y :%y %n' | sort -nr | cut -d: -f2- | head

which is a bit faster.

Your "fast method" should also be able to use -print0 to support spaces and even linefeeds in
file names. Here's what I use:
find "$1" -type f -print0 | xargs -0 stat --format '%Y :%y %n' | sort -nr | cut -d: -f2- | head
This still manages to be fast for me.

Some directories I was looking in didn't allow me to stat them, so I made the following change
(to the 'fast' one) so the errors don't show up in the final output:
find "$1" -type f | xargs stat --format '%Y :%y %n' 2>/dev/null | sort -nr | cut -d: -f2-

On Mac OS X it's not GNU stat, so the command fails. You have to brew install coreutils and use
gstat instead of stat. -- CharlesB Mar 28 '13 at 10:56

You don't need to run stat, since
find PATH -type f -printf "%T@ %p\n" | sort -nr
does the job, and it's also a bit faster that way.

stat --printf="%y %n\n" $(ls -tr $(find * -type f))

find . -type f -printf '%T@ %P\n' | sort -n | awk '{print $2}'

To find all files whose status was last changed N minutes ago:
find -cmin -N

find . | while read FILE; do ls -d -l "$FILE"; done

# http://stackoverflow.com/questions/5566310/how-to-recursively-find-and-list-the-latest-modified-files-in-a-directory-with-s
```
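The GNU `find -printf` variants quoted above can be checked end to end in a scratch directory. This is a minimal sketch assuming GNU findutils and coreutils (on Mac OS X, install them via Homebrew first, as the thread notes); the file names are made up for illustration:

```shell
#!/bin/sh
set -e

# Scratch directory with one old and one new file (GNU `touch -d`).
dir=$(mktemp -d)
touch -d '2 days ago' "$dir/old.txt"
touch "$dir/new.txt"

# %T@ = mtime as epoch seconds, %P = path relative to the starting point.
# Sorting numerically in reverse puts the most recently modified file first.
find "$dir" -type f -printf '%T@ %P\n' | sort -nr | awk '{print $2}'
# new.txt
# old.txt
```

Because `find` prints the timestamps itself, no extra `stat` invocation per file is needed, which is why the thread calls this the faster approach.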