How to recover and sort photos from a broken disk, or restore deleted photos
Every recovery attempt should start with making an image of the hard drive, SD card, or CD/DVD. This way you can't make things any worse than they already are, for example by damaging the data further or causing complete hardware failure.
How to use ddrescue (gddrescue package)
Please refer to the ddrescue manual; you can combine these commands according to your needs. You may well get everything on the first try. Just check whether each pass helps reduce the errsize. Do not change the log file or image destination between runs: ddrescue fills in only the missing pieces, and the information about the missing pieces is stored in the log file.
# First, let's just try to get most of the data.
ddrescue -n -b 2048 /dev/sdN /location/to/image.dd /location/to/logfile.log
# Now retry the bad areas, reading the disk directly.
ddrescue -r 2 -d /dev/sdN /location/to/image.dd /location/to/logfile.log
# Now from the back to the front.
ddrescue -R -A -n -b 2048 /dev/sdN /location/to/image.dd /location/to/logfile.log
# Now from the back to the front again, with retries.
ddrescue -R -r 2 -d /dev/sdN /location/to/image.dd /location/to/logfile.log
# I would also recommend the -i (input position) option if you are up to it.
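For reference, the log (map) file is what lets you resume: it records which areas have already been rescued, so a later run retries only what is missing. A roughly representative example (offsets invented; `+` means rescued, `-` bad sectors, `?` not tried yet):

```
# Mapfile. Created by GNU ddrescue
# current_pos  current_status
0x00120000     ?
#      pos        size  status
0x00000000  0x00100000  +
0x00100000  0x00020000  -
0x00120000  0x3FEE0000  ?
```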
How to use foremost to restore jpg images from an image or drive
It is very straightforward. You can add or remove -t options if you want other file types besides jpg.
foremost -t jpg -i /location/to/image.dd -o /location/to/store/found/images
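Under the hood, foremost works by scanning the raw bytes for known file signatures; a JPEG, for instance, starts with the bytes FF D8 FF and ends with FF D9. A minimal sketch of that idea in plain shell (the file names here are invented for the demo; real carving also finds the footer and handles fragmentation, which foremost does for you):

```shell
# Build a fake "disk image" with a JPEG-like signature buried inside it.
printf 'XXXX\xff\xd8\xffJPEGDATA\xff\xd9' > blob.dd
# Locate the header signature; -a treats binary as text, -b prints byte offsets.
offset=$(grep -aboP '\xff\xd8\xff' blob.dd | head -n1 | cut -d: -f1)
echo "JPEG header found at byte offset $offset"
# Carve from the header onwards (foremost would also trim at the FF D9 footer).
dd if=blob.dd of=carved.jpg bs=1 skip=$offset 2>/dev/null
```

This is only the core trick; use foremost itself for anything real.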
Sort found images by dimensions into folders
So after recovering images from a broken disk I ended up with a pile of unsorted images. Sorting them into folders by image dimensions helped a lot to distinguish album art and thumbnails from actual photos. You will also need ImageMagick installed.
cd /folder/with/pictures
ls -1 | while read line
do
  size=`identify "$line" | cut -d" " -f3`
  if [ "$size" != "" ]
  then
    if [ ! -d "$size" ]
    then
      mkdir "$size"
    fi
    echo "Moving $line to folder $size"
    mv "$line" "$size/"
  else
    echo "Skipping $line"
  fi
done
Bonus script: remove folders with resolution less than N
Many of those restored files are obviously not photographs and I do not want them. I used this after the previous script to speed up the process. The folder name format has to be "<width>x<height>". This script removes all folders whose width or height is less than 200px.
cd /folder/with/pictures_folders
ls -1 | while read line
do
  x=`echo $line | cut -d"x" -f1`
  y=`echo $line | cut -d"x" -f2`
  if [ -d "$line" ]
  then
    if [ $x -lt 200 -o $y -lt 200 ] # change accordingly, currently 200px
    then
      echo "Removing $line, size too small"
      rm -rf "$line" # remove this line if you want to make a test run, recommended!
    fi
  fi
done
Bonus script: remove files that have the same md5
Since I used the disk for backups, I restored tens of pictures that are identical. To solve this, I made a script that sorts the files in each folder by size and compares each file with the previous one using md5sum. If the files are the same, the current one is removed. Since it scans the folder structure created above, identical files already sit in the same folder (their dimensions are the same), which speeds up the process a lot.
lmd5="startvalue"
lsize=0
lpic="none"
ls -1 | while read line
do
  ls -1S "$line" | while read pic
  do
    size=`stat -c%s "$line/$pic"`
    if [ $size -ne $lsize -a $size -ne 0 ]
    then
      lsize=$size
      lpic=$pic
      echo "New file $pic"
    else
      lmd5=`md5sum "$line/$lpic" | cut -d" " -f1`
      cmd5=`md5sum "$line/$pic" | cut -d" " -f1`
      echo "$lpic $lmd5 $pic $cmd5"
      if [ "$lmd5" == "$cmd5" -a "$lmd5" != "" ]
      then
        echo "File $line/$lpic and $line/$pic are the same, removing."
        rm -f "$line/$pic" # remove this line if you want to make a test run, recommended!
      else
        echo "New file $pic"
        lsize=$size
        lpic=$pic
      fi
    fi
  done
done
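If you prefer a simpler pipeline over the size-then-hash walk above, hashing every file once and grouping identical hashes finds the same duplicates. A hedged sketch on an invented demo folder (point the find at your real picture folders instead):

```shell
# Create a throwaway folder structure with two identical files (demo only).
mkdir -p demo/100x100
printf 'samedata' > demo/100x100/a.jpg
printf 'samedata' > demo/100x100/b.jpg
printf 'otherdata' > demo/100x100/c.jpg
# md5sum output starts with a 32-character hash, so uniq -w32 groups by hash;
# -D prints every line whose hash also appears on another line.
find demo -type f -exec md5sum {} + | sort | uniq -w32 -D
dupes=$(find demo -type f -exec md5sum {} + | sort | uniq -w32 -D | wc -l)
echo "$dupes files share their content with another file"
```

Unlike the script above, this only reports the duplicates; deciding which copy to delete is left to you.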