Our create-on-first-access behavior can lead to many junk sites. Here we look at safe strategies for removing them.
Our general strategy is to work within the .wiki directory, find empty or missing things, then remove the site that holds them. We use perl to convert find's output into the commands we want, then pipe those into bash.
find ... | perl -ne ... | bash
We dry-run these commands by omitting the final bash.
# Empty
We feel free to discard sites with no pages.
find . -name pages -empty |\
perl -ne 'chop; s/\/pages//; print "rm -r $_\n"'
Beware: an empty ./pages at the top of the farm will emit rm -r . and delete everything.
Beware: a site directory named exactly 'pages' will itself match the find.
Beware: shell metacharacters like * in a subdomain name survive into the generated command.
# Unclaimed
We might look inside unclaimed sites.
find . -name status \! -exec test -e '{}/owner.json' \; -print |\
perl -ne 'chop; s/\/status//; print "ls $_/pages\n"'