Lately, I’ve been finding an unanticipated (but not entirely surprising) use for Wikipedia: as a richer, extended version of Unix man pages. Who needs man man when you have Manual page (Unix)? I came to this conclusion while looking at the wget man page in my terminal last night, when I kept having to re-run man wget as I paged past the options I wanted. Of course, man pages have been on the web for a long time (as the link in my prior sentence shows), but Wikipedia offers a lot more, at least in the case of wget. You learn how to use wget, but you also get criticisms of wget, its history, and information on its development and release cycle, among other things.
From now on, man pages will be the last place I look. If I need to know how to use wget again, it won’t be man wget; it will be a search for “wget site:wikipedia.org”. Try “tar site:wikipedia.org” or “gzip site:wikipedia.org” to see it working with other commands.
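If you want to make that a one-liner, here’s a minimal sketch of a shell function that opens the site-restricted search in a browser. The wikiman name is my own invention, and it assumes a Linux desktop with xdg-open (macOS users would swap in open):

    # Hypothetical helper: open a Wikipedia-restricted web search for a command.
    # Assumes xdg-open is available; on macOS, use "open" instead.
    wikiman() {
        xdg-open "https://www.google.com/search?q=$1+site:wikipedia.org"
    }

Then it’s just wikiman wget, wikiman tar, and so on.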
Neat trick – I hadn’t tried that. I’ve been using Wikipedia to find out about domain name TLDs. I saw a domain ending in .st recently; a trip to http://en.wikipedia.org/wiki/.st told me that it’s the ccTLD for São Tomé and Príncipe, and where to buy one as well.
I do not really see your point, as the Wikipedia page I’m getting contains way less information than the man page.
And actually, if you set the environment variable PAGER to less, you *can* scroll upwards. I use the pager “most”, which has the added benefit of displaying man pages in color.
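For example (assuming a Bash-style shell, and that most is installed), something like this in ~/.bashrc does the trick:

    # Use less as the default pager so you can scroll back up.
    export PAGER=less
    # man checks MANPAGER before PAGER; point it at most for colored man pages.
    export MANPAGER=most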