Robots.txt is a file that tells crawling robots (search engine spiders, etc.) which pages they may index and which they should skip. Since your friend's site had a robots.txt file (which PostNuke generates automatically), search engines were kept from indexing his other pages (members, etc.).
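As a rough illustration (the paths here are hypothetical, not necessarily what PostNuke actually writes), a robots.txt that blocks member pages from all crawlers might look like:

    User-agent: *
    Disallow: /modules/Members/
    Disallow: /user.php

The "User-agent: *" line means the rules apply to every well-behaved crawler, and each "Disallow" line names a path the crawler should not fetch. Note it's only a polite request - crawlers that ignore the standard can still read those pages.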