Currently there are two beer rating sites on the 'net (at least that I'm aware of): RateBeer.com and BeerAdvocate.com. Personally I prefer the former, in part because, after several run-ins, I've found one of the proprietors of the latter to be less than a nice guy. But both are chock-full of good content. Both tell you which beer is which and where to find it, be it store, pub, etc., plus links to maps and more. A wealth of knowledge.
Then we come to ratings and the inherent problems of a user base that may or may not have any Beer Judge Certification Program (BJCP) training whatsoever. With some users, it's "this beer tastes great" and a high rating follows, without further elaboration or analysis; likewise, an otherwise decent beer (or one that's just a flawed batch and not the best example of the beer) may get a "this tastes like shit!" and a correspondingly low rating.
To me, a user's ratings should carry a weight determined by their experience. For example, a newb with only limited personal tasting experience should carry little weight, with that weight increasing as the user gains experience, while someone with BJCP training (e.g. a National Judge) should carry A LOT of weight. A nice descriptor like "aroma of honeysuckle and raisins" is good, but it does little to tell you how the beer fits into its style category.
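To make the idea concrete, here's a minimal sketch of how experience-weighted ratings could work. This is my own toy example, not how either site actually operates, and the weight values are entirely made up:

```python
# Hypothetical weights: a newb counts a little, a BJCP National Judge a lot.
# These numbers are illustrative, not from either site.
EXPERIENCE_WEIGHTS = {
    "newb": 1.0,
    "experienced": 3.0,
    "bjcp_national_judge": 10.0,
}

def weighted_rating(ratings):
    """ratings: list of (score, experience_level) tuples.

    Returns the weighted average score, or None if there are no ratings.
    """
    total_weight = sum(EXPERIENCE_WEIGHTS[level] for _, level in ratings)
    if total_weight == 0:
        return None
    weighted_sum = sum(score * EXPERIENCE_WEIGHTS[level]
                       for score, level in ratings)
    return weighted_sum / total_weight

# Example: two newbs rave about a beer, one National Judge is lukewarm.
scores = [(5.0, "newb"), (5.0, "newb"), (3.0, "bjcp_national_judge")]
print(round(weighted_rating(scores), 2))  # the judge's 3.0 dominates: 3.33
```

A plain average of those three scores would be 4.33; weighting by (hypothetical) experience pulls it down to 3.33, much closer to the trained judge's assessment.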
Probably more a measure of people's preference than stylistic accuracy...
Luckily, there are ways with computers to address that sort of thing---to an extent. Both sites show average ratings and the number of users who rated each beer, and RateBeer even gives a standard deviation, straight from your Stat 101 class. But neither can address the discrepancy between a newb and a trained beer judge.
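Here's a quick illustration of what that standard deviation does and doesn't tell you (the numbers are made up, not real site data). It flags disagreement among raters, but it can't say whether the outliers are newbs or judges:

```python
import statistics

# Two hypothetical beers with the same average rating.
consensus = [4.0, 4.1, 3.9, 4.0]  # everyone roughly agrees
divisive  = [5.0, 3.0, 5.0, 3.0]  # same mean, but raters are split

print(statistics.mean(consensus), statistics.stdev(consensus))
print(statistics.mean(divisive), statistics.stdev(divisive))
# Both means are 4.0; the second stdev is far larger, flagging disagreement.
# But the number alone can't tell you WHO disagreed, or why.
```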
So enjoy the sites and use the info, but take it with a grain of salt. Remember, human nature plays a huge part here.