BIP #11: The Results

by Kim on June 23, 2009

Finally, the results!

The BIP Week #11 project was to dissect some book reviews by the numbers. The gist was to go to this Google Docs spreadsheet and answer some questions based on two book reviews — one of your own, and one from a professional organization.

Questions included how many personal references there were, how many paragraphs the review had, and how many sentences it contained.

Six people contributed to the spreadsheet, which isn’t as many as I would have liked, but I still noticed some interesting things when I ran the averages:

  • Book blogger reviews had between 4 and 14 paragraphs each — most had around 7. Professional reviews, on the other hand, varied a lot in length: some were just 1 paragraph, some were about 7-9, and others ran to 11.
  • Similarly, book blogger reviews all had about the same number of sentences (an average of 22.9, with the highest and lowest at 33 and 12). Professional reviews had a bigger range — the shortest had 6 sentences, while the longest had 70.
  • Book blogger reviews generally use fewer words — about 432 per review compared to 549. Again, the professional reviews had a larger range of values.
  • Sentence length and paragraph length, when averaged, were about the same for blogger reviews and professional reviews.
  • Most professional reviews don’t include a rating system, but a lot of blogger reviews do.
  • The biggest difference between blogger reviews and professional reviews was something we already sort of knew — bloggers use a lot more personal references. From these reviews, bloggers averaged about 9.75 personal references per review. Professional reviews had only about 0.25.

One thing I noticed is that professional reviews are a lot more varied — some are just a paragraph, some are basically full-length articles. Blogger reviews tend to be a little more consistent — similar lengths, etc. If I had to do this over again I’d want more contributions, obviously, because then there would have been a bigger sample size for the averages.

I was also hoping that the numbers for stuff like paragraph length would be more different, since blogger reviews are almost always online and online reading habits suggest that people scan online and therefore need shorter paragraphs. Maybe this suggests that even though bloggers are online, we still think a lot about writing in a way that’s easy to read in print.

What do you think of these findings? Do they make sense to you? Are there other qualities we should have looked at to distinguish blogger reviews from professional reviews?
