Getting your content out through a blogging system is far more effective for SEO than through any other content management system, because a vast network of new web crawlers is out there competing on many levels to pick up and carry your content. Yahoo, Google, and Technorati are all competing to pick up blog posts first.
News aggregation sites specializing in verticals are competing to pick up content on particular subjects. You can see this in technical topics like Java and Ajax right now, but the trend will trickle into every industry, as it is the lazy way to create those “authority sites” that are so important for SEO. In the past, spammers were doing “search scraping” to accomplish much the same thing, but mostly to have pages on which to hide AdSense.
Today, RSS feeds are being combined to make much more professional and sensible sites. But the same dynamic applies: they are attempting to intermediate communication. Not long ago, the concept of disintermediation was a buzzword, due to manufacturers’ sudden ability, thanks of course to the Internet, to sell directly without going through distributors or retail outlets. Today, content is a product, often given away for free by bloggers who put entire articles into news feeds, and that is an open invitation to intermediate. Just as with spamming, there is little to no cost or risk, and there is a great deal of upside.
The answer is simply to configure your blogging software to include only a portion of each post in the RSS feed. Whenever you post, all the blog crawlers will still pick up the new content, but the only full copy of the search-optimized content will exist on your site. The news aggregation sites will get your short excerpt, which is fine, because it becomes a highly desirable non-reciprocal link back to you.
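Most blogging platforms expose this as a setting (for example, a "summary only" option for feeds), but the mechanics are simple enough to sketch. The following is a minimal illustration, not taken from any particular blog platform: the `excerpt` helper and the feed-building function are hypothetical names, and the word limit is an arbitrary choice. It shows the one idea that matters here: the RSS `<description>` carries only a truncated excerpt, while the `<link>` points back to the full post on your site.

```python
# Sketch of an excerpt-only RSS feed item, using only the standard library.
# Function names, the word limit, and the example URL are illustrative
# assumptions, not part of any real blogging platform's API.
import xml.etree.ElementTree as ET


def excerpt(text: str, max_words: int = 40) -> str:
    """Return roughly the first max_words words, with an ellipsis if cut."""
    words = text.split()
    if len(words) <= max_words:
        return text
    return " ".join(words[:max_words]) + " ..."


def build_feed_item(title: str, link: str, body: str) -> ET.Element:
    """Build an RSS <item> whose description is an excerpt, not the full post."""
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    # The full, search-optimized copy lives only at this URL on your site.
    ET.SubElement(item, "link").text = link
    # Aggregators that republish the feed get only the short excerpt.
    ET.SubElement(item, "description").text = excerpt(body)
    return item


post_body = ("word " * 100).strip()
item = build_feed_item("My Post", "https://example.com/my-post", post_body)
feed_xml = ET.tostring(item, encoding="unicode")
```

Any site that republishes this feed ends up showing a teaser paragraph and linking to your page for the rest, which is exactly the non-reciprocal link described above.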