An Argument FOR Content Shock: Improved Internet Filters, Worcester MA



23 Feb An Argument FOR Content Shock: Improved Internet Filters

I read an extremely well-thought-out and well-written blog post today arguing six reasons to avoid content shock. Content shock is best defined as content saturation and consumer resistance to it. Basically, consumers are so overwhelmed by the amount of “noise” the internet currently produces that they build a shield to protect themselves from the onslaught of content.

There is no doubt that content overload exists. I do, however, believe that quality content (meaning relevant, engaging, and transparent content in the form of text, images, and video combined) will always prevail, at least with those consumers who truly desire to be informed when making purchasing decisions. Those building the shields mentioned in the article, I consider to be tire kickers.

The article mentions that improved content “filters” would help us focus on better and more relevant content. The author then negates this argument in favor of content by stating:

But better filters will not create more hours in the day and, in fact, will hasten the decline for marginal content producers. So the “filter argument” actually reinforces the idea that the status quo is not sustainable for some businesses and the costs will go up for those who need to keep making it through ever-tightening filters.

To me, filters are the stepping stone to solving the content overload problem.

But they have not matured to the point of being effective . . . at least not yet. Currently, filters merely locate the websites that best match the combination of keywords being searched and return that information to the IP address of the consumer who searched them. Moving forward, filters should also be able to track whether that online user actually completed a call to action, made a purchase, etc. With the advances we have in technology, spider bots should be able to return information to the hypothetical Google file cabinet about that particular consumer’s online activities, so that the user’s case file can read “case closed.”

For example, I recently made an online search to purchase a vehicle and subsequently made said purchase. I no longer need to be getting a boatload of emails about the latest discounts being offered by car manufacturers. Instead, I need information on discounts for oil changes, suggested places to go for my routine maintenance, or even a fun website to purchase accessories for my new ride. Again, what I do not need is more information about making a car purchase. I made the purchase already.
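The “case closed” idea above can be sketched in a few lines of code. This is purely illustrative: the class and method names (`UserProfile`, `record_conversion`, and the category strings) are all hypothetical, invented for this example, not part of any real search engine’s system.

```python
# Hypothetical sketch of a filter that closes a "case file" after a
# conversion, so the user stops seeing pre-purchase content and starts
# seeing post-purchase content instead. All names here are invented.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    interests: set = field(default_factory=set)   # open case files
    completed: set = field(default_factory=set)   # closed cases

    def record_search(self, category: str) -> None:
        # A search opens a case file, unless that case is already closed.
        if category not in self.completed:
            self.interests.add(category)

    def record_conversion(self, category: str, follow_ups: list) -> None:
        # Purchase made: close the case and pivot to post-purchase topics.
        self.interests.discard(category)
        self.completed.add(category)
        self.interests.update(follow_ups)

    def should_show(self, category: str) -> bool:
        return category not in self.completed


user = UserProfile()
user.record_search("new-car-deals")
user.record_conversion("new-car-deals",
                       ["oil-change-coupons", "car-accessories"])
print(user.should_show("new-car-deals"))       # False: case closed
print(user.should_show("oil-change-coupons"))  # True: post-purchase topic
```

The point of the sketch is the pivot: once the conversion is recorded, the filter suppresses the completed category and promotes the follow-up categories instead.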

The problem is the internet is showing us what it thinks we want to see, but not necessarily what we need to see. – Eli Pariser

This brings me to my second argument: the limitations of search engine filters.

As web companies strive to tailor their services (including news and search results) to our personal tastes, there’s a dangerous unintended consequence: We get trapped in a “filter bubble” and don’t get exposed to information that could challenge or broaden our worldview.

Google tracks each online user’s search history and uses it to personally tailor your query results. It utilizes multiple signals to do so, ranging from what kind of computer you are on, to what browser you are using, to where you are located. Assumptions are made from each of these factors, so your query results are being decided for you. Facebook looks at which friends’ pages you visit most frequently. If the pages you tend to visit are liberal and you don’t often look at your conservative friends’ pages, your conservative friends are edited out of your news feed.
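To make the signal idea concrete, here is a toy scorer showing how signals like device and location might nudge a result’s ranking. This is not Google’s actual algorithm; the function name, the signal keys, and the weights are all made up for illustration.

```python
# Toy personalization scorer: each signal that matches a tag on the
# result nudges the relevance score upward. Weights are invented.

def personalize_score(base_relevance, signals, result_tags):
    score = base_relevance
    # Device signal: boost mobile-friendly pages for mobile users.
    if signals.get("device") == "mobile" and "mobile-friendly" in result_tags:
        score += 0.1
    # Location signal: boost results tagged with the user's location.
    if signals.get("location") and signals["location"] in result_tags:
        score += 0.2
    return score


signals = {"device": "mobile", "location": "worcester-ma"}
tags = {"mobile-friendly", "worcester-ma"}
print(round(personalize_score(0.5, signals, tags), 2))  # boosted score
print(round(personalize_score(0.5, {}, tags), 2))       # no signals, no boost
```

Note what this toy makes visible: two users issuing the identical query see different rankings purely because of who and where they are, which is exactly the mechanism behind the filter bubble.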

Develop smarter, more transparent filters. Don’t just show us what is relevant. Show us what challenges our searches and offers another view or a better product.

Today’s internet gatekeepers, like the newspaper editors of yesteryear, need to encode that same kind of journalistic integrity into the algorithms they are writing. Consumers should be able to decide what gets through and what does not. Only then will the issue of content overload become moot.

Hopefully, the junk, spammy content will find itself a dying breed. And as far as I am concerned . . . it can’t happen fast enough!
