Somehow, for naysayers, the notion that tablets are limited to a consumption-only role in the computing ecosystem is enough to relegate them to also-ran status, and to eventual obsolescence.
This faction couldn't be more wrong, and recent developments in the tablet and mobile space are beginning to make it clear that even simple, good old-fashioned content consumption will be changing greatly in the near future.
This sea change will pivot around three elements: content ingestion, contextual relevancy, and sensors. App developers are already beginning to use all three to enhance our overall experiences on tablet and mobile devices.
First, consider the content provided by Flipboard or Pocket (the service formerly known as ReadItLater). Both take content from other media sources and present it more or less as their own. In the case of Pocket, users are flagging content for later consumption. But both services, as well as services like Zite, OnSwype, Pulse, and News 360, are stripping out some branding and ad elements from the source content.
And, if Flipboard's most recent announcement is any indication, these services will even begin ingesting curated audio from the likes of NPR, SoundCloud, and PRI for consumption.
If you're an editorial outlet, this is scary business as well as a gigantic grey area. Confusing matters further, it's also an opportunity of sorts. But at what cost? For a new outlet, it may be worth getting stories placed in a news aggregation service. But will it pay off in the long term if readers associate another overarching brand with your content?
And what to do about the fact that for many readers, these aggregated environments are pleasant experiences because of the special formatting?
At least in the case of Pocket, the content is being selected by readers for later consumption. In the case of Flipboard, it's not, and advertisers and publishers will suffer reduced engagement because of it.
News 360 commented earlier this week on the growing notion that news and content aggregators need to learn to play nice with content creators. At least audio stories will be clearly identified by the usual station identification that radio presenters provide. All too often, that will not be the case for standard text content.
Okay, now add in the profound impact that sensors such as microphones and cameras will have on contextual awareness and the more targeted delivery of content based on this context. It gets interesting fast.
More and more, publishers are talking about second-screen content that uses audio triggers to automatically cue up context-specific content directly related to what the viewing (or listening) audience is watching. We're already seeing it with MTV's Watch Along app and more.
It's hard to imagine happily using a second-screen experience while watching a movie. Who wants to miss scenes while looking up background information about the set design or the director's instructions to the actors? (Short answer: some people do want to see this stuff, just as a certain percentage of DVD and Blu-ray buyers actually watch the director's commentary.)
However, it's easy to imagine using such automated contextual awareness in sports, video games, or other types of content.
And, as developers begin to figure out how these on-tablet sensors can work in conjunction with these devices' other strengths, this phenomenon could be the biggest game changer of all, transforming our content consumption into something that's never been experienced before.
Another way sensors will affect contextual awareness is embodied by Chameleon, a Kickstarter project that aims to let users create and define entirely new adaptive home screens for Android devices based on GPS location, Wi-Fi network, and/or the time of day.
This last usage would even come in handy in corporate environments.
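To make the idea concrete, the kind of rule matching Chameleon describes could be sketched roughly as follows. This is a hypothetical illustration only; the class names, fields, and matching logic are assumptions for the sake of example, not Chameleon's actual API or implementation.

```python
# Hypothetical sketch of context-based home-screen selection, in the
# spirit of Chameleon's GPS / Wi-Fi / time-of-day rules.
from dataclasses import dataclass
from typing import Optional, Tuple, List


@dataclass
class Context:
    """A snapshot of the device's current situation."""
    location: Optional[str]   # e.g. a named GPS geofence such as "office"
    wifi_ssid: Optional[str]  # currently connected Wi-Fi network, if any
    hour: int                 # local hour of day, 0-23


@dataclass
class Rule:
    """One user-defined rule: if the conditions match, show this screen."""
    home_screen: str
    location: Optional[str] = None
    wifi_ssid: Optional[str] = None
    hour_range: Optional[Tuple[int, int]] = None  # (start, end), inclusive

    def matches(self, ctx: Context) -> bool:
        # Each condition is optional; unset conditions always pass.
        if self.location is not None and ctx.location != self.location:
            return False
        if self.wifi_ssid is not None and ctx.wifi_ssid != self.wifi_ssid:
            return False
        if self.hour_range is not None:
            start, end = self.hour_range
            if not (start <= ctx.hour <= end):
                return False
        return True


def pick_home_screen(rules: List[Rule], ctx: Context, default: str = "default") -> str:
    """First matching rule wins; otherwise fall back to a default screen."""
    for rule in rules:
        if rule.matches(ctx):
            return rule.home_screen
    return default


rules = [
    Rule(home_screen="work", wifi_ssid="CorpNet", hour_range=(9, 17)),
    Rule(home_screen="home", location="house"),
]
ctx = Context(location=None, wifi_ssid="CorpNet", hour=10)
print(pick_home_screen(rules, ctx))  # "work"
```

The "work" rule in this sketch is exactly the corporate scenario: on the office Wi-Fi during business hours, the device would surface a work-oriented home screen.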
Then there was the news that ads on the Kindle Fire's home screen could run in the $600k range. That's winning tablet revenue, although it's not clear whether Amazon will be offering an even lower-cost 'Special Offers' version of its e-reading tablet.
There was tough tablet news swirling around the search engine giant this week. Google aims to launch five Android 5.0 "Jelly Bean" tablets and smartphones by the end of the year.
If you're keeping score, this means that if Google has its way, there will be four different OS variants on Google tablets by the end of 2012. That's terrible. Many tablet owners haven't even got their hands on Android 4.0 Ice Cream Sandwich yet.
Not surprisingly, news broke that in the near future, Android tablet and smartphone manufacturers may begin to follow Amazon's lead and break from Google's rules governing modifications to the OS.
Finally, adding to the confusion, rumours also began floating that one of the aforementioned Jelly Bean tablets might be a low-cost ($199) 7in device manufactured in partnership with Samsung.
I'm confused. Did Google's Android team miss the company's acquisition of Motorola earlier in the year? Shouldn't Google be releasing prototypes in partnership with its own in-house tablet manufacturer?
This level of inconsistency can't help Google's Android efforts in any way, particularly with a Windows 8 launch coming shortly.
This is a little farther afield than most This Week In Tablets column wraps, but have you ever wondered what the future of tablet/touch computing might look like in 10 years, or how mobile computing might transcend the relatively small 10in screen down the road?
Google's latest patent application, which describes using a ring, bracelet, or fingernail coated with an infrared-reflective layer as a control surface, is a sure sign that mobile manufacturers and app developers are looking at improving the content browsing experience even further.
George is a founding editor of TabTimes and currently works for Wikia.com as Director of Programming.
Originally published by: TabTimes.