Friday, May 2, 2014

Quantifying Rambles

Recently I came across a concept that, oddly enough, I had already taken for granted as a fact of life: I encounter it in my daily routines and pretty much everywhere else I turn.

I'm talking about the myriad little devices tracking here, there and everywhere, from time to calories, from the steps we take to where we are and where we go. The concept behind it all, I've come to learn, is called QS, or quantifying self.

Being both a sci-fi fan and a techie, I quickly inventoried a little list of hot wearable technology:
- watches
- cell phones
- fit bands
- tablets
- chips

... and then raced ahead to what else is in prototype mode and/or about to hit the market...
- tracking rings and fit bracelets

... all capturing, analyzing and measuring themselves or any one person in a given context or scenario.

So then my train of thought took a gentle curve, and I began wondering what life would be like soon enough, and what I'd be spending my time using in my working environment. How would I physically handle all this semantic data, and what would connect me to the software I work with? The concept is not new; what's changing is the hardware.

From quantifying self to quantifying content, I think a standard software platform, proprietary or otherwise, will emerge to enable communication between devices and even between devices and people! We are probably already seeing some of it reflected in responsive design, where HTML markup is evolving to describe context and carry descriptive attributes, echoing the principles behind XML.
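To make that a little more concrete, here is a tiny sketch of what "descriptive attributes" on a piece of content can look like in practice; the data-* names below are purely illustrative, not any published standard.

    // A snippet of marked-up content carrying descriptive data-* attributes.
    const markup = `
      <article data-topic="african-orchids" data-audience="marketing" data-reading-time="3">
        Notes from the botanical garden for a spring campaign.
      </article>`;

    // Parse it and read the attributes back as plain key/value data.
    const doc = new DOMParser().parseFromString(markup, "text/html");
    const article = doc.querySelector<HTMLElement>("article");

    if (article) {
      // dataset exposes every data-* attribute, camel-cased, as a string.
      console.log(article.dataset.topic);                // "african-orchids"
      console.log(article.dataset.audience);              // "marketing"
      console.log(Number(article.dataset.readingTime));   // 3
    }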

Already today, I can connect to SDL Tridion using my phone, my tablet or my laptop, so why not, in the near future, from anything else I might happen to wear? For instance, I could be walking through a botanical garden wearing the new Google Glass, be presented with more in-depth information about the flowers I'm looking at, and feel inspired to create a new marketing campaign spun from African orchids. I could press my bracelet and open a projection of the GUI, which by then may have morphed into a semantic taxonomy of sorts, ready for me to begin quantifying content: tip-tap, a few quick selections, a few key sentences typed, and off it goes, workflowed for approval and review, prepped for publishing in the cloud.

Perhaps there is an even shorter path: content from my web_of_things could arrive in the Content Manager already prepped with all its data, and I would simply instruct it, at a touch, to insert itself into the software for further tracking, management, analysis and use.
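Under the hood, I imagine something like this rough sketch: a small client on whatever I'm wearing pushing an item, already tagged with its data, into the content system over HTTP. The endpoint, payload and token are invented for illustration; this is not an actual SDL Tridion API.

    // Hypothetical wearable client posting a pre-tagged content item.
    // The URL, fields and token are placeholders, not a real CMS interface.
    async function pushContentItem(title: string, body: string): Promise<void> {
      const response = await fetch("https://cms.example.com/api/content-items", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "Authorization": "Bearer <token>",   // placeholder credential
        },
        body: JSON.stringify({
          title,
          body,
          tags: ["african-orchids", "spring-campaign"],   // the "quantified" bits
          capturedFrom: "wearable",
          capturedAt: new Date().toISOString(),
        }),
      });
      if (!response.ok) {
        throw new Error(`Content manager rejected the item: ${response.status}`);
      }
    }

    pushContentItem("African orchid campaign", "First notes from the botanical garden...")
      .catch(console.error);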

The technology is in motion; like spring, it has only just begun to blossom. Soon enough, intelligent, data-aware objects will drive much of what we are and do; set the wheels in motion and start thinking about what this might mean to you. Start classifying your content and quantifying your future business.

1 comment:

  1. Nice post, Elena! Especially when everyone's talking about contextual delivery rather than creation, it'll be interesting to see how "author" is even further redefined. I think "laziness" plus technology creates opportunities to further the Quantifying Self. For example, as a user, I don't want to go somewhere else to leave a comment/change/update/promote something here. :-)

    Beyond proprietary ways to connect, I think programmers and even the software itself will find ways to "understand" anything. Have you had a chance to check out Apache Stanbol?
    http://stanbol.apache.org/index.html

    Or try some plain text with a Stanbol enhancer:
    http://dev.iks-project.eu:8081/enhancer

    No format needed, just semantics. :-O
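    Here's a quick sketch of calling it from a script, assuming the demo server above is still reachable; the Accept header picks how the returned enhancement graph is serialized.

        // Post plain text to the Stanbol enhancer and read back the RDF it returns.
        async function enhance(text: string): Promise<string> {
          const response = await fetch("http://dev.iks-project.eu:8081/enhancer", {
            method: "POST",
            headers: {
              "Content-Type": "text/plain",
              "Accept": "application/rdf+xml",   // other RDF serializations work too
            },
            body: text,
          });
          if (!response.ok) {
            throw new Error(`Enhancer returned ${response.status}`);
          }
          return response.text();   // entities and topics found in the text, as RDF
        }

        enhance("African orchids make a striking spring campaign.").then(console.log);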
