Gravity and the Interest Graph

An interesting look at the use of big data, particularly semi-structured data like the Twitter firehose, to tease out of an utterance both its nature and the interests it reveals about its author.

The personalized web is just an interest graph away? (Cloud Computing News):

Much as social graphs are maps of our social media connections that follow us across the web, interest graphs are maps of our interests. Some companies want them to follow us across the web, too, meaning that wherever we go, there we are. There'll be no more need to search through news sites for the stories we want, or shopping sites for the products we want, because the site will know as soon as we hit its system who we are and what we like.

Take a look at the Gravity Labs site for a very interesting guided tour of how they've used their ontologies to extract and analyze 'interest'.


In Situ and in Sync

A small group of irregulars has been meeting on an ad hoc basis over the past few months to consider mobile use of, and influence on, cloud-based (or, better said, network-resident) services. From this group, Duncan Davidson of Bullpen Capital and Peter Christy of Internet Research Group had a short conversation kicked off by a recent series of posts by Fred Wilson, venture capitalist and blogger, about the nature of content in use by mobile and less peripatetic devices.  The post in question, In Situ Content, prompted Peter to question "…whether remote access to data and tools in the Cloud will ever be good enough."  He went on to support the idea of thick (or thicker) clients, and pointed to the recent Microsoft Build conference, at which they showed off Windows 8 with Azure integration.

My thoughts in response.

I agree that for much of what people want to do on a laptop/desktop, or on a tablet or smartphone for that matter, something 'thicker' than a dumb web browser is required as the client.

As browsers acquire more heft, with both proprietary and HTML5-based functionality, the browser becomes (at the very least) the foundation of any 'local client' technology.  Without this, 'cloud applications' and the cloud storage of my data, my documents and collections will hit a wall.

In thinking about cloud storage of my personal possessions, I'd rather not think of things as 'documents'.  Rather, if one considers the active principle to be one of assembling and then 'rendering' a herd of data components (core data, metadata, …), we should use the term assemblage to replace the concept of document.  The assembly doesn't require physical proximity of the data components, just good and smoothly working linkage amongst them.
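A minimal sketch of that idea, with invented names: an 'assemblage' is just a set of links to components, each resolved through its own fetcher only at render time, so the components never need to live in the same place.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical sketch of the 'assemblage' notion: a document is not one
# physical file but a set of linked components (core data, metadata, ...)
# that are fetched and combined only when rendered.

@dataclass
class Component:
    uri: str                      # where the piece lives (may be remote)
    kind: str                     # "core", "metadata", "attachment", ...
    fetch: Callable = None        # resolver; stands in for a storage/network call

@dataclass
class Assemblage:
    components: List[Component] = field(default_factory=list)

    def render(self) -> dict:
        # Assembly needs only working linkage, not physical proximity:
        # each component is resolved through its own fetcher at render time.
        return {c.kind: c.fetch(c.uri) for c in self.components}

# Toy resolvers standing in for cloud storage lookups
store = {"doc://body": "Hello, world", "doc://meta": {"author": "me"}}
fetch = store.get

doc = Assemblage([Component("doc://body", "core", fetch),
                  Component("doc://meta", "metadata", fetch)])
print(doc.render())   # components assembled only at render time
```

The point of the shape is that swapping a component's fetcher (local cache vs. cloud call) changes nothing about the assemblage itself, which is exactly the 'good linkage' property.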

I agree that, to the degree possible, the world will move toward the 'master data assemblage' (or what Duncan referred to as the ur-document) that resides in a cloud, and is sync'd for those situations where there is intermittent communication … which pretty much describes my iPhone and its use of AT&T Mobile's data plan.

The point is that it all depends on accomplishing 'synchronization' well and correctly.  'Correctly', by the way, needs to cover issues of usability and user experience, safety (privacy and security), and economy (optimal utilization of compute, network and storage).  There is no single definition of 'correct' in these scenarios … the optimal recipe is going to depend completely on the context, and on the ability to adapt as that context changes for the mobile user.
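To make one narrow slice of 'correctly' concrete: here is a hedged sketch where 'correct' is defined as deterministic last-writer-wins per key. A real sync layer would also weigh privacy, bandwidth and device context, and might pick a different merge rule per situation.

```python
# Sketch: merge two replicas of a key-value store where each entry is
# (timestamp, value). "Correct" here means the newer write wins per key,
# deterministically, regardless of which replica merges first.

def merge(local: dict, remote: dict) -> dict:
    """Merge {key: (timestamp, value)} replicas, last writer wins."""
    merged = dict(local)
    for key, (ts, value) in remote.items():
        if key not in merged or ts > merged[key][0]:
            merged[key] = (ts, value)   # newer write wins
    return merged

phone = {"note": (100, "draft on the phone")}
cloud = {"note": (200, "edited in the cloud"), "todo": (150, "ship it")}
print(merge(phone, cloud)["note"])  # (200, 'edited in the cloud')
```

Even this toy shows why there's no single 'correct': last-writer-wins silently drops the phone's draft, which may be fine for a to-do list and disastrous for a legal document.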



Time to start … again.

I go through the inevitable phases of blogging:  I have something to say …  I start writing when I don't have anything new to say … I get self-conscious … I then have nothing I'm willing to write … and months go by.  

In point of fact, I've had a lot to say, and haven't done a very good job of putting it in place as something worth reading.  I've also been told recently by Laura Schewel, with whom I'm working at creating an awesome data company, that I think too much when I write.  I'll do what I can to rectify that.

In trying to simplify the organization of my thoughts, notes, research, and general collections of mental detritus, I've gone to a much more limited set of tools, most of them simpler.  Simple text editors instead of extensive word processing.  Snippet handlers to automate dates and tagging.  Collections of work notes, call notes, personal journal entries, and interesting research, all in one kick-ass database that's replicated to my other machines (and to the cloud) with no overt action on my part.   We'll see if that same kind of streamlining can be used for the 'public' journaling I do (or... want to do ...) here.


The 451 Group considers enterprise security - the legacy of 2010 for 2011

Just prior to leaving on vacation for the holidays, I spent an evening reading analyst retrospective pieces on 2010 and predictions for 2011.  For the most part, they were pretty fluffy.  A few, however, stood out. One of the pieces to which I kept returning was the 451 Group's 2011 preview - Enterprise security by Josh Corman, Steve Coplan, Andrew Hay, Wendy Nather and Chris Hazelton. (Sorry, folks... it's behind a paywall.)  I will say that it took several careful readings.  In part, multiple readings were required because the piece is dense and covers a lot of territory.  I also found my attention being drawn to several related issues that have been on my mind for the better part of the year.  With apologies to my friends at the 451 Group, here are the four aspects that I retained, and the associations they generated as I read (and re-read) the piece.


There's no single security market.  

Throughout the piece, the security pros at 451 have gone out of their way to make clear that there is no single 'security market.'  There are markets, market segments that operate as 'markets', and communities of interest to which vendors respond with technologies and services.

In continuation – and as the logical consequence – of what we predicted in last year's preview, a pronounced schism has formed in the information security market between those that fear the auditor more than the attacker and a minority that attempt to solve for both.

Security strategies are the basis for drawing distinctions between the markets. The most interesting aspect of the analysis is the way those distinctions are drawn.  Most importantly for the industry and the investment community, the authors start with a distinction between those consumers of security offerings who are interested in meeting the criteria for compliance (i.e. passing the audit, getting the credential and avoiding the penalty) and those consumers driven primarily by the need for better infosec visibility, better security coverage, avoidance of 'security events' and improved response to security problems.  By drawing the distinction between compliance-centric and improvement-centric segments of the market, the 451 Group makes clear that it's not a distinction between rich and poor, nor large and small.

Using security services to reduce the scope of self-management. Although one can detect some opprobrium (look it up) leveled at the market that holds compliance as its guiding principle, the 451 Group makes clear that, by focusing on measurable results and improved ROI, this market has come to rely on reducing the scope (and thus the exposure) of its security environments.  That has meant finding someone 'outside' (and paying them) to take on the responsibility.  In this age of "cloud" and "everything-as-a-service", the delegation of responsibility to an outsourced service becomes more than a tactic, and increasingly a strategy.

Using security services as an enrichment strategy. For the market that holds that improvement is paramount, services also play an increasing role.  Specifically, the authors point to this market's increasing reliance on, and the increasing value of, security 'enrichment' from third-party sources and open source / community feeds.


The vendors respond, with mixed results, to a multi-market environment.

It's the view of the authors that providers of data loss prevention (DLP) offerings sought to create a viable middle market by taking extensive, complex systems and 'simplifying' them. The idea seems to have been to package, for organizations with fewer resources and less expertise, the products available to larger organizations whose budgets supported hiring in that expertise.

At the same time the world was learning of the state-sponsored espionage and sensitive government and private-sector documents making their way into Wikileaks, the data loss prevention (DLP) vendors were 'simplifying' their offerings. We've remarked that this may be giving the market what it asked for, but not what it needed. Information protection is hard, although the answer isn't to oversimplify it.

The expanded market didn't materialize, while the high-end market remained stagnant because it embraced the compliance-centric strategy and can't seem to get beyond concern about its handling of personally identifiable information (PII).  So, to pick up the first theme, the DLP vendors got smoked by a middle market that was reducing exposure, and went flat in a high end where compliance is king.

The short story (according to the 451) seems to be: it sux to be a DLP product provider.  Does that mean, however, that life will be better for those vendors that find themselves in the services business, taking responsibility off of an individual company's staff and allowing it to reduce direct exposure?


The markets for mobile endpoint security get spotlighted.

When considering security aspects of mobile access and mobile devices, the 451 Group makes clear once again that there are multiple markets.  One of the most interesting sections of the piece discusses the approaches offered by the industry to address mobile endpoint security. The authors contend that the security vendors, by mistaking multiple markets for a single market, have conflated protection against malware with the proper service management of configuration, activity and applications.  Yes, these are all security issues, and, yes, they all require special attention when an enterprise uses mobile endpoints.  But that doesn't mean they're all the same market, nor should they be considered addressable with a Swiss Army Knife collection of cures.

Sandboxing: Virtualization is not just for OS and apps. Among the most promising, but potentially mismanaged, approaches to mobile endpoint security is the use of virtualization on mobile devices to segment and protect the management of execution (operating systems and applications).  What the 451 guys make quite clear is that sandboxing as a strategy can and should also be applied to segment and protect data when it ventures out into the mobile world.  As I read this section, it brought me to exclaim: 'What this world needs is workable data provenance!!'  (This is a longer story, and one on which I can hold forth interminably.  Let's just leave it at this: I see it becoming the norm to operate IT environments on the basis of metadata, and to rely on master data management to reduce redundancy, improve accuracy and establish 'immediate' consistency. The same mechanisms should be available to IT for the purpose of retaining a tamper-resistant history of a workload or data record, thus providing an authoritative "life story" and chain of stewardship or custody.)
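To show what a tamper-resistant "life story" might look like mechanically, here is a hedged sketch (all names invented): each custody event carries a hash of its own contents plus the hash of the previous event, so any later edit to the history breaks the chain and is detectable.

```python
import hashlib
import json

# Sketch of hash-chained data provenance: a record's custody history is a
# list of events, each linked to its predecessor by a cryptographic hash.

def add_event(history: list, actor: str, action: str) -> list:
    prev_hash = history[-1]["hash"] if history else "genesis"
    event = {"actor": actor, "action": action, "prev": prev_hash}
    payload = json.dumps(event, sort_keys=True).encode()
    event["hash"] = hashlib.sha256(payload).hexdigest()
    history.append(event)
    return history

def verify(history: list) -> bool:
    """Recompute every link; any edit anywhere invalidates the chain."""
    prev = "genesis"
    for event in history:
        body = {k: v for k, v in event.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if event["prev"] != prev or event["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        prev = event["hash"]
    return True

log = []
add_event(log, "alice", "created record")
add_event(log, "backup-service", "replicated to cloud")
print(verify(log))            # True: chain intact
log[0]["actor"] = "mallory"   # tamper with the history...
print(verify(log))            # False: tampering detected
```

This is only the integrity half of provenance; a real chain of custody would also need to bind the events to authenticated identities.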


Data security and identity solutions -- the coming reliance on entitlement by policy.

While there are at least another four or five big ideas covered in the article, the last one I'd like to call out relates to the changing role of identity technologies in the determination of who's entitled to use which data and for what purpose.  The authors do a great job spotlighting the data enablement value proposition:

We also expect greater integration with identity solutions, with policy determining who can access which data (ideally within which contexts). For appropriate use cases, we have seen large enterprises re-entertain information/enterprise rights management. At the end of the day, organizations create value out of sharing information, so solutions need to first support the needs of the business and, second, assure that vital collaboration can be done within acceptable bands of risk.

This becomes imperative when, through the use of one or more 'outsourced' providers of infrastructure or management services, it becomes necessary to control which attributes of data an application can access.   The 451 Group calls out a scenario in which vendors in the data security arena become more active in integrating authentication and single sign-on technologies, in order to provide a more complete service offering to which corporate IT can delegate responsibility for data security.
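A minimal sketch of 'policy determining who can access which data, within which contexts', with an invented policy shape: the decision about which attributes of a record a caller may see lives in declarative rules, not in the application holding the data.

```python
# Hypothetical attribute-level entitlement policy: each rule says which
# record attributes a role may see, and in which context. The rule format
# is invented for illustration.

POLICY = [
    {"role": "analyst", "attrs": {"region", "spend"}, "context": "office"},
    {"role": "support", "attrs": {"region"},          "context": "any"},
]

def visible_attrs(record: dict, role: str, context: str) -> dict:
    """Return only the attributes the policy entitles this caller to see."""
    allowed = set()
    for rule in POLICY:
        if rule["role"] == role and rule["context"] in (context, "any"):
            allowed |= rule["attrs"]
    return {k: v for k, v in record.items() if k in allowed}

customer = {"region": "EMEA", "spend": 42_000, "ssn": "redacted-at-source"}
print(visible_attrs(customer, "analyst", "office"))  # region and spend only
print(visible_attrs(customer, "support", "mobile"))  # region only
```

Because the rules are data rather than code, an outsourced provider could enforce the same policy on IT's behalf, which is the delegation scenario the 451 Group describes.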


I have not done justice to the range of issues covered in the article.  I know for a fact that the authors agonized over how to cover the important issues in a relatively short survey piece.  And there are at least four or five additional themes covered, which I've left out of my coverage. They're worthy, but didn't move me to thought the way these four did.  If you have access to the article, I recommend you read it. As I went through it again this evening while writing this post, I came across another couple of gems, but they'll have to wait for another day.
