Sunday, February 15, 2009

Provisioning, Intention and Intelligence

I've been working for weeks to find a simple explanation that distinguishes the more conventional approaches to datacenter management from those that incorporate "intention," particularly as it relates to provisioning. Doing this simply has been no small task. The effort grew out of a continuing conversation with Rich Pelavin, one of the co-founders at Replicate, about what he refers to as the "false promise of automation" in the datacenter. So, here are some musings. Not a lot of conclusions... more like observations.

What do I consider "conventional"? When I think about how most virtual system management solutions address provisioning, it's generally predicated on the end-user's ability to set out (that is, declare or describe at the beginning) the ideal configuration: how the individual elements are placed, how they are individually configured, and the sequence or order in which they are deployed. The premise seems to be that, should the configuration drift too far from the ideal, the "perfect" model configuration can be restored... the same one each time the resurrection is invoked.
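
To make that concrete, here's a minimal sketch of the pattern as I read it. Every name in it (IDEAL, deployed_state, apply) is a hypothetical stand-in for illustration, not any particular product's API:

```python
# A minimal sketch of the conventional pattern: declare the ideal up front,
# then restore it whenever the observed state drifts. All names are invented.

IDEAL = {
    "db-vm":  {"cpus": 8, "memory_gb": 32, "host": "esx-02"},
    "web-vm": {"cpus": 4, "memory_gb": 8,  "host": "esx-01"},
}
DEPLOY_ORDER = ["db-vm", "web-vm"]  # the sequence, declared once, up front

def deployed_state() -> dict:
    # Stand-in for whatever inventory/monitoring API reports actual state.
    return {
        "db-vm":  {"cpus": 8, "memory_gb": 32, "host": "esx-02"},
        "web-vm": {"cpus": 2, "memory_gb": 8,  "host": "esx-03"},  # drifted
    }

def apply(name: str, spec: dict) -> None:
    # Stand-in for the provisioning call that (re)creates an element.
    print(f"restoring {name} to {spec}")

def reconcile() -> None:
    actual = deployed_state()
    for name in DEPLOY_ORDER:
        if actual.get(name) != IDEAL[name]:
            # Drift detected: restore the same pre-defined ideal, every time,
            # with no question asked about whether it is still the ideal.
            apply(name, IDEAL[name])

reconcile()
```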

The fallacy of automation of which Rich P. speaks seems to take several forms:

Provisioning based on reinvoking a script or highly specific process assumes that the "ideal" state to which the datacenter (or a portion of it) returns is indeed the ideal.

Who's the "authority," and how was it determined that the model state to which the datacenter will return is, in fact, correct or ideal?
Is there anything in the automated provisioning system that can compare the objectives, intentions or policies of the end-user with those embodied in the script or the "ideal" configuration that results from its execution?
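
A hypothetical sketch of the check that seems to be missing: before re-applying a recipe, compare the end-user's stated objectives against the properties of the configuration the recipe would produce. Both structures and all the names below are invented for illustration:

```python
# Compare declared intent against what the recipe's "ideal" would deliver.

stated_objectives = {"max_latency_ms": 50, "min_redundancy": 2}

# Properties estimated (somehow) from the recipe's target configuration.
recipe_properties = {"expected_latency_ms": 80, "replica_count": 2}

def intent_gaps() -> list:
    gaps = []
    if recipe_properties["expected_latency_ms"] > stated_objectives["max_latency_ms"]:
        gaps.append("the recipe's 'ideal' no longer meets the latency objective")
    if recipe_properties["replica_count"] < stated_objectives["min_redundancy"]:
        gaps.append("the recipe's 'ideal' under-provisions redundancy")
    return gaps

# A non-empty list should escalate to a human rather than trigger a re-apply.
print(intent_gaps())
```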


Provisioning based on reinvoking a script or recipe doesn't take into account a change in the environment (to which the datacenter as a whole needs to respond).

On what basis is the script or recipe reviewed to determine whether some aspect of the external environment or the internal status of the datacenter has changed in a fundamental manner?
Is there a cogent statement of intention, incorporated in either the automated system or the human process of review and revision, that regularly -- perhaps every time the script is invoked -- revisits the intentions, assumptions and environment to determine whether the ideal configuration embodied by the recipe is still the ideal?
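
One hypothetical way to make that review routine: record the assumptions the recipe was written under alongside the recipe itself, and re-check them on every invocation. All names and thresholds here are invented:

```python
# Store a recipe's design assumptions next to it; refuse to re-apply the
# recipe when the current environment violates any of them.

recipe_assumptions = {
    "traffic within design range": lambda env: env["peak_rps"] <= 5000,
    "single site":                 lambda env: env["site_count"] == 1,
}

def stale_assumptions(env: dict) -> list:
    return [name for name, holds in recipe_assumptions.items() if not holds(env)]

def provision(env: dict, run_recipe) -> None:
    stale = stale_assumptions(env)
    if stale:
        # The environment has shifted out from under the recipe; its "ideal"
        # needs human review before being re-applied.
        raise RuntimeError(f"recipe assumptions no longer hold: {stale}")
    run_recipe()

provision({"peak_rps": 3200, "site_count": 1}, lambda: print("recipe applied"))
```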

Provisioning automation that relies on conditional logic assumes that the instrumentation available to it ... the information it has to work with in order to take one path or the other, or to algorithmically establish the level or scalar value of some aspect ... is complete.

That is, whatever the system doesn't see or can't know will have no bearing on the decision. The assumption here is that the customer/end-user has instrumented the system to the maximum degree possible, and has handed decision making over to a process that has sufficiently complete information.
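
Here's a sketch of that completeness assumption. The conditional logic below can branch only on the metrics it was handed; an un-instrumented bottleneck (say, storage latency) has no bearing on the decision at all. The names and thresholds are invented:

```python
# Conditional provisioning logic sees only what was instrumented.

visible_metrics = {
    "cpu_utilization": 0.92,
    "free_memory_gb": 14,
    # Storage latency and network saturation were never instrumented, so no
    # branch below can ever react to them, however decisive they may be.
}

def scale_decision(metrics: dict) -> str:
    if metrics["cpu_utilization"] > 0.85:
        return "add-vm"  # possibly wrong if the real bottleneck is storage
    if metrics["free_memory_gb"] < 4:
        return "add-memory"
    return "no-action"

print(scale_decision(visible_metrics))  # "add-vm"
```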


The more advanced provisioning automation systems, defined by their reliance on simple rule sets and rote procedures, seem to rely on a belief (hope?) that the orchestration offered will exhibit some kind of emergent behavior.

Even if the provisioning is accomplished by an orchestration system that incorporates more sophisticated logical constructs ... or simply adds conditional statements to the scripting ... the depth of definition in describing the ideal is still in question.

So, as I dig into the issue, these thoughts emerge.

First, the appropriate depth of target or objective definition needs to be established.  Specifically, the discourse has to be at the level of overall intent, and therefore at the level of datacenter strategy, rather than administrative operations or sub-system tactics.

Most automated provisioning systems operate at the level of tactics and the manipulation of datacenter elements. The more advanced incorporate conditional logic and instrumented sensing that provide an operations point of view, based on the administrative domain (i.e., the network configuration, the storage system configuration, the arrangement/placement of virtual machines for purposes of "load" balancing).
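
The difference in level of discourse might look something like this. Both records are invented illustrations, not any product's schema:

```python
# Two invented records contrasting the levels of discourse. The tactical form
# manipulates individual elements in a fixed order; the strategic form states
# intent and leaves the tactics to be derived, and re-derived, as conditions
# change.

tactical_recipe = [
    ("create_vm",   {"name": "web-03", "host": "esx-01", "cpus": 4}),
    ("attach_vlan", {"vm": "web-03", "vlan": 210}),
    ("add_to_pool", {"vm": "web-03", "pool": "web-lb"}),
]

strategic_intent = {
    "service": "web-tier",
    "objective": "p95 latency under 50 ms at up to 5000 req/s",
    "constraints": [
        "survive the loss of any single host",
        "stay within the rack power budget",
    ],
}
```
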
Second, insisting that the entire datacenter strategy actually CAN be automated, leaving its operations in the hands of the "emergent behavior" that comes from the repetition of simple processes, results in either a lot of misconfiguration or an overly constrained, highly repetitive and therefore rather inflexible operation.

The ability to repeat simple processes is not equivalent to "intelligence". It doesn't constitute the capability for abstract thought or clear intention. This argues that the notion of an assistant or advisor to the human datacenter administrators, and (even more to the point) an advisory system that "speaks the language" of all the necessary administrative domains (servers, applications, networks, storage, ...), is the best option open to us.
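
A hedged sketch of that advisor, with one analyzer per administrative domain. Everything here is hypothetical; the point is only the shape: per-domain analyzers contribute findings in their own domain's terms, and the advisor recommends rather than acts:

```python
# Per-domain analyzers feed an advisor that surfaces recommendations to the
# human administrator; nothing is applied automatically.

def server_advisor(state: dict) -> list:
    return ["host esx-02 memory is overcommitted; consider migrating db-vm"]

def network_advisor(state: dict) -> list:
    return ["vlan 210 nears saturation during the backup window"]

def storage_advisor(state: dict) -> list:
    return []  # nothing to report from the storage domain

def advise(state: dict) -> list:
    findings = []
    for analyzer in (server_advisor, network_advisor, storage_advisor):
        findings.extend(analyzer(state))
    return findings  # presented to the administrator, never auto-applied

for finding in advise({}):
    print("ADVICE:", finding)
```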


Third, intention and the strategy involved in operating on the basis of intention need to be systemic or (oh.. no!!) holistic.

The intent of the datacenter's configuration and the data on which the configuration decisions are to be made need to consider the dependencies of all the datacenter's systems, their interactions and the metrics on which the datacenter's performance is objectively assessed.
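
A minimal sketch of that systemic view: make the dependencies among the datacenter's systems explicit, so a proposed change is assessed against everything that transitively depends on it, and judged by datacenter-wide metrics rather than one domain's local view. The graph and names are invented:

```python
# An explicit dependency graph among datacenter systems; each system maps to
# the systems it depends on.

depends_on = {
    "web-tier": ["load-balancer", "app-tier"],
    "app-tier": ["db-tier", "network-core"],
    "db-tier":  ["storage-array", "network-core"],
}

def affected_by(system: str) -> set:
    """Everything that transitively depends on `system`."""
    out = set()
    for parent, deps in depends_on.items():
        if system in deps:
            out.add(parent)
            out |= affected_by(parent)
    return out

# A change to the network core touches the app, db and web tiers; its impact
# belongs to datacenter-level metrics (latency, throughput, availability),
# not to the network domain's local view alone.
print(affected_by("network-core"))  # {'app-tier', 'db-tier', 'web-tier'}
```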
