
Business Intelligence and the Data Center


A commenter on my Forbes.com JargonSpy column this week (How to Fix Apps Marketplaces) made a creative suggestion about monitoring application usage, one that points in several interesting directions I've been mulling over. Brent Frei said in a comment on the post:

...one specific suggestion would be to rank utility. iTunes knows how many times you’ve listened to a song, and via single-sign-on, Google knows how many times you’ve logged into the Marketplace’s apps post initial install. Useful apps will register well in the frequency and duration of access.

This is a huge idea, one that would vault Google Apps Marketplace way ahead of the alternatives in helping determine which apps are actually getting uptake. If the statistics showed average logins per user, a chart showing the distribution of users by the number of times they had logged in, or some other normalized view, then you could easily see whether an application like Cordys Process Factory was really finding favor or just getting downloaded a lot.
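To make the idea concrete, here is a minimal sketch in Python of how those normalized statistics might be computed. The login records and field layout are invented for illustration; nothing here reflects what the Marketplace actually exposes.

    from collections import Counter

    # Hypothetical single-sign-on login events as (user, app) pairs.
    # In practice these would come from the marketplace's SSO logs.
    logins = [
        ("alice", "process-factory"), ("alice", "process-factory"),
        ("bob", "process-factory"),
        ("alice", "invoicing"), ("carol", "invoicing"),
    ]

    def usage_stats(events, app):
        """Average logins per user, plus how many users logged in n times."""
        per_user = Counter(user for user, a in events if a == app)
        if not per_user:
            return 0.0, {}
        avg = sum(per_user.values()) / len(per_user)
        distribution = Counter(per_user.values())
        return avg, dict(distribution)

    avg, dist = usage_stats(logins, "process-factory")
    print(f"avg logins per user: {avg:.1f}; distribution: {dist}")
    # Prints: avg logins per user: 1.5; distribution: {2: 1, 1: 1}

The distribution is what separates an app with a few devoted users from one that many people tried once and abandoned, which is exactly the downloads-versus-uptake distinction Brent is after.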

This idea connects to two streams of thought I've been having lately. The first is about operational intelligence (see the problem statement Operational Intelligence for CITOs), which is the art of using streams of data, often from machines or other systems, to get a better real-time view of what is happening in a business. Brent's idea is an application of operational intelligence to the data center.

The stream of data about who logged in to which applications could provide a better view of which applications are really getting traction. I've been thinking about machine data quite a bit because I've been writing for Evolved Media about Splunk, a system for searching, distilling, and understanding machine data that is quite popular in IT departments.

Splunk takes the idea of grep and piping data from one command to another and generalizes it, completes it, and puts it on steroids. Using Splunk, you can search all the logs on all the web servers in a data center from one console using a search language that allows you to find specific types of records and also to create dashboards that summarize activity.
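Splunk's search language is its own, but the grep-and-pipe pattern it generalizes is easy to sketch. Here is a rough Python illustration, with invented hosts and log lines, of a search stage feeding a summarize stage the way a dashboard query might:

    import re
    from collections import Counter

    # Invented access-log lines from two web servers; in Splunk the
    # indexed machine data would stand in for this dictionary.
    logs = {
        "web-01": ['10.0.0.1 - - "GET /apps HTTP/1.1" 200',
                   '10.0.0.2 - - "GET /missing HTTP/1.1" 404'],
        "web-02": ['10.0.0.3 - - "GET /gone HTTP/1.1" 404'],
    }

    def search(pattern):
        """Stage one, the generalized grep: yield matching records."""
        rx = re.compile(pattern)
        for host, lines in logs.items():
            for line in lines:
                if rx.search(line):
                    yield host, line

    # Stage two, the pipe: summarize matches the way a dashboard might.
    counts = Counter(host for host, _ in search(r'" 404$'))
    for host, n in counts.most_common():
        print(f"{host}: {n} not-found errors")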

I've been doing research on how Splunk can be used not only to help data center professionals but also to find business insights in machine data (see the problem statement Finding Business Insights in Machine Data). What interests me is what you can learn from the different sources of machine data that would be of use in running the data center or the business.

The natural next step is to start telling the story of how a data set like a web log went from being a source of information for break/fix to a way to better understand consumer behavior. Brent's observation is that a log of single sign-on data could be used to understand application usage.
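A toy illustration of that shift, with invented records: the same data set can answer a break/fix question or a consumer-behavior question, depending on how you slice it.

    from collections import Counter

    # One invented web-log-style data set, read two different ways.
    records = [
        {"user": "alice", "path": "/products/42", "status": 200},
        {"user": "alice", "path": "/products/42", "status": 200},
        {"user": "bob",   "path": "/products/7",  "status": 500},
        {"user": "bob",   "path": "/products/42", "status": 200},
    ]

    # Break/fix view: which requests are failing?
    print("server errors:", sum(1 for r in records if r["status"] >= 500))

    # Consumer-behavior view: which pages draw the most interest?
    views = Counter(r["path"] for r in records if r["status"] == 200)
    print("top pages:", views.most_common(2))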

The challenge is to understand the data sets, not to process the data, which Splunk makes pretty easy once you know what you are looking for and why it matters. Another approach to monitoring application usage is to analyze clickstreams through applications using a technology like Knoa.

But operational intelligence is a supplement to business intelligence, not a replacement. In a post on The Coming Crisis in IT Management, I worry about whether IT managers can handle the increasing complexity of the data center as virtualization, cloud computing, and the adoption of SaaS applications make everything more dynamic and harder to manage.

The remedy for managing such complexity is a better model of what is happening in the data center, in other words, business intelligence.

Past generations of data center tools were built for monitoring the technical aspects of static operational environments. Modern tools like Service-Now.com or Quest Software's Foglight offer much richer models.

One company, nlyte Software, is betting its future on the premise that what data centers need is data center infrastructure management supercharged with business intelligence capabilities.

Applying business intelligence to the data center means two things:

    • Creating a unified model of physical, virtual, and public cloud-based assets that can be tracked and analyzed (sketched below).
    • Using that model to monitor the business activity flowing through the applications and systems so that decisions about data center activities can be made with the full business impact in mind.
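As a sketch of the first point, here is one way a unified asset model might look in Python. The classes and fields are my own invention for illustration, not nlyte's actual schema:

    from dataclasses import dataclass

    # A deliberately simple unified model: one record type spanning
    # physical, virtual, and public-cloud assets.
    @dataclass
    class Asset:
        name: str
        kind: str      # "physical", "virtual", or "cloud"
        location: str  # rack ID, hypervisor host, or cloud region
        app: str       # the business application the asset serves

    inventory = [
        Asset("db-rack-04", "physical", "rack-B7", "billing"),
        Asset("vm-web-12", "virtual", "hv-host-03", "storefront"),
        Asset("ec2-i-9f2", "cloud", "us-east-1", "storefront"),
    ]

    # With one model, a single question spans all three tiers.
    storefront = [a for a in inventory if a.app == "storefront"]
    print(f"storefront runs on {len(storefront)} assets across "
          f"{len({a.kind for a in storefront})} tiers")

Once every asset, wherever it lives, answers to the same model, the second point follows: business activity can be rolled up by application rather than by rack or hypervisor.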

Both nlyte's chairman and CEO have deep BI backgrounds. I think the next stop in this thread of thought is to find out what business intelligence in the data center means to them.