Gartner BI Summit Part 2
January 29, 2009
As promised in the Mini-Summary, which was written in some haste to appease those who weren't enjoying the 24-hour party city that isn't The Hague, here is a little (in fact, rather a lot) more on what went on at the Gartner BI Summit. In the Mini-Summary I covered the keynote in somewhat light detail; it was probably enough to give a flavour.
I'll outline the sessions that I attended so you know I wasn't in The Hague just for the craic. It'll serve as an aide-memoire for me too. It was great to meet up with some of the folks I'd met on Twitter, and others I met for the first time at the conference. On with the summit.
Data Integration Architecture and Technology: Optimizing Data Access, Alignment and Delivery – Ted Friedman – Gartner
This is an area of interest to me, as one of the products I look after is firmly in this space. A very good presentation containing plenty of survey-based facts, and a case study on Pfizer, who have a ‘data integration shared services’ organization. I suppose this is a DI version of the Business Intelligence Competency Centre.
ETL is still by far the largest area of DI, with replication, federation and EAI following. In addition, standardization of DI tools/architecture within organizations is still some way off.
The high-level message was that data integration is an absolute foundation of any data operations, whether BI or otherwise. Without good DI, you just end up with the old GIGO scenario. Not too much new for me, as expected, but Ted did put the kibosh on BI as a service, reflecting my own personal view that in most cases these data environments are 'too diverse' to lend themselves easily to the SaaS model, hamstrung as they are by the data integration piece of the puzzle. Narrow, specialized solutions can work, as can simple data environments. However, as was pressed home later in the conference, that's not the main reason BIaaS will not be as popular as many are projecting.
This session started with Timo mashing up some Obama data in Xcelsius and was generally designed to show that SAP Business Objects still has some innovation to show, even now that it is part of the Walldorf Borg. The main highlight (from their point of view) was Polestar. I took a very quick look at the site, but was diverted by the typos "quentity" and "dectect", as well as noting it was not tested on IE8, so I left it for another day. Looks interesting though.
SAP generously conceded that less than 50% of business data exists in SAP. I am assuming they mean within organizations running SAP. Even then, that’s probably an underestimation. To that end SAP are introducing federation capabilities.
The Role of BI in Challenging Economic Conditions – Panel Discussion
The panel consisted of some large customers from around Europe, giving their views on how the economic climate affected their BI activities. Key points here included reducing the number of tools and vendors in the BI stack and squeezing licence costs, whether by forcing prices down via negotiation, redeploying seldom-used licences, or other BI ROI audit activities. Naturally, I imagine some licences will become available as headcount shrinks this year.
The customers were focusing their BI efforts more on costs than on revenue and margins, which were previously the focus. In this uncertain environment the speed of decision making is critical, and some of the selection criteria for BI tools and initiatives have changed a lot. One of the customers noted that they used to talk about the look and feel, getting down to details such as fonts; now it's "how much, and how fast to implement?"
BI is going to be more tactical for the short term, with small-scope projects targeted at answering key questions quickly.
Emerging Technologies and BI: Impact and Adoption for Powerful Applications – Kurt Schlegel – Gartner
This session looked at the macro trends in BI, which were as follows:
- Interactive Visualization (didn’t DI-Diver do this back in the late 90’s?)
- In-Memory Analytics
- BI Integrated Search (they showed Cognos here, but strange there was no mention of MOSS 2007 / BDC which does this quite nicely)
- SaaS (showed a good example where the SaaS provider had a ton of industry information that could be leveraged for decision making, rather than just some generic solution shipping in-house data back and forth)
- SOA / Mashups
- Predictive Modelling
- Social Software
None of this was new to me, but there were some good case studies to illustrate them and the SaaS example was the most realistic I’d seen from a business benefits point of view.
Using Corporate Performance Management to understand the drivers of Profitability and Deliver Success – Nigel Rayner – Gartner
This was an area I wasn't too familiar with, but Nigel Rayner did an extremely good job of pitching the information and delivery so as not to overwhelm novices, while not oversimplifying and thus boring seasoned practitioners.
He kicked off with increasing CEO turnover, then how the market measures CEO performance. Most organizations don't have a handle on what actually drives profitability, which is where CPM can help with profitability modelling and optimization. The whale curve and Activity-Based Costing were also discussed.
A key point that was made is that BI is very often separate from financial systems and CPM links the two together.
Driving Business Performance in a Turbulent Economy with Microsoft BI – Kristina Kerr – Microsoft, Ran Segoli – Falck Healthcare
MS BI case study, focusing on the cost-effectiveness and speed to implement of Microsoft BI. I have had a lot of exposure to the stack and other case studies, so didn’t make notes. Sorry.
Does BI = Better Decision Making? – Gareth Herschel – Gartner
Really enjoyed this session, mainly because it was a welcome step back from BI per se to look at decision making in general. He covered the theory of decision-making first, then linked it to BI.
The first area was predicting (root cause) or managing events. If this can be done effectively, the increased speed of detection allows more time to make appropriate decisions; the more time you have, the more options you have available. This ties in to CEP (complex event processing) and BAM (business activity monitoring). In addition, data mining can assist in predicting events and scenarios.
This is a discipline that must be constantly reviewed: what happens when prediction and analysis disagree with reality? Either the situation has changed, or you didn't understand it correctly in the first place.
He went through four key approaches to decision making, rating each on explicable vs. instinctive and the experience required.
- Recognition Primed
- Thin-slicing (“Blink”)
This fed into information delivery methods: selective displays such as dashboards and alerts/traffic lights, or holistic displays such as visualization, which are more 'decision-primed' than data-centric displays such as tabular representations.
It was clear that he saw visualization and very narrow, selective displays as the best way to aid decision-making.
In my opinion, that's all fine and dandy if you're measuring and delivering the right targeted data 100% of the time; otherwise it is very easy to be blindsided.
Would certainly seek him out at other Gartner events for some thought-provoking content.
Gareth made some good book recommendations:
Various Dan Gilbert stuff on Emotional Decision Making – This is his TED talk.
A very good session, surprising at least one of the open source advocates in the audience with its upbeat message. A highlight was Donald Feinberg's prediction that Unix is dead and the funeral is in 30 years, in response to Unix ceding to Linux in the DBMS world. It appears Gartner have relaxed their usual criteria in order to give OSS a chance to be evaluated, based on support subscription revenue.
Feinberg also strongly recommended that anyone using Open Source must get a support subscription, to do otherwise being tantamount to lunacy.
On the BI side of OSS, market penetration is low, with less than 2% of Gartner-surveyed customers using it. However, it is a growth area, with small ISVs using it as an OEM strategy for their BI requirements.
The functionality gaps are getting smaller between commercial and OSS, with Reporting, Analysis, Dashboarding and Data Mining all now being offered, but still no Interactive Visualization, Predictive Modelling, Mobile play, Search or Performance Management.
On the DI side, other than the Talend/Bitterer argument, things aren't hotting up too quickly. DI usage is mostly limited to straight ETL of fairly meagre data volumes: daily batches of around 100K records.
Functionality gaps here are in the following areas: Metadata management, Federation/EII, Replication/Sync, Changed Data Capture, Unstructured Content, Native App Connectivity and Data Profiling/Quality.
An overarching issue to adoption in all areas is the lack of skills.
An interesting scenario that was floated was the creation of an open source stack vendor, namely Sun, snapping up some of the OSS BI players.
The Right Information to the Right Decision-Makers — Metadata Does It – Mark Beyer – Gartner
This was a useful presentation for me, as I am familiar with metadata, but not the systems and processes used to manage it. The Gartner definition is that metadata is data ABOUT data: it describes and improves data, unlocking the value of data.
I knew some classic metadata SNAFUs such as projects where measurements across country-separated teams were in metric or imperial, leading to untold costs.
Some others that Mark mentioned were very amusing, such as the number of distinct data members of Gender. I can't recall the exact figure, but one government organization had 21.
On to why metadata matters in decision making: it can be an indicator of data quality, it can indicate data latency, and it can provide a taxonomy for combining data from different areas.
In addition, metadata can help provide a business context of the data, in addition to mapping importance, user base and various other elements to give an idea of how critical data may be and the effects of improving that data or the impact of any changes in the generation or handling of the data.
Obviously SOX and Basel II also put increased pressure in managing metadata for the purposes of compliance, governance and lineage.
The takeaway for me was this set of key questions that metadata should seek to answer:
- What are the physical attributes of the data (type, categorization etc)?
- Where does it come from?
- Who uses it?
- Why do they use it?
- How often do they use it?
- What is its quality level?
- How much is it worth?
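To make the list concrete, here is a minimal sketch (my own illustration, not a Gartner model) of a metadata record that captures an answer to each of those questions:

```python
from dataclasses import dataclass, field

# One field per question in the list above; names and scales are invented.
@dataclass
class MetadataRecord:
    name: str                     # logical name of the data element
    physical_type: str            # what are its physical attributes?
    source_system: str            # where does it come from?
    consumers: list = field(default_factory=list)  # who uses it?
    purpose: str = ""             # why do they use it?
    access_frequency: str = ""    # how often do they use it?
    quality_score: float = 0.0    # what is its quality level (0-1)?
    business_value: str = ""      # how much is it worth?

revenue = MetadataRecord(
    name="monthly_revenue",
    physical_type="decimal(12,2)",
    source_system="ERP general ledger",
    consumers=["Finance", "Sales Ops"],
    purpose="board reporting and forecasting",
    access_frequency="daily",
    quality_score=0.92,
    business_value="high",
)
print(revenue.name, revenue.quality_score)
```

A real metadata repository would of course version these records and link them to lineage, but even a flat catalogue like this answers the questions above.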
Stupidly, I ran out of paper, so had to take some notes on the phone. I don’t like doing that as it looks like you’re texting people, or Twittering. So, I limited myself to the bare minimum.
PerformancePoint is weak with respect to the competition. I guess it's even weaker now that they've ditched Planning.
Donald Feinberg is not a fan of SaaS BI, a view I agree with, partly due to the data integration issues in the real world, as highlighted by Ted Friedman earlier in the week. So Donald decided to do a straw poll on who would be interested in, or implementing, SaaS BI. I think there might have been one person, but possibly zero. There goes a bunch of predictions for 2009. The reason for this reticence was one of trust: customers just don't want to throw all this data over the firewall.
Another straw poll concerned consolidation to a single vendor: most are doing this, and very few said they were going to buy from a pure-play vendor. I suppose you have to take into account the self-selecting group of end users at a Gartner BI summit, though.
BI Professional – Caught in a Cube? – Robert Vos – Haselhoff
Entertaining presentation, but I was suffering with a bad cold and insufficient coffee, so didn’t get the full benefit. He did help me wake up fully for the next presentation, so can’t have been all bad. No talk of vendors and technology per se here, more stepping back and looking at strategy, organizational elements and making BI work from a people perspective.
This was an interactive session. Like a mock exam for BI folks where a bunch of people were randomly put in groups and asked to design a BI strategy. The results were pretty good and Andy Bitterer’s wish that they didn’t start naming vendors was fulfilled. However, I did note an issue with people really thinking details first, rather than strategy first. I also found it slightly strange that the CEO did not tend to come up as a contender for involvement. I saw more of this in Nigel Rayner’s CPM presentation, with CPM giving the CEO insight into profitability, so it seems to me to make absolute sense to have CEO involvement in the BI strategy, since the BI goals need to be aligned with the business goals. Some others did pick up on the alignment, but still saw it as in the CIO remit. All in all a pretty good showing, but the IT and ‘the business’ lines were still visible, if somewhat more hazy than before.
I took a LOT of notes in this session, so I’ll try and boil it down. Typical situation is a bunch of folks in the boardroom, all claiming different numbers. This leads to risky decision-making if unnoticed and a huge time sink reconciling when it is noticed.
Once again, there is a turf aspect involved, with data being considered IT's problem, so they should be responsible for data quality. However, IT is not the customer for the data, so they don't really feel the pain that the business feels from data quality issues. In addition, IT don't know the business rules or have the domain expertise. It's not a pure technology problem, but IT need to be involved to make it work.
There were some examples of the costs of bad data quality, leading to working out the ROI for investment. With Sarbox et al, of course, there is a new cost involved for the CEO/CFO: that of going to jail if the numbers are wrong.
Another aspect of the ROI was based on the level of data quality: it may be that 80% is enough, especially when the move to 100% is astronomically expensive. The return on incremental improvements needs to be assessed.
So, who's on the hook for DQ then? Data stewards, who are seen as people that take care of the data rather than owning it (the organisation owns it). They should know the content and be part of the business function, rather than IT.
An example of exposing data quality within an organisation was a DQ 'scorecard'. This gives an idea of the quality in terms of completeness, duplication, audited accuracy and so on. A problem that I see with this is a kind of data quality hubris versus data quality cynicism: if it works well, the scorecard can give the right level of confidence to the decision makers; if not, it could lead to overconfidence and less auditing.
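As an illustration of the scorecard idea, here is a toy completeness/duplication calculation of the kind such a scorecard might surface. The record fields and the natural key are my own invention, not from the session:

```python
# Toy DQ scorecard: score a batch of records on completeness and duplication.
def dq_scorecard(records, required_fields):
    total = len(records)
    # completeness: share of records with every required field populated
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    # crude duplication check on a natural key (name + email)
    keys = [(r.get("name"), r.get("email")) for r in records]
    duplicates = total - len(set(keys))
    return {"completeness": complete / total, "duplication": duplicates / total}

records = [
    {"name": "Ann", "email": "ann@example.com", "phone": "555-0100"},
    {"name": "Bob", "email": "bob@example.com", "phone": ""},
    {"name": "Ann", "email": "ann@example.com", "phone": "555-0100"},
]
scorecard = dq_scorecard(records, ["name", "email", "phone"])
print(scorecard)  # Bob's missing phone hurts completeness; Ann appears twice
```

Audited accuracy would need a reference dataset to compare against, which is exactly why the auditing behind the scorecard matters as much as the numbers on it.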
So, operationally the key elements are:
- Data Profiling / Monitoring – e.g. how many records are complete.
- Cleansing – de-duping & grouping
- Standardization – rationalizing SSN formats, phone nos etc
- Identification & Matching (not 100% sure here, I see some of this in cleansing)
- Enrichment – bringing in external data sources, e.g. D&B to add more value to the data.
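A minimal sketch of how a couple of these steps chain together, on invented records; the standardization rule (digits-only phone numbers) is just an example of rationalizing formats:

```python
import re

def standardize_phone(raw):
    """Standardization: rationalize phone numbers to digits only."""
    return re.sub(r"\D", "", raw or "")

def dedupe(records):
    """Cleansing: group records sharing a standardized name+phone key,
    keeping the first survivor per group."""
    seen = {}
    for r in records:
        key = (r["name"].strip().lower(), standardize_phone(r["phone"]))
        seen.setdefault(key, r)
    return list(seen.values())

records = [
    {"name": "Ann Smith", "phone": "(555) 010-0001"},
    {"name": "ann smith", "phone": "555-010-0001"},
    {"name": "Bob Jones", "phone": "555.010.0002"},
]
clean = dedupe(records)
print(len(clean))  # the two Ann Smith variants collapse into one record
```

Note the ordering: standardization has to run before matching, otherwise the two Ann Smith variants would never hash to the same key, which is presumably why identification & matching overlaps with cleansing in the list above.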
Ideally DQ should be delivered as services, which are then reusable and repeatable, used across many different data sources. An SOA model, although SOA is supposed to be dead, isn't it? Who knows: maybe the term has died, but the technology and approach certainly live on.
Lastly DQ ‘Firewalls’ were discussed. This is a set of controls used to stop people/systems from poisoning the well. Inbound data is analyzed and given the elbow if it isn’t up to snuff. It even incorporates a ‘grass’ element, where DQ criminals are identified and notified.
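A rough sketch of the firewall idea, with hypothetical validation rules of my own: inbound records are checked, rejects are bounced, and the offending source systems are tallied so repeat offenders can be notified (the 'grass' element):

```python
from collections import Counter

def dq_firewall(batch, validators):
    """Gate inbound records: accept only those passing every validator,
    and tally rejects per source system."""
    accepted, rejected = [], []
    offenders = Counter()
    for record in batch:
        failures = [name for name, check in validators.items()
                    if not check(record)]
        if failures:
            rejected.append((record, failures))
            offenders[record.get("source", "unknown")] += 1
        else:
            accepted.append(record)
    return accepted, rejected, offenders

# Hypothetical rules: every record needs an id and a non-negative amount.
validators = {
    "has_id": lambda r: bool(r.get("id")),
    "valid_amount": lambda r: isinstance(r.get("amount"), (int, float))
                              and r["amount"] >= 0,
}

batch = [
    {"id": 1, "amount": 10.0, "source": "CRM"},
    {"id": None, "amount": 5.0, "source": "legacy_feed"},
    {"id": 3, "amount": -2, "source": "legacy_feed"},
]
accepted, rejected, offenders = dq_firewall(batch, validators)
print(len(accepted), offenders.most_common(1))
```

Keeping the per-source tally is what turns a passive filter into the 'grass': the feed that keeps poisoning the well gets named.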
The conference was starting to take its toll by this point: a flu-like cold and no more tablets left. Add a few pretty late nights, notably with folks from Sybase, Kognitio, the BeyeNetwork, end-users and even Gartner (not analysts, I hasten to add), and the writing is on the wall. Deciphering my handwriting is like translating hieroglyphics written by a 3-year-old.
So, the summary of this session is ultra-short.
- BI MQ: SAP/BO moved down a little, counterintuitive to some.
- DI MQ: Data services / SOA capability is key. Tools need to supply and potentially consume metadata to play well in a real-world environment. Currently 54 vendors are 'eligible' for this MQ.
- DQ MQ: The pace of convergence between DI and DQ is increasing, and it will become critical. Acquisitions will increase as DI vendors have to fill out their feature sets.
Overcoming The BIg Discrepancy: How You Can Do Better With BI – Nigel Rayner
I made a herculean effort to stay conscious in this session, mainly because I had enjoyed Nigel's CPM session and he proved also to be a very nice chap when we chatted over a cup of coffee earlier in the week. In addition, I had paid for the 3rd day, so was going to extract every drop of value 😉
Nigel kicked off with “the downturn”, of course. The message was do not hit the panic button. BI and PM will play a key role in navigating the downturn:
- Restoring business trust
- Understanding why, what and where to cut
- Identifying opportunities of business growth, or which parts of the business to protect
There was some reality too: it is unlikely that "Big BI" projects will be approved in the short term, and you will need to do more with what you already have.
The plan of attack is the 3 I’s – Initiatives, Investments and Individual Actions
- BI/PM Competency Centre
- BI and PM Platform Standards
- Enterprisewide Metrics Framework
- Inject Analytics into Business Processes
Prioritization of investments is critical. Targeted short-term, cost-effective investments are the order of the day. Some suggestions include:
- Data Quality
- Data Mining
- Interactive Dashboards
- CPM Suites
There was a mention of ‘Spreadsheet hell’ being addressed by CPM.
- Take advantage of key skills as companies undertake knee-jerk cost-cutting, AKA get good laid-off people on the cheap.
- Redeploy key employees to tactical, short term roles rather than RIF-ing them.
- Respect “conspicuous frugality” but don’t be defined by it.
- Learn from others (i.e BI award winners, case studies, social networks)
- Evangelize BI
Then, it was a mad rush for the taxi to the station.
For more detailed coverage of the event, check out Timo Elliott's blog post.