Keys to Business Intelligence by Jai Krishna Ponnappan


                 

               Historically, business intelligence has promised a lot of “Yes,” but the reality has been filled with “Nos.” The promises are enormously compelling. Companies collect vast amounts of information about markets, customers, operations, and financial performance. Harnessing this information to drive better business results can have tremendous impact. Some corporations have achieved impressive gains after investing millions of dollars and multiple years of effort into building traditional analytical systems.



                   However, these success stories are frustratingly few and far between. Traditional BI, long the only option, can be prohibitively costly and complex. For companies without millions of dollars to invest, the options have been few and unattractive. Further, even when these investments of time and resources can be made, they don’t guarantee success. For too many companies, BI doesn’t deliver on its promises – it is too costly, too complicated, too difficult to scale and extend. The end result is a reality in which only a small minority of employees have access to BI. According to Gartner, only 20% of employees use BI today. This falls far short of BI’s potential to transform a company.

It’s time for BI that says Yes. Yes to the requirements of your budget, business, and business users. Yes to fewer compromises. This whitepaper first looks at the fundamental requirements that a BI solution should deliver to your company. Next, it covers the 11 Key Questions that you should ask of a future BI technology partner. When the BI provider can answer Yes to all of these questions, you have BI that is capable of fulfilling your analytical and reporting needs both today and over time – it is flexible, powerful, and efficient. It is BI that says Yes.


Business Insight—Four Foundational Requirements

Any organization investing in business intelligence needs to define the capabilities
that will help them to win against the strongest competitors in their market.
Here are the four bedrock requirements that should define the core capabilities
of your solution:



Historical analysis and reporting.

Fundamentally, BI should give you insight into both business performance and
the drivers of that performance. An understanding of business influencers and
results is the foundation for successful, proactive decision making. Technically,
this capability requires the mapping and analysis of data over multiple years.
This can also often mean the modeling and manipulation of hundreds of millions
of database rows.


Forecasting and future projection.

While understanding historical data is a first step, it is also vital to project those
findings into the future. For example, once you know how different types of
sales deals have progressed in the past, you can examine current opportunities
from that perspective and make future forecasts. The ability to forecast, and
to align your business resources accordingly, is key to success.

Ability to integrate information from multiple business functions.

Strategic insight often requires data from multiple systems. For example, operational
results require a financial perspective to show the full picture. Sales management
benefits from a comprehensive view of the demand funnel. Targeted, customized
marketing efforts require analysis compiled from customer, marketing, and
purchasing data. Your solution needs to be able to easily integrate information
from multiple sources in order to get answers to broad business questions.

Easily explored reporting and analysis.

Decision makers need to understand overarching business views and trends. 

They also need to examine increasing levels of detail to understand what actions can
be taken to achieve further success. It’s not enough to simply have a report; if
that report is not explorable, it might raise critical issues but not satisfy the need
to know more detail in order to make a decision. A full range of drill-down and
drill-across capabilities makes it possible for decision makers to fully understand
an issue at hand and make critical decisions.
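To make the drill-down idea concrete, here is a minimal sketch (the data, dimensions, and function are invented for illustration) of how a summary view can be expanded into detail on demand:

```python
from collections import defaultdict

# Hypothetical sales records: (region, product, amount)
sales = [
    ("West", "Widget", 120), ("West", "Gadget", 80),
    ("East", "Widget", 200), ("East", "Gadget", 50),
]

def summarize(rows, key_index):
    """Aggregate amounts by the chosen dimension (region or product)."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[key_index]] += row[2]
    return dict(totals)

# Top-level view: revenue by region
by_region = summarize(sales, 0)            # {'West': 200, 'East': 250}

# Drill-down: within one region, break revenue out by product
west_detail = summarize([r for r in sales if r[0] == "West"], 1)
```

The same aggregation runs at each level; only the filter and the grouping dimension change, which is what lets a user keep clicking deeper into a report.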

These four capabilities form the foundation of a powerful business intelligence
solution that can answer the critical questions facing your business. If a solution
cannot meet one of these requirements, your solution will not have the full
range of analytical capability that you will need to be competitive.



If the solution that you are considering meets the Four Foundational Requirements,
it is time to delve more deeply. The following eleven questions will help you to
assess your options and ensure that you are getting a robust, powerful solution
that meets your business requirements.

Can I get a comprehensive view of my business?


Even seemingly basic questions can require data from a variety
of operational systems, third-party data sources, and spreadsheets.



Even basic business questions such as “Which marketing campaigns generated
the most revenue this year?” or “Did the product redesign have the desired
effect on part inventory levels?” could require data from different operational
systems, third-party or partner sources, databases, and individual spreadsheets. As
a result, a core BI requirement is the ability to access, acquire, and integrate data
from multiple sources.
Traditional BI solutions provide this capability, but it can be arduous to implement
and maintain. Traditional BI accesses multiple data sources with complex and
expensive ETL systems that bring data together into one physical database.
Unfortunately, this database is totally disconnected from the world of the
business user. This requires another round of programming to connect the
physical data with the business user model.
A more modern solution enables you to:
• Experience a powerful, usable solution. Traditional solutions build from
the bottom up. A more modern approach starts instead from the top - the
logical business model. It then works downwards to manage the physical
data that is required to deliver these business views. This “top down”
approach manages the complexity that results from integrating multiple
data sources – so that the solution is both powerful and easy to use.
• Analyze information from all types of data assets. Data are provided to the
business in a variety of ways. Your BI solution needs to extract information
from corporate systems, stand-alone databases, flat files, XML files, and
even spreadsheets.
• Access remote or secured databases. Traditional BI uses an ETL process to
extract data out of a source database and place it into a data warehouse,
while some SaaS providers can only access data that is uploaded to their
servers. The more sophisticated SaaS BI providers can both upload data
and access local databases; that is, they allow you to access and analyze data
without actually uploading it. This is accomplished via real-time queries
against the database.
• Manage the required metadata. In addition to the management of data
sources, multi-source BI requires the management of all accompanying
metadata, the information about the data itself.
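The ideas above – integrating multiple sources and querying data in place rather than uploading it – can be sketched in a few lines. All table names and figures here are invented for the example; an in-memory SQLite database stands in for a local operational system:

```python
import sqlite3

# Instead of copying data into a separate warehouse, the BI layer issues
# live SQL against the source at query time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (campaign TEXT, revenue REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("Spring Promo", 1200.0), ("Webinar", 800.0),
                  ("Spring Promo", 300.0)])

# Data from a second source (e.g., a spreadsheet export) joined at query time
conn.execute("CREATE TABLE budgets (campaign TEXT, budget REAL)")
conn.executemany("INSERT INTO budgets VALUES (?, ?)",
                 [("Spring Promo", 1000.0), ("Webinar", 500.0)])

# One real-time query answers the business question across both sources
rows = conn.execute("""
    SELECT o.campaign, SUM(o.revenue) AS revenue, b.budget
    FROM orders o JOIN budgets b ON o.campaign = b.campaign
    GROUP BY o.campaign ORDER BY revenue DESC
""").fetchall()
# rows -> [('Spring Promo', 1500.0, 1000.0), ('Webinar', 800.0, 500.0)]
```

The point is not the SQL itself but the pattern: the data stays where it lives, and the integration happens in the query, with the metadata layer mapping business terms to physical tables.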



Does it provide full features at an affordable price?


Modern BI solutions make it possible even for smaller
organizations to afford a comprehensive BI solution.



Traditional BI solutions were often affordable only to the largest companies,
which had the large budget, IT staff, and resources required for initial deployment
and ongoing maintenance. Departments of enterprises and SMBs were
effectively priced out of the market.
Recently, the attractiveness of the midmarket has resulted in new “midsize”
solutions from traditional players and from new vendors. The catch, however, is
that the lower price often only purchases a “crippled” or partial solution.

So how can a smaller organization get a true BI solution? A fully deployed BI
solution must include the following: ETL and scheduling, database, metadata
management, banded/pixel-perfect reporting, dashboards, e-mail alerts, and
OLAP slice-and-dice functionality.
Look for a solution that:

• Delivers a full BI solution, not parts of one. The license should include
everything that you need for a true solution: ETL, database management,
metadata management, OLAP slice-and-dice query generation, banded
reporting, ad hoc reporting, and visual dashboards. A solution that has all
of the necessary components, already integrated for you, will deliver the
greatest value to your organization, fastest.
• Has easy-to-understand, affordable pricing. Traditional solutions have
many cost components – hardware, software, consultants, in-house IT,
support, and ongoing maintenance. As a result, the pricing is both high
and difficult to track fully. Modern solutions, such as those delivered as
software-as-a-service (SaaS), have more transparent and affordable pricing.
SaaS pricing is more comprehensive – the cost of hardware, software, and
support is in one monthly number. SaaS pricing is also more affordable,
since it leverages a shared cost structure, and these lower costs are spread
over time. This makes it easier to deploy and maintain a BI solution.


Can I start seeing value within 90 days?


In order for BI to be effective, it has to be deployed quickly
enough to address the critical issues that you are currently
facing.


Time to value is a prime determinant of the ROI of a business intelligence
deployment. Traditional BI solutions have struggled to deliver value to
stakeholders within a desirable timeframe. Due to challenges such as high
upfront capital expenditures, extensive IT resource requirements, and lengthy
development schedules, many traditional BI projects have taken 12 to 18
months or longer to complete.

Modern BI solutions can dramatically reduce the time to value by making use of
the following:

• Fully integrated solutions, from ETL to analytical engine to reporting engine
• Automation of standard processes
• Use of templates for typical reporting requirements, such as sales reporting,
financial reporting, etc.
• Software-as-a-service (SaaS), or on-demand, delivery models
• Leveraging of existing data warehousing investments
Modern solutions can also enable processes and approaches for BI deployment
that increase the likelihood of success. These include:
• Proving success incrementally and iteratively – avoiding the “Big
Bang.” In the earlier days of BI, customers were tempted to create a “big
bang” solution, since the cost and effort of creating the initial solution
and updating it over time were so high. Today, a BI solution offering a
fully integrated architecture – one with all of the components already
provided, working together – allows companies to focus on initial high-need
projects, prove success, and expand or adapt over time. This ability
to iterate over time provides value more quickly, lowers ongoing cost, and
increases the likelihood of success.
• Deploying to the existing infrastructure; avoiding major infrastructure
upgrades. The second major reason that traditional solutions are slow to
deploy is that they often require an additional investment in new hardware or
software. This lengthens timeframes, since a major capital purchase requires
a financial approval process that can take up to a full year of review and
approval. If a solution can leverage the existing infrastructure, this process
step is avoided. Also, if the solution itself is more affordable, or, like SaaS
solutions, offered as a subscription (which can be charged to operating
expenses, not capital budgets), this budgeting process step can be bypassed
or shortened.
• Deploying with the IT team you have. The construction of a traditional
BI solution requires many specialized resources like data modelers and
ETL specialists. Any plan that requires these professionals will confront
resource bottlenecks.


Can I be assured that my data is secure
and available?


Data security and availability are key requirements of a BI
project.


Data security and availability are key requirements for any IT system. You need
a BI solution that matches the same high levels of performance, reliability, and
security that you expect of the other systems in your portfolio.
Security is fundamental, since the data your business uses is critical to competitive
advantage, effective operations, and consumer or patient privacy. Availability
is also critical, since you need to be able to make decisions in a timely manner,
addressing issues as they emerge. Your system needs to be ready to respond
when you need it.
Your BI system should:
• Provide high availability. If the solution that you are considering is a
traditional, on-premise one, how often is it down for maintenance or
updates? How reliably is it available, given your configuration? If the
solution that you are considering is a SaaS solution, what is the uptime
guaranteed in subscription contracts? You will want to be sure that your
solution will be available 99% of the time, whether it is on-premise or SaaS.
• Be built on high performance hardware. If you are selecting a SaaS
vendor, make sure that their solution is operating on high performance
hardware that will provide the necessary reliability and availability that you
seek. If you are selecting an on-premise vendor, make sure that you are
making the appropriate investments into the type and quantity of hardware
that will provide high reliability and will also scale over time.
• Provide flexible security models. Most deployments have varying levels
of feature access, depending on the user’s role. Some users may only be
able to view a subset of reports, such as sales reports, while others will have
full access to all data, reports, and administration features. The solution
needs to ensure that users have access appropriate to their role. This will
require features such as defining row and column filters to limit data to
those individuals and groups who require it.
• Have SAS 70 data center certification (SaaS providers only). If you
are reviewing SaaS vendors, be sure that the data center where the
information will be stored has SAS 70 Certification. This certification
indicates that the service organization has undergone an in-depth audit of
its control objectives and control activities, including controls over
information technology and related processes.
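The row- and column-filter idea described above can be sketched simply. The roles, fields, and records below are hypothetical and not drawn from any particular product; the point is that one security policy decides both which rows a user sees and which columns survive into their view:

```python
# Hypothetical sales records protected by role-based policies
RECORDS = [
    {"region": "West", "rep": "Ana", "revenue": 500, "margin": 0.40},
    {"region": "East", "rep": "Ben", "revenue": 700, "margin": 0.35},
]

ROLES = {
    # A regional rep sees only their own rows, and no margin column
    "west_rep": {"row_filter": lambda r: r["region"] == "West",
                 "columns": ["region", "rep", "revenue"]},
    # An executive sees every row and every column
    "executive": {"row_filter": lambda r: True,
                  "columns": ["region", "rep", "revenue", "margin"]},
}

def secured_view(role):
    """Apply the role's row filter, then project only its allowed columns."""
    policy = ROLES[role]
    return [{c: r[c] for c in policy["columns"]}
            for r in RECORDS if policy["row_filter"](r)]

west_view = secured_view("west_rep")
# -> [{'region': 'West', 'rep': 'Ana', 'revenue': 500}]
```

In a real deployment these policies would be defined centrally and enforced by the BI platform, but the row-filter-plus-column-projection structure is the same.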


Can I proceed with limited IT resources?


Traditional BI solutions can monopolize IT resources. Modern solutions have a
lighter IT footprint, so that IT can focus on higher priorities.



Traditional BI solutions require significant IT resources up front for deployment,
as well as a high level of ongoing resources for maintenance, support, and
report creation and updating.
These intense IT requirements often limited the use of BI by smaller and midsize
organizations, which didn’t have a deep IT bench, or departments of enterprises,
which didn’t get enough allocation of IT resources.
Worse, IT resources were often required for report creation or updating. This led
to long lines outside of the IT department by business managers who wanted new
or better reporting. IT was swamped, and unable to focus on other priorities.

Modern solutions have a lighter IT footprint, which allows IT to focus on high
priority projects, and also ensures that business users get their questions answered
quickly and independently of IT. Look for a solution that:

• Minimizes IT resource requirements. Reducing the upfront and ongoing
IT resource requirements both saves money and increases the speed of
deployment. SaaS-based solutions, for example, require fewer IT resources
since the solution is provided as a service – there is no hardware to buy, no
software components to cobble together. Updates happen automatically,
so IT maintenance burdens are dramatically reduced.
• Respects IT standards and expertise. An organization’s IT team is
fundamental to the company’s ongoing operational success. The solution
should meet IT requirements for security, availability, and compatibility with
other systems.
• Empowers the end users. When end users are more self-sufficient, the
demands on IT are lighter, and IT can better prioritize their activities. Ideally,
trained users can define reports, dashboards, and alerts on their own,
without any Java programming or scripting. IT can oversee critical data
management functions without getting bogged down in time consuming
user-facing report definitions.




Does it avoid risky integrations?



A solution that is already integrated has far lower deployment and ongoing risk
than a solution that starts as standalone components.




Another major contributor to the high risk in traditional BI solution development
is the large number of products and technologies that must be bolted together
to get a full solution. To start with, an ETL product is used to manage the
task of extracting data, transforming it for analysis and inserting it into the
warehouse. These tools are very technical and require expensive programmers
with specialized training.
But vendors also have glaring gaps within their own product suites. Most BI suites
have been created from acquired technologies with only loose integration
between the capabilities. Most enterprise BI vendors require you to use separate
technologies for OLAP, reporting, dashboards, and even on-line access to data.
These separate products each require configuration and support.
Modern vendors take an entirely different approach to solving the technology
problem. They:
• Deliver all key functionality in one solution. A fully integrated BI
platform means you have one solution to master and all of your metadata
is encapsulated in one place.
• Require your staff to learn one technology and toolset. A single
solution has one set of commands, syntax, and data structures throughout.
Once your users have been quickly trained to develop applications on
the underlying platform, they will be fully equipped to create all types of
customer facing functionality.
• Avoid custom coding. Because Birst connects everything within one
application, you eliminate the situations where you would otherwise be
required to write custom Java code to script data or custom reports.
• Ease vendor management. Birst reduces the number of responsible
parties to one. You won’t have to live with finger pointing and cross-vendor
diagnostics when you have a problem; there is a single point of accountability.




Can business users easily create and explore
their own dashboards and reports?


Ideally, business users can build and explore their own dashboards and reports
without waiting in line for IT.



Knowledge and speed are critical to solving business challenges. While BI
provides the information, it is the business manager who provides the timely
response to the new information. When insight is in the hands of business
professionals who can make a difference, organizations can achieve great success.
For this reason, it’s vital for a BI solution to make it easy for business users, not just
IT users, to analyze and explore information. The more BI becomes “pervasive” in
an organization, the more agile and proactive a business can become.
Achieving a solution that is easy for business users to “self serve” is challenging,
however. A solution has to be powerful enough to manage complexity behind
the scenes while remaining simple for the end user.

To ensure that you have a solution from which your business users can “self
serve,” look for one that:

• Is easy to learn and use. Users should be able to come up to speed
on the system within days, not months. The solution itself should take
advantage of user interface standards – dragging and dropping, dropdown
boxes, highlighting – that are already familiar to a web savvy audience. The
vendor should also provide adequate online, webinar, or in-person training
to ensure that your user base can take best advantage of the solution.
• Makes it easy to explore data and new information. A report is of
limited use if you can’t easily dig for more details or find the drivers of
why a result happened as it did. Dashboards and reports that allow you to
“drill” into deeper details, filter information to the exact data set that you
need, or reset information to desired parameters make it possible for you
to truly explore your data.
• Delivers quick responses; allows users to home in on interesting data.
Even if the solution is analyzing gigabytes of information from across
multiple tables and data sources, answers need to be delivered quickly to
the user. Responsiveness, when combined with easy data exploration,
allows users to continue asking questions, refining them with each answer,
to home in on the exact issue of interest.
• Makes the complex easy. In order to make BI approachable for business
users, the solution needs to manage complexity to make analysis easier to
conduct. For example, one of the most complicated aspects of BI is dealing
with time variables. Every company has its own approach, and many business
questions include complex time nuances. Modern solutions can simplify
this complexity, allowing users to simply select options from a menu. Rather
than figuring out how to create formulas on their own, time-based reports
can be created with ease.
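The menu-driven approach to time variables can be illustrated with a small sketch. The period names below are invented examples; the idea is that the user's menu selection is translated into a date range behind the scenes, so no hand-built formulas are needed:

```python
from datetime import date, timedelta

def period_range(option, today):
    """Translate a named reporting period into a concrete date range."""
    if option == "Last 7 days":
        return today - timedelta(days=7), today
    if option == "Month to date":
        return today.replace(day=1), today
    if option == "Year to date":
        return today.replace(month=1, day=1), today
    raise ValueError(f"Unknown period: {option}")

# The user picks "Month to date" from a menu; the tool does the date math
start, end = period_range("Month to date", date(2011, 6, 15))
# start == date(2011, 6, 1), end == date(2011, 6, 15)
```

Each company's fiscal calendar would add its own entries to this menu, but the pattern – named periods mapped to computed ranges – is what hides the complexity from the business user.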


Can the solution scale to a large, diverse
user base?


Even deployments that start with ten users can grow to ten thousand. Your BI
solution needs to handle that growth in stride.



Even BI projects with modest initial goals can eventually become huge deployments,
and you want to make sure that your solution can handle whatever the future
holds. If you are a midsize business with ambitions to grow significantly larger,
or a department of a large organization that realizes that your solution may
become a standard for the entire company – you want to make sure that your
solution can handle large, diverse groups of users, even if that’s not where
you’re starting.
A modern, SaaS architecture is highly flexible and scalable. It allows organizations
to start small, but add users quickly and at large scale. To be future proof, you
want your solution to:
• Quickly and easily scale to thousands of users. If your user base grows
from ten people to thousands in a short period of time, you want to be sure
that you can handle that growth in stride, without a major re-architecting
of the solution or the use of the full efforts of your IT team. This has two
benefits: you only have to pay for what you need today, and you avoid
paying upfront for “shelfware” that may or may not get used. The solution
should be able to add users quickly, without a serious degradation in
performance, and without major resource and time requirements.
• Support multiple roles. As deployments get larger, users tend to fall into
different categories – super users, average users, occasional users. They
may have different demands on data, or have different security levels. Your
solution has to be able to easily accommodate these different types of users,
their access patterns, feature needs, and the ability to easily administer
them all.
• Grow without resetting. Scale should be organic and evolutionary, not
disruptive. You should be able to expand easily, without having to make
significant new investments in infrastructure or supporting headcount.
It should be a natural expansion, not a complete reconstruction of the
existing implementation.



Can the solution keep up with my business
as its needs change?


Your business is constantly changing – new products, new markets, new measures
of success. Your BI solution must evolve along with it.


A changing business landscape can challenge every company’s key systems,
but BI solutions confront even bigger obstacles than most. First, because of
their historical perspective, they must rationalize data across every version of
the business over a period of several years. BI cannot just move on to the next
release—it must accommodate the next release, as well as every prior iteration.
Second, much of the value of BI is to make sense of changing measures of
business effectiveness. Changes in customers, competitors, product offerings,
suppliers, and business units are all the target of your BI effort. A successful
solution must easily accommodate a dynamic business environment, rather than
requiring major reconstruction of data and functionality with each new major
product update.

A successful solution must:

• Add new data sources without requiring a major reset of the
solution. As your BI solution demonstrates its value with initial projects,
demand will increase to analyze more data sources. Your system should
be architected in such a way that it can accommodate this data easily and
seamlessly, without significant IT intervention or recoding of the solution.
• Be able to evaluate changes over time. To be effective, a BI solution
must model the many changes that happen over time. Looking at data
from an historical perspective requires a technology that can provide
meaningful views across data that is constantly changing.
• Offer business users self-service, so that they can answer their own
questions quickly and easily. Successful BI solutions become popular
solutions. If IT intervention is required for every new report request or
report update request, organizations end up with angry business users and
choked up IT request queues. When business users are empowered to
build and update their own reports and dashboards, the business is agile
and the IT agenda is focused on priorities. SaaS BI solutions, which have
the lightest requirements of IT teams, are particularly helpful on this point.
Heavy IT footprint solutions, such as open source software, can create
substantial IT backlogs over time.
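One common way to provide "meaningful views across data that is constantly changing" – an illustrative technique on my part, since the text names no specific method – is a so-called Type 2 slowly changing dimension: instead of overwriting a record when the business changes, the old row is closed out with an end date and a new row is added. A minimal sketch with invented data:

```python
# Each row carries an effective-date range; None means "still current"
customers = [
    {"id": 1, "segment": "SMB",        "valid_from": "2009-01-01", "valid_to": "2010-06-30"},
    {"id": 1, "segment": "Enterprise", "valid_from": "2010-07-01", "valid_to": None},
]

def segment_as_of(rows, cust_id, day):
    """Return the segment a customer belonged to on a given ISO-format day."""
    for r in rows:
        if (r["id"] == cust_id and r["valid_from"] <= day
                and (r["valid_to"] is None or day <= r["valid_to"])):
            return r["segment"]
    return None

past = segment_as_of(customers, 1, "2009-12-01")   # "SMB" – the world as it was
now = segment_as_of(customers, 1, "2011-03-01")    # "Enterprise" – the world as it is
```

Historical reports built this way show results against the business structure that existed at the time, while current reports use today's structure – without either one requiring a rebuild.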



Can the solution easily serve my
entire ecosystem?

Increasingly, organizations function by working with a network of suppliers, retailers,
partners, and channel resellers. Empowering these participants in your ecosystem with
timely information and analysis is a key to making this network function smoothly.
Achieving this extended view of information brings additional challenges to
your BI system, however. It requires a solution that can be easily and securely
accessed anywhere in the world. It also requires that information be tailored to
the level of access required – suppliers may have different views from logistics
partners. It may also require the effective delivery of information to a broad
array of devices - not just desktops and laptops, but mobile phones or tablet
computers as well.

A solution that serves your entire ecosystem should:

• Deliver a solution globally. While your direct employees may be
concentrated in one locale, your extended network is probably national or
global. Because of this, your solution must be accessible from any point
in the world where it is needed. While delivering a system like this in the
traditional method is complicated and prohibitively expensive, it can be
achieved fairly easily with SaaS solutions, which are available anywhere
there is an internet connection.
• Provide for multiple levels of access, with high security. The solution
should be able to control for which type and amount of data gets seen,
as well as which partners have the ability to add data or create their own
reports. Users could vary from people who only get alerts, people who can
see reports, and people who have full access to the solution. All should be
protected with the highest level of information security.
• Integrate partner data. Your strategic partners demand higher levels of
data integration. In the same way that your sales and marketing teams
want a unified view of the demand generation funnel, your partners
will want to see how, for example, your finished good inventory level
expectations match with their production capacity or parts inventories.
• Deliver to all types of devices. Supply chain users may be on the factory
floor. Sales users may be in transit, and executive users could be anywhere.
Keeping your ecosystem in sync requires that information be consumed
by the most convenient device, whether this is a desktop, laptop, mobile
phone, or tablet computer. SaaS solutions have another advantage here,
since they are accessed through a modern browser, so they can be easily
adapted to be consumed by small format devices.


Is the solution provider dedicated to my
ongoing success in BI?


A solution provider that is dedicated to providing best in class BI for its customers, 
both now and in the future, is a better bet.



Are the BI provider’s technology, incentives and motivations aligned with your
ongoing needs as a BI customer? Many traditional BI solutions have core
technology developed over two decades ago. These products were architected
in the age of thick clients, mainframe applications, and Unix database servers.
While these products have been updated with veneers of modern technology,
they still retain their older technology foundations.
Also, many traditional BI solutions have a business model that focuses on the
initial sale, not ongoing success. In the traditional software model, the initial
implementation is the largest payment to the software vendor. So completing
the initial sale is paramount, instead of ensuring satisfaction over the full
customer lifetime.

Companies deserve better than this. They deserve a company that is dedicated
to long term customer success. A modern vendor:

• Starts with a modern, standards-based architecture. Unlike traditional vendors
that continue to market what are essentially legacy products, modern vendors have
technology that is fully aligned with the cloud-based realities of today.
• Supports seamless, regular upgrades. Once a traditional BI solution
is deployed, it can be complicated, time consuming, and disruptive to
upgrade the solution, even when the new features are very desirable.
With a SaaS solution, new features and functionality are added regularly
and seamlessly, so that you can quickly experience the benefits of new
development while avoiding downtime and disruptions.
• Lives and dies by BI. The BI product category has come to be dominated
by technology giants that generate the majority of their revenues by doing
other things besides BI. As a result, the focus on business intelligence
innovation and customer satisfaction has declined. A vendor solely focused on
business intelligence is more dedicated to innovation and customer success in BI.
• Is successful when the customer is successful – now and in the future.
SaaS vendors have a subscription model. They make their money over
the lifetime of a customer relationship, so their incentive is to ensure that
companies are up and running quickly, and satisfied with the ongoing
solution today, tomorrow, and five years from now. This is a significant
departure from the traditional model, where customers paid a large sum up
front but were left to manage deployment, maintenance, and their own
satisfaction themselves – and satisfaction was often low.

~Jai Krishna Ponnappan

Sustainable Success Starts with Agile: Best Practices to Create Agility in Your Organization.

             While it might frustrate us when we can’t control it, change really should be seen as opportunity in disguise. Enterprises everywhere recognize that to turn opportunity into advantage, business and IT agility is more important than ever. Fifty-six percent of IT executives in a recent survey put agility as the top factor in creating sustainable competitive advantage – ahead of innovation.

But creating true agility is hard. In the same survey, just one in four respondents said their enterprise’s IT was successful at improving and sustaining high levels of agility. Of the more crucial IT goals—optimization, risk management, agility and innovation—agility had the lowest rate of realization among enterprises.

True agility in today’s Instant-On Enterprise involves reducing complexity on an application and infrastructure level, aggressively pursuing standardization and automation, and getting control of IT data to create a performance-driven culture. While the challenges are great, getting it right enables enterprises to capitalize on change by reducing time to market and lowering costs.

Four drivers for agility

At its simplest, agility refers to an organization’s ability to manage change. But there isn’t always agreement on what agility means to the enterprise. Keith Macbeath, senior principal consultant in HP Software Professional Services, meets frequently with customers seeking to increase IT performance. Typically, he says, drivers for agility include the following:

IT financial management:
Understanding the levers that affect cost allows your enterprise to be more nimble. For instance, if your business is cyclical, moving to a variable-cost model in IT can help sustain profitability even through a down cycle.

Improved time to market:
This gets to the heart of agility: delivering products and services faster.

Ensuring availability:
"Availability isn't a problem ... until it is. Then it's at the top of the CIO's agenda," says Macbeath. Being able to rapidly triage, troubleshoot and restore service is as important to agility as the ability to get the service deployed in the first place.

Responding to a significant business event:
In the merger of HP customer United Airlines with Continental Airlines, the faster the IT integration, the more significant the savings.


Important factors for creating sustainable competitive advantage

The importance of measurement and benchmarking

Business agility starts with a holistic view of IT data, says Myles Suer, senior manager in HP Software’s Planning and Governance product team. "Getting timely access to data that allows you to drive to performance goals and then make adjustments when you don't meet those goals represents the biggest transformation possibility for IT," he says.

For CIOs focused on transforming their business, the key is gaining access to accurate, up-to-date metrics that are based on best-practice KPIs. Trustworthy metrics let CIOs control IT through exception-based management and understand what it will take to improve performance.
  
What to monitor and measure

To achieve greater agility, look to three broad areas of improvement and put in place monitoring programs to track against performance goals.

Standardization:

"That's the first step," Suer says. Ask yourself: What percentage of your applications run on standard infrastructure? How many sets of standard infrastructure do you have? "You want a standard technology foundation to meet 80 percent of your business requirements," says Macbeath. It may seem counterintuitive, but standardization makes a company more efficient and more agile than competitors. Standardization is also a prerequisite for automation. Public cloud providers aggressively pursue both. Taking the same approach for enterprise services means you can begin to realize cloud efficiency gains.
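The percentage-on-standard-infrastructure question above is easy to turn into a tracked KPI. A minimal sketch of that calculation follows; the application inventory and the "standard-" naming convention are hypothetical, purely for illustration.

```python
# Illustrative sketch: tracking a standardization KPI.
# The application inventory and platform names below are hypothetical.
apps = [
    {"name": "billing", "platform": "standard-linux-x86"},
    {"name": "crm", "platform": "standard-linux-x86"},
    {"name": "legacy-ledger", "platform": "as400"},
    {"name": "hr-portal", "platform": "standard-linux-x86"},
    {"name": "trading", "platform": "solaris-sparc"},
]

def standardization_rate(apps, standard_prefix="standard-"):
    """Percentage of applications running on standard infrastructure."""
    on_standard = sum(1 for a in apps if a["platform"].startswith(standard_prefix))
    return 100.0 * on_standard / len(apps)

print(f"{standardization_rate(apps):.0f}% of applications on standard infrastructure")
# -> 60% of applications on standard infrastructure
```

Tracked over time, a rising rate signals progress toward the 80 percent target Macbeath describes.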

Simplicity:

"Agility is the inverse of complexity," says Macbeath. Your goal is to measure and reduce the number of integration points and interfaces in your architecture. The key is decreasing the number of platforms managed. Application rationalization and infrastructure standardization programs reach for low-hanging fruit, such as retiring little-used and duplicative applications and unusual, hard-to-support infrastructure. Longer term, your enterprise needs to tackle the more difficult task of reducing the number of interfaces between applications. Macbeath recommends driving to a common data model between applications. This reduces support costs and makes it faster and cheaper to integrate new functionality into an existing environment.

Service responsiveness:
This can be as simple as tracking help-desk response time, mean time to repair, escalations and so on. "Once you're systematically tracking responsiveness, you can move to a more sophisticated level of cost-to-quality tradeoffs," Macbeath says.

Translating agility into business success

In working with HP customers, Macbeath sees numerous examples of organizations that are successfully increasing their agility.

For example, a European bank working with HP has established an online system to show internal customers exactly how much it costs for one "unit" of Linux computing to run an application at multiple service levels: silver or gold. "Retail banking is a very cost-competitive business," Macbeath says. "Allowing internal customers to see the cost-performance tradeoffs for themselves and have the numbers right there makes it possible for business and IT to work together to make decisions to benefit the business."

Pursuing agility by way of automation let another HP customer, a global telecom company, reach new sources of revenue. Instantly deploying wireless hotspots provides a key capability for the company; by automating this process, IT was able to push out more than 40,000 new hotspots. The company turned on a new source of revenue almost immediately.

Other organizations are finding the same path to greater agility: standardization and automation, combined with measuring results to drive performance. Through this process, their IT departments are demonstrating greater value for the business while delivering greater speed and transparency.

Extend the Principles of Agile to Deliver Maximum Business Value.

            The Agile movement has already helped thousands of IT organizations develop applications at light speed. But to maximize Agile’s impact on the business, you have to push its principles further throughout the enterprise until Agile delivery becomes continuous delivery, and continuous delivery leads to true business agility. The four stages that lead from Agile development to the agile enterprise can revolutionize the way business value reaches customers’ hands.

Stage 1: Agile development

Agile initially emerged as a collection of software development practices in which solutions evolve incrementally through the collaboration of self-organizing teams. Because early adoption grew out of grassroots, developer-led initiatives, Agile matured first within the developer space. It manifested in practices designed to shorten the feedback loop and uncover problems earlier in the cycle so they could be addressed sooner. It made coders fast and flexible.

Stage 2: Agile delivery

In many IT shops that had been organized in traditional functional silos, testers initially stood on the sidelines and wondered what Agile meant for them. The effect was a pseudo-Agile process in which code was developed in iterations but then tossed over the wall for QA teams to test the same way they always had.

The IT organizations that have had the best results with Agile have torn down the silos between development and QA to create truly cross-functional teams. To allow testers to keep up with the accelerated development pace and not sacrifice quality for velocity, they have revamped their approach to incorporate more robust test automation that can be run on a near-continuous basis. They have also adopted advanced techniques, such as service level (API) testing and exploratory testing to mitigate risk as early as possible.

Many organizations adopting Agile today are at this level. But as Agile adoption continues to expand and leaders see the value it can bring to delivery, momentum is gathering to break down the next level of organizational silos.

Four stages to an agile enterprise

Stage 3: Continuous delivery

Agile delivery teams have made great strides, but what is the point of rapidly building new features if those features just sit and wait to be released? To capitalize on the advances in delivery and better translate them into business value, it’s time to extend Agile development principles to deployment.

Continuous delivery, which is enabled by the DevOps movement, extends the Agile mind-set to incorporate IT operations. It focuses on what is ultimately important—shorter cycles for actually putting functionality in the hands of users. It relies not only on better collaboration between the two disciplines, but on comprehensive automation of the build, test and deployment process, so that—at the extreme level—every code change that passes automated functional, performance and security testing could be immediately deployed into production. This level of integration and collaboration between delivery and operations allows releases to be driven by business need rather than operational constraints.
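The "every change that passes automated testing could be deployed" rule above amounts to a simple gate. Here is a minimal sketch of that gating logic; the gate names and result format are hypothetical stand-ins for whatever your pipeline reports.

```python
# Illustrative sketch of a continuous-delivery gate: a change is
# automatically deployable only when every automated check passes.
# The gate names ("functional", "performance", "security") mirror the
# checks named in the text; the result dictionary format is hypothetical.
def deployable(results):
    """Return True only if all required automated gates report a pass."""
    required = ("functional", "performance", "security")
    return all(results.get(gate) == "pass" for gate in required)

print(deployable({"functional": "pass", "performance": "pass", "security": "pass"}))
# -> True
print(deployable({"functional": "pass", "performance": "fail", "security": "pass"}))
# -> False
```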

Blurring the lines between development and operations isn’t easy, but it is critical. According to Forrester Research, some traditional process models can “encourage handoffs and demarcation of responsibility and ownership rather than collaboration, teaming, and shared goals. This may lead each party to think that the other is trying to pull a fast one, leading to confrontational situations and heavy reliance on paperwork and formal documentation. By sharing process and building a cross-functional team inclusive of operations, IT organizations can remove the disconnects.”1

The best way to start moving toward continuous delivery is to begin embracing DevOps principles. Include operations staff in demo meetings and planning sessions, share assets (test, build and deployment) and automate as much as possible. Examine KPIs and incentives to ensure alignment across the two functions. You’ll know you’re making progress when you see overall cycle times, from idea inception to released functionality, decrease dramatically.
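Measuring the overall cycle time mentioned above, from idea inception to released functionality, can start as simply as averaging date spans. A minimal sketch, with entirely hypothetical feature records:

```python
from datetime import date

# Illustrative sketch: tracking idea-to-release cycle time as a KPI.
# Each record is (feature, idea inception date, production release date);
# all values are hypothetical.
features = [
    ("checkout-redesign", date(2011, 1, 10), date(2011, 3, 4)),
    ("mobile-login", date(2011, 2, 1), date(2011, 2, 22)),
    ("fraud-alerts", date(2011, 2, 15), date(2011, 4, 1)),
]

def mean_cycle_time_days(features):
    """Average days from idea inception to released functionality."""
    spans = [(released - inception).days for _, inception, released in features]
    return sum(spans) / len(spans)

print(f"mean cycle time: {mean_cycle_time_days(features):.1f} days")
# -> mean cycle time: 39.7 days
```

A dramatic, sustained drop in this number is the progress signal the text describes.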

Stage 4: Business agility

Although few organizations have achieved this stage, it’s what all are aiming for. True business agility happens when the entire organization can continuously analyze results and adapt accordingly.

Here’s how it happens: Deploying smaller increments more frequently not only puts features into the hands of the users at a much faster rate, but it also facilitates a steady stream of user feedback. User feedback lets you apply the Agile concept of “inspect and adapt” at the business level. It provides timely insight into shifting user preferences and helps you make precise course corrections and more informed decisions.

Business agility requires more than just IT. 

Achieving true agility will require a cultural shift within your organization before you can fully take advantage of the opportunity it presents. Business stakeholders must be empowered to make decisions quickly without legacy processes—such as annual budgeting—standing in their way. Other functions, such as sales, marketing and customer support, also play vital roles and need to be tied in effectively.

If you’re already operating at Stage 3, it’s time to look ahead. Start by asking yourself these questions:

Which of your core business applications could benefit or gain differentiation from constant user feedback?
What non-IT areas of your organization would need to be better integrated to get full, end-to-end agility?
What legacy processes might need to be revisited and transformed?

When you start to achieve more of the business objectives that drove your projects in the first place—increased revenue, faster ROI, improved customer satisfaction—you’ll know you’re approaching true business agility.

Why You Should Take this Social Enterprise Seriously

Salesforce Presents New Social Enterprise with Chatter, Mobility and Data

Organizations should take the social enterprise seriously, especially those that see social media, and Salesforce itself, as essential to their future.

At the Dreamforce conference, Salesforce.com (NYSE:CRM) CEO Marc Benioff unveiled the latest evolution of the company’s strategy and supporting technology for cloud computing and mobile technologies.

Its aim is to enable businesses to engage with customers and prospects via social media channels – what Salesforce calls the “social enterprise” – and to empower employee and customer social networks to operate individually and together. Note that I did not mention CRM: CRM as such doesn’t have a role in this platform beyond basic interactions with prospects and customers, and the platform is accompanied by a large ecosystem of partners that provide dedicated marketing and contact center applications. As summarized in its announcement, Salesforce’s strategy is clearly different from that of others in the applications market, including Oracle and SAP, which have products for the cloud computing environment and have made strides in integrating collaboration and social media capabilities into their applications.

Salesforce.com’s social enterprise is a big step forward from the strategy it talked about at last year’s Dreamforce, and is now focused on helping companies build social profiles of employees and customers that can be managed and augmented with information about the individuals from other social networks. The company’s partners are also working on such capabilities. For example, software from Reachable can present the relationships among individuals in a social graph. This week Roambi introduced analytic and mobile integration with Salesforce Chatter, which is another advance in what my colleague David Menninger calls the consumerization of collaborative BI.

Salesforce says significant technological improvements in the coming Winter 2012 release and later will make Chatter a true social business tool with many methods to chat, share, approve and otherwise enable the collaborative process. Another addition slated for Chatter is the ability to include presence information in the chat the way most instant messaging networks do today in a feature the company calls Chatter Now.

Users will be able to embed and share video, graphics and other kinds of files. Users who have too much traffic in their feeds will be able to filter content based on keywords. Chatter Approvals will be able to handle prompted interactions. With Chatter Customer Groups, users will be able to invite customers or other external people into private discussion groups and interact with them there. Those looking to build custom enhancements can employ the Chatter API, which uses REST and streaming and can be embedded to let applications interact with Chatter. Last year iWay Software demonstrated third-party integration to take any systems event and publish it into a Chatter feed.

Surprisingly, iWay, which is an Information Builders company, was not at Dreamforce this year, but it has paved the way for enterprise notifications in Salesforce’s social enterprise efforts. Salesforce also is targeting organizations using Microsoft SharePoint; they will be able to use Chatter instead of Microsoft’s messenger technology. Salesforce with Chatter Connect will be able to integrate its feeds with other environments to make a more seamless social and collaborative environment.

The expanded capabilities for Chatter, and Salesforce’s enhanced profiles of customers and prospects, will be integrated with the Service Cloud in an application called Chatter Service to help improve customer interactions for contact centers. Chatter Service will also be able to integrate into Facebook. This could create a new class of customer self-service for organizations that want to move their initial interactions into social media, and move questions and comments into more formalized customer service channels for resolution.

Addressing the full needs of a contact center beyond social media, though, will require other applications, and Salesforce has plenty of partners exhibiting here, including Contactual, inContact, Interactive Intelligence, Five9 and LiveOps. Salesforce showed how it can add value to the Sales Cloud with its advances in Chatter, but we do not expect to see a more integrated set of methods until 2012. Until then, you can use Chatter by itself to interact with your sales team, and the new versions will be a good reason to evaluate what Salesforce calls social sales. If you are looking to address the broader set of sales activities and processes beyond SFA, Salesforce has plenty of partners that should be considered if you care about efficiency and achieving sales quotas and targets.

Salesforce.com’s social enterprise direction will require simpler access to applications from smartphones and tablets. The company has created an engine to transform its applications to operate in an HTML5 environment so they can be used on smartphones and tablets from Apple, on Android devices, on RIM hardware and on Microsoft devices, too. Salesforce calls this its Touch approach; it will be released sometime in the future, and you can sign up to be notified when that happens. This will be a significant new option for anyone operating in the Salesforce.com environment.

One of the key pushes by Salesforce is database.com, which is designed to securely store organizations’ data in the cloud; it can be used by applications running on the company’s force.com platform for cloud apps as well as new social enterprise offerings that will come out in 2012. Our upcoming research in business data in the cloud will unveil more challenges and opportunity for improvement to support technology like database.com. This offering makes it easy to provision a database and get started. Its pricing and capabilities suggest that database.com is a transactional centralized data service. It’s not clear whether it will be useful for business analytics, which our research finds to be a major need in organizations today.

Business analytics has not been one of Salesforce’s strengths, as its customers can attest, which is why the portfolio of partners providing these capabilities is quite significant. Many of those partners also depend on their own database technology for analytics that operates in the cloud. If database.com is not able to support analytics needs within the database, its potential and its impact on customers could be hindered.

The challenge with database.com is that if you are trying to do automated data integration efficiently – including migration, synchronization or replication across clouds of data, under applications or to the enterprise – you will need a separate product, and while Salesforce has many partners, none are part of the announcement or listed on the database.com website. As David Menninger has pointed out, integrating information from diverse clouds of applications requires work. Our newly completed research in Business Data in the Cloud will enumerate those challenges. If you are looking for help dealing with integration across cloud and enterprise, consider Dell Boomi, Informatica, Pervasive and SnapLogic, each with dedicated data integration technologies.

Salesforce.com makes the social enterprise interesting, and it is taking the lead in advancing these kinds of interactions, especially with business-to-consumer companies, which need the most help in dealing with social media. Its ecosystem of partners and the ability to integrate with consumer social media give Salesforce an early advantage in the market.

If you are looking for new core applications in marketing, sales and customer service, you will need to invest in the partners to make this happen. It’s not easy to determine how to get the full value from your Salesforce.com investments, but it is worth the effort. If you are in a sales or customer service department, or are trying to build a strong mobile strategy, let us know; we can definitely help you get what you need today. Organizations that see social media, and Salesforce, as essential to their future should take the social enterprise seriously.


Kinect Turns Any Surface Into a Touch Screen

              Researchers combine a Kinect sensor with a pico projector to expand the possibilities for interactive screens.

A new prototype can transform a paper notebook into a notebook computer, a wall into an interactive display, and the palm of your hand into a smart phone display. In fact, researchers at Microsoft and Carnegie Mellon University say their new shoulder-mounted device, called OmniTouch, can turn any nearby surface into an ad hoc interactive touch screen.

Hands-on "screen": A proof-of-concept system allows smart phones to use virtually any surface as a touch-based interactive display.


OmniTouch works by bringing together a miniature projector and an infrared depth camera, similar to the one used in Microsoft's Kinect gaming sensor, to create a shoulder-worn system designed to interface with mobile devices such as smart phones, says co-inventor Chris Harrison, a postgraduate researcher at Carnegie Mellon's Human-Computer Interaction Institute in Pittsburgh and a former intern at Microsoft Research. Instead of relying on screens, buttons, or keys, the system monitors the user's environment for any available surfaces and projects an interactive display onto one or more of them.

OmniTouch does this automatically, using the depth information provided by the camera to build a 3-D model of the environment, says Harrison. The camera acquires depth information about the scene by emitting a patterned beam of infrared light and using the reflections to calculate where surfaces are in the room. This eliminates the need for external calibration markers. The system rebuilds the model dynamically as the user or the surface moves—for example, the position of a hand or the angle or orientation of a book—so the size, shape, and position of these projections match those of the improvised display surfaces, he says. OmniTouch "figures out what's in front of you and fits everything on to it."

The system also monitors the environment for anything cylindrical and roughly finger-sized to work out when the user is interacting with it, again using depth information to determine when a finger or fingers make contact with a surface. This lets users interact with arbitrary surfaces just as they would a touch screen, says Harrison. Similarly, objects and icons on the ad hoc "screens" can be swiped and pinched to scroll and zoom, much like on a traditional touch screen. In one demonstration art application, for example, OmniTouch used a nearby wall or table as a canvas and the palm of the user's hand as the color palette.
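The contact test described above, deciding when a fingertip is close enough to a surface to count as a touch, can be illustrated with a toy depth comparison. This is only a conceptual sketch, not OmniTouch's actual algorithm; the depth values and tolerance are hypothetical.

```python
# Conceptual sketch of depth-based touch detection: a finger registers
# a "click" when its depth reading is within a small tolerance of the
# surface behind it. All values are hypothetical readings in millimetres.
SURFACE_DEPTH_MM = 600.0   # distance from camera to the projection surface
TOUCH_TOLERANCE_MM = 10.0  # fingertip within 10 mm of surface counts as contact

def is_touching(finger_depth_mm, surface_depth_mm=SURFACE_DEPTH_MM,
                tolerance_mm=TOUCH_TOLERANCE_MM):
    """True when the fingertip is close enough to the surface to register a touch."""
    return abs(surface_depth_mm - finger_depth_mm) <= tolerance_mm

print(is_touching(595.0))  # 5 mm from the surface -> True (touch)
print(is_touching(560.0))  # 40 mm from the surface -> False (hovering)
```

The real system must additionally segment finger-shaped objects and track them frame to frame, but the per-pixel depth comparison is the core of the "click" decision.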


The shoulder-mounted setup is completely impractical, admits Hrvoje Benko, a researcher in the Natural Interaction Research group at Microsoft Research in Redmond, Washington, who also worked on the project, along with colleague Andrew Wilson. "But it's not where you mount it that counts," he says. "The core motivation was to push this idea of turning any available surface into an interactive surface." All the components used in OmniTouch are off the shelf and shrinking all the time. "So I don't think we're so far from it being made into a pendant or attached to glasses," says Benko.

Duncan Brumby, a researcher at the University College London Interaction Center, in England, calls OmniTouch a fun and novel form of interaction. The screen sizes of mobile devices can be limiting, he says. "There's a growing interest in this area of having ubiquitous, intangible displays embedded in the environment," he says. And although new generations of smart phones tend to have increasingly higher-quality displays, Brumby reckons users would be willing to put up with lower-quality projected images, given the right applications.

Precisely which applications is hard to predict, says Harrison. "It's an enabling technology, just like touch screens. Touch screens themselves aren't that exciting," he says—it's what you do with them. But the team has built several sample applications; one allows users to virtually annotate a physical document, and another incorporates hand gestures to allow OmniTouch to infer whether the information being displayed should be made public or kept private.

"Using surfaces like this is not novel," says Pranav Mistry, a researcher at MIT's Media Lab. Indeed, two years ago, Mistry demonstrated a system called SixthSense, which projected displays from a pendant onto nearby surfaces. In the original version, Mistry used markers to detect the user's fingers, but he says that since then, he has also been using a depth camera. "The novelty here [with OmniTouch] is the technology," he says. "The new thing is the accuracy and making it more robust."

Indeed, the OmniTouch team tested the system on 12 subjects to see how it compared with traditional touch screens. Presenting their findings this week at the ACM Symposium on User Interface Software and Technology in Santa Barbara, the team showed that buttons could be made as small as 16.2 millimeters before users had trouble clicking on them. With a traditional touch screen, the lower limit is typically around 15 millimeters, says Harrison.

Reorganize Your Past, Online

A Web service developed by Microsoft Research lets people curate their own personal history.

Microsoft researchers are set to launch Project Greenwich, a website that helps users assemble and chronologically organize content about a person, event, or any other subject. The site, to launch in beta on October 31, allows users to archive uploaded items, such as photos and scans of objects, alongside links to existing Web content around a horizontal timeline marked with dates. Different timelines can be combined and displayed on the same page or merged.

Project Greenwich users attach images, maps, and other visual content, plus accompanying text, to relevant dates on their timelines. Each entry, which a viewer can click to see in full, is illustrated with thumbnail pictures in chronological order to show it in the context of other entries, and potentially alongside other timelines.

"We are interested in the creative act of reflecting on the past," says Richard Banks, lead designer on the project. "Actually sitting down and spending time creatively thinking about the past by making a photo album or a timeline is very different to existing online content being ordered chronologically."

The website was provisionally called Timelines by its developers at Microsoft Research Cambridge, in England. This was hastily changed due to the similarity to Facebook's new Timeline feature, which allows users to scroll chronologically through pictures, updates, and event listings related to their lives or those of their friends. Facebook's format uses indicators such as the numbers of comments that content has attracted to automatically highlight key events. Users can also manually choose what should be included and left out.


Banks, who is a principal interaction designer in the Computer Mediated Living group at Microsoft Research Cambridge, was partly inspired by a suitcase of around 200 photographs left to him in 2006, on the death of his grandfather Ken Cook, who flew bombers over Germany during World War II. As an example of what can be done using Project Greenwich, he has created representations of his grandfather's life merged with historical Web content about the war and the British Royal Air Force. "The ability to merge different timelines about, say, people and events creates interesting contrasts between authoritative and personal versions of events," adds Banks.

For several years, Banks has been studying how the increasing digitization of our lives affects how we deal with the past, remember the dead, and create memorabilia. Project Greenwich builds on previous projects developed by the Computer Mediated Living group. One, called Family Archive, was an interactive tabletop touch screen with an integrated camera. It was designed to allow family members to organize their digital memorabilia alongside scans of physical artifacts. Banks also previously developed a prototype called Timecard, a digital photo frame with an interface that allowed users to create timelines about the people or events featured in the displayed images.


The beta launch will allow the team to gather data on how people think about time and reconstruct the past, what elements they choose to include, and the ways they contrast personal and existing content. This information will shape its ultimate form and potentially also inform the way other future Microsoft products are designed. Project Greenwich may ultimately become a commercial Microsoft website, or it could be integrated into new versions of existing products or entirely new products. This will be decided later, partly based on the response to the beta launch. Banks expects to include new features in later versions, such as the ability to print timelines created on the site; to embed them in documents, such as blogs or homework; and to allow for multiple individuals to contribute to the same entry.

Mobility Matters: Delivering Needed Information at the Right Time

           Demand for BI as a means to get maximum value from information has never been higher as businesses increasingly compete in real time and require integrated information from across the enterprise. An old saw says BI gets "the right information to the right people at the right time." It's really time to add "via the right medium" to that mix.

Automating business decisions is one path to BI maturity. Triggering actions automatically based on changes in corporate data works best when it is grounded in a solid understanding of how those decisions are made today.

But we also know that many decisions are multifaceted, and a knowledge worker's analysis will continue to be a part of effective business intelligence.


Effective analysis is getting more complicated for knowledge workers. The more involved aspects require understanding what is happening and combining that with summarized historical data to build a set of possible actions. These decision "analytics" are the basis of competitive advantage for organizations today. Once calculated, they are put to effective use, utilizing the best medium available for real-time delivery.

Like water, information and analytics must flow through the path of least resistance, utilizing the deployment option that turns the information into valuable business action most quickly.

BI Deployment Option History

The accepted paradigm of effective business intelligence must change. Once, BI consisted exclusively of reports built by IT from overnight-batch-loaded data warehouses, which replicated a single source system or a small set of them. Those reports were deployed to end users' personal computers in what now seems, in hindsight, a very heavy-handed and resource-intensive process.

Delivering a report to a user's personal computer on a regular basis is nowhere near the pinnacle of BI achievement. Shops still operating with this mentality are leaving tremendous value on the table.

Today the norm for BI involves "zero footprint" Web-based delivery of reports. This improvement allows information to reach many more users. In parallel, while detailed transactional data still needs to be accessible on a drill-through basis, it is the rapid availability of summary-level information that activates the process.

The majority of users have become more accustomed to a targeted presentation layer. Dashboards represent an advanced form of information delivery, second only to operational BI as a mature approach to disseminating information. Acting on dashboards that already have a certain amount of knowledge worker intelligence built in moves the organization to real-time competitiveness.

When you add mobility to the picture, notifications take on a whole new utility. Notifications can be sent to mobile devices when the dashboard data changes - or, rather, changes in a way that is meaningful to the knowledge worker. Mobile applications provide useful notifications and remove email as an extra step in the process.
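The "changes in a way that is meaningful to the knowledge worker" test above is, at its simplest, a per-metric threshold rule. A minimal sketch follows; the metric names and thresholds are hypothetical, and a real system would hand the resulting messages to a mobile push service rather than print them.

```python
# Illustrative sketch: emit a mobile notification only when a dashboard
# metric changes by an amount the knowledge worker cares about.
# Metric names and thresholds are hypothetical.
THRESHOLDS = {"out_of_stock_skus": 5, "open_complaints": 10}

def notifications(previous, current, thresholds=THRESHOLDS):
    """Return alert messages for metrics whose change meets its threshold."""
    alerts = []
    for metric, limit in thresholds.items():
        delta = current.get(metric, 0) - previous.get(metric, 0)
        if abs(delta) >= limit:
            alerts.append(f"{metric} changed by {delta:+d}")
    return alerts

prev = {"out_of_stock_skus": 12, "open_complaints": 40}
curr = {"out_of_stock_skus": 20, "open_complaints": 43}
print(notifications(prev, curr))  # -> ['out_of_stock_skus changed by +8']
```

Only the out-of-stock change crosses its threshold, so only one notification goes out; the small movement in complaints stays off the knowledge worker's phone.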

Notifications, combined with the ability to immediately access the information wherever the knowledge worker is, provide many advantages over a dashboard, which requires Web browser technologies.


Business Mobility

If the past several years have brought a sea change in social culture, it is clearly in the use of mobile devices. What was once a phone for verbal communication has become a favored medium for email, Web access, music, podcasts, photos and information about weather, dining and travel. This has all happened in a short period of time and the trend will surely continue.

As case studies attest, mobile devices and tablet computers are highly useful business tools. The flexibility they offer to knowledge workers no longer tied to a physical location is now seen as a business necessity. Achieving real-time business mobility helps information flow like water through the path of least resistance. It reduces hunting and gathering on the part of the information consumer by delivering summarized information, supported by detail from BI systems.

The biggest business factor driving the need for mobility is the real-time nature of new business needs. Out-of-stock conditions, customer complaints and fraud are not optimally solved with reporting. In order to address these issues, information cannot be out of date. Fortunes can be gained and lost by suboptimal business decision timing. Data warehouses built with a personal computer usage layer create a distinct challenge to the optimal timing of decisions.

The preferred timing of intelligence gathering is during the immediate occurrence of a trigger event. Business is becoming a world where the players are always plugged in, and expect and need to make immediate decisions. Therefore, business units - and the systems they utilize for real-time decision-making - need high quality, high performing, corporately arbitrated information in real-time.

For competitive parity, it is imperative that the information management function not only responds to business needs, but also brings solutions to the table in real-time and ensures that they are visibly exposed for decision-makers.

Real-time information delivered to an abandoned desktop, or to a Web interface that requires multiple steps before the user can derive value, does not exploit the potential of available on-time information. Businesspeople need easy access to accurate information delivered to them at any time and place. The answer is mobile BI, utilizing the mobile device as the data access layer.

Getting the Most out of Business Analytics

             The expanding managerial movement to adopt analytics is being spurred by the needs for improved organizational performance and a sharpened competitive edge. Now that the benefits of applying analytics for insights and better decisions are being accepted, the next question is: How should an organization get the maximum yield and benefits from business analytics?

Carlson’s Law: Bottom-Up versus Top-Down Ideas

A trend with applying analytics is a demonstration of “Carlson’s Law,” posited by Curtis Carlson, the CEO of SRI International in Silicon Valley. It states that: “In a world where so many people now have access to education and cheap tools of innovation, innovation that happens from the bottom up tends to be chaotic but smart. Innovation that happens from the top down tends to be orderly but dumb.” As a result, says Carlson, the sweet spot for innovation today is “moving down,” closer to the people, not up, because all the people together are smarter than anyone alone, and all the people now have the tools to invent and collaborate.

A generally accepted way to drive the adoption of analytics is with executive team sponsorship and formally establishing a competency center for analytics. Unfortunately, the conditions are not always right for these. Executives are often distracted with fire-fighting or office politics. And creating a competency center requires foresight and willpower from executives, which can be a limiting factor.


I am a believer in Carlson’s Law because I have observed it in the adoption of enterprise performance management methodologies. These methodologies include strategy maps, balanced scorecards, customer profitability analysis, risk management, and driver-based rolling financial budgets and forecasts. Passionate middle-management champions drive change involving analytics more often compared to executives. Why? Middle managers ask themselves, “How long do we want to perpetuate gaining understanding and making decisions the way we do now – with little or no analytical insight or hypothesis testing?”
Pursuing Unachievable Accomplishments

Leadership does not only exist at the top of the organizational chart. Leadership can be present in individuals below the C-suite positions. This is possible because a key dimension of leadership is the art of getting a group of people to accomplish something that each individual could not do alone. Leadership does not require formal authority and command-and-control behavior.

There are hundreds or maybe even thousands of books and articles about leadership, yet some highly respected people believe there is a shortage of leadership. For example, in “Where Have All the Leaders Gone?” former Chrysler CEO Lee Iacocca describes with anger the sad state of leadership in the U.S. today.

My simple model of leadership has three components:

Care: Followers believe that leaders care about them and their organization.
Trust and hope: Followers believe that supporting a leader will improve things.
Mission: Followers want leaders to answer the question “Where do we want to go?” so that they can help answer “How will we get there?”

Executive leaders must communicate the third component; however, middle-manager champions can exhibit the first two.
Analysts Can be Leaders

Experienced analysts realize that applying analytics is not like searching for a diamond in a coal mine or flogging data until it confesses the truth. Instead, they first speculate that two or more things are related, or that some underlying behavior is driving a pattern to be seen in various data. They apply business analytics more to confirm a hypothesis than to randomly explore. This requires easy and flexible access to data, the ability to manipulate the data, and software to support the process. This is a form of leadership.

Leaders require moral, not physical, courage. An example of physical courage is rescuing someone from a fire. That is noble, but it is not leadership. Moral courage is almost the opposite of a rescue. It means doing something not immediately valued, and potentially perceived as sticking your nose in others’ business, that is ultimately seen as a helpful contribution to organizational performance improvement.

There are hundreds of examples of applying analytics. One is to identify the most attractive types of customers to retain, grow, win back or acquire. Others involve risk management, warranty claim analysis, credit scoring, demand forecasting, clinical drug trials, insurance claims analysis, distribution route optimization, fraud detection, and retail markdown and assortment planning. The list is endless.

The investigation and discovery of what will align an organization’s actions with the executive team’s strategy for execution will not come from a CEO with a bullhorn or a whip. Better insights and their resulting decisions will come from analytical competency. Analysts can demonstrate leadership with the passion and desire to solve problems and discern answers – the power to know.

Prioritizing Your IT Concerns

              Results of a recent survey reveal six areas of priority for CIOs and their organizations: information security and privacy, virtualization and cloud computing, social media integration, data classification and management, regulatory compliance, and vendor management.


Protiviti, a global consulting firm, identified top areas of concern for IT leaders — 7 percent of respondents represented the insurance industry — based on competencies they cited as most in need of improvement in its “Information Technology Capabilities and Needs Survey.”

More than 200 IT professionals — including CIOs, CTOs, chief security officers and IT VPs, directors and managers — were asked to assess their skills and professional development priorities through questions covering three major categories: technical knowledge, process capabilities and organizational capabilities. After analyzing the responses, Protiviti concluded that virtualization and social media integration clearly stand out as the top areas in need of improvement in terms of technical knowledge. And related competencies such as cloud computing and social media security are also top “Need to Improve” areas for IT departments.


Specific concerns identified in the report include:

Some firms have vague or out-of-date social media policies in place that are unenforceable if inappropriate activity occurs.
The volume and pace of regulatory change has been significant in recent years, and there are a number of regulatory issues that require IT involvement, including Dodd-Frank, Sarbanes-Oxley, Basel II, Solvency II and PCI-DSS. "IT must be an active part of compliance management, which typically involves developing, implementing or integrating tools and platforms to achieve active compliance and risk management," said Kurt Underwood, managing director and head of Protiviti's IT consulting practice.

For every law and regulatory requirement, the company must also ask: What portion of my data does this affect? How do I classify and manage this data in accordance with the law? It also is important to note that, as a byproduct of the proliferation of new and emerging technologies, there are rapidly growing volumes of data being generated daily. By ranking, managing and classifying this data as a top "Need to Improve" competency, respondents may be saying they and their organizations are having difficulty understanding the increasingly complex regulatory landscape and how to comply with various new laws.

With more and more organizations transitioning to virtualized solutions as well as applications and activities in the cloud, external service-level agreements (SLAs) with an array of third-party vendors and other providers are a key concern for IT executives. Similarly, determining a sound strategy and approach for outsourcing and offshoring is another critical area of focus, particularly given that many companies continue to seek innovative ways to save costs. However, many of these organizations lack clarity or direction about how to accomplish this effectively while continuing to deliver a high level of service and maintain compliance with company policies, applicable laws and regulations.

Because data breaches are costly and affect not just operations but also brand reputation, information security is another top priority for IT executives. Key questions for leaders include: How robust are our information security measures? Is our organization in compliance with industry standards for security and privacy as well as applicable laws and regulations, and do we have efficient systems and processes for tracking compliance?

Google’s mobile revenue? Depends how you do the math.

            Google wowed Wall Street with the revelation that its mobile business is generating revenue at a run rate of over $2.5 billion.

Not bad for a business that’s still in its infancy, and which was operating at a $1 billion run rate at this time last year.

Of course, a run rate is not the same as revenue that’s been booked – it’s simply a way of extrapolating what a full year’s worth of revenue will be, assuming the current rate of revenue holds steady.

So what is Google’s actual mobile revenue right now?

Many Wall Street analysts estimated on Friday that Google generated $625 million in mobile revenue in the recently-ended quarter – a not unreasonable assumption, given that four quarters’ worth of $625 million totals $2.5 billion. (And since Google said the run rate was more than $2.5 billion, perhaps $626 million for the quarter would be an even more reasonable estimate.)

Not so fast, says BGC Partners analyst Colin Gillis.

There’s no guarantee that Google based its run rate on a full quarter’s worth of revenue.

They could have taken mobile revenue from the last month and multiplied it by 12, said Gillis. They could even have used their best single day of mobile revenue and multiplied by 365, he noted.

As a result, Gillis estimates that Google’s mobile revenue in Q3 was probably closer to $500 million or $550 million.

“We have no idea what that number really is,” he said.
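Gillis’s point is easy to see with simple arithmetic. The monthly figures below are invented for illustration; they show how the choice of base period changes the annualized number:

```python
# Invented monthly mobile revenue for one quarter, in millions of dollars,
# with revenue growing month over month within the quarter.
monthly = [140.0, 170.0, 208.3]

quarterly = sum(monthly)                # ~518.3M: what was actually booked
run_rate_from_quarter = quarterly * 4   # ~2,073M: annualizing the full quarter
run_rate_from_month = monthly[-1] * 12  # ~2,500M: annualizing only the last month

print(f"Quarter revenue:        ${quarterly:,.1f}M")
print(f"Run rate (quarter x 4): ${run_rate_from_quarter:,.1f}M")
print(f"Run rate (month x 12):  ${run_rate_from_month:,.1f}M")
```

With these hypothetical numbers, a company could truthfully claim a run rate "over $2.5 billion" by multiplying its last month by 12, even though the quarter itself booked only about $518 million – right in the range of Gillis’s $500-550 million estimate.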

The application delivery trifecta: Agile, composite and cloud

          Modern delivery initiatives are changing the creation of enterprise apps. But without proper orchestration, these initiatives can pull the enterprise in opposing directions. 

Three delivery trends are reorienting the way enterprise IT management leaders bring applications to market: Agile development, composite applications and cloud computing. These trends share a single, primary aim, and it isn’t just cost reduction. The chief objective of these initiatives is better, faster outcomes. All three seek to strip the latencies from traditional delivery and provide results that are more aligned with, and more responsive to, the business.
If your enterprise is like most, you’re probably pursuing all three. But are you doing so in concert? Pursued together, these trends present certain harmonies that significantly magnify their benefits.
  • Composite and cloud development prize the durability of services, ensuring that components perform well while remaining secure and resilient. Because both cloud and composite facilitate the exposure of services and subcomponents for use by other applications, their shared priority is for components to remain trustworthy in a variety of different contexts.
  • Agile and composite align in their aim for modularity. They resist large and change-resistant monoliths—whether in terms of project plans or application architectures—in favor of discrete, bounded units that can be built, tested and delivered to production quickly.
  • Agile and cloud overlap in the aim of responsiveness to change. Agile projects are designed to anticipate rather than resist change and to be able to pivot accordingly. The “always on” aspect of the cloud can facilitate this aim by reducing or eliminating the time to provision application environments, a key source of latency and inter-departmental squabbles.
But problems arise when these initiatives are pursued in silos, as independent strategies. This tendency means that instead of harmony, the risks in each trend begin to amplify the risks of the others. As one example, Agile software development’s focus on velocity can come to antagonize the thoughtful architectural planning that good composite applications require. In pursuit of speed, developers may code in an ad hoc manner ("cowboy coding"), which leads to a proliferation of redundant or poor-quality composite services.

The magic of the trifecta 

 

World-class delivery organizations harness the collective promise of Agile, composite and cloud development by seeing each in the context of the other. These organizations maximize the shared opportunities and minimize what could otherwise become hostilities among the initiatives. In this way, each initiative amplifies the value of the others.
Consider some examples of how these harmonies might be achieved:
  • Harmony between composite and cloud: In this case, enterprise technical policy lays out standards for reusable services so that any new service is suitable for either internal use or the public cloud. In this way, standards for security and resilience would be included alongside performance—achieving the durability that both cloud and composite applications require of their services to ensure service trust and adoption.
  • Harmony between Agile and composite: At the same time, the organization might define certain principles to guide how coarse- or fine-grained a composite service should be, ensuring that developers aren’t unnecessarily bogged down in architectural debates or tempted by shortcuts. In this way, Agile software development teams can produce composite services that meet the requirements for enterprise reuse, without a return to endless planning sessions.
  • Harmony between Agile and cloud: Given the interdependencies in building composite applications, where key services or systems are frequently either under development or available only at certain times, the organization might seek to virtualize these dependencies—by finding alternate services already available in the cloud, or by investing in service virtualization software that mimics the behaviors of dependent application services.
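In its simplest form, the service virtualization mentioned in the last point amounts to a hand-written stub that mimics the interface of a dependent service. The pricing service, its response shape, and the SKUs below are all hypothetical:

```python
# Hypothetical sketch of service virtualization: a stub that mimics the
# interface of a dependent pricing service that is still under development,
# so composite-application teams can keep building and testing against it.

class PricingServiceStub:
    """Stands in for the real pricing service; returns canned responses."""

    def __init__(self, canned_prices: dict):
        self.canned_prices = canned_prices

    def get_price(self, sku: str) -> dict:
        # Mimic the real service's response shape, including its error case,
        # so callers exercise the same code paths they would in production.
        if sku not in self.canned_prices:
            return {"status": "NOT_FOUND", "sku": sku}
        return {"status": "OK", "sku": sku, "price": self.canned_prices[sku]}

# A composite application is wired to the stub until the real service
# becomes available, then switched over via configuration.
stub = PricingServiceStub({"SKU-100": 19.99})
assert stub.get_price("SKU-100")["price"] == 19.99
assert stub.get_price("SKU-999")["status"] == "NOT_FOUND"
```

Commercial service virtualization tools add record-and-playback, latency simulation and stateful behavior, but the design goal is the same: remove the dependency as a source of delivery latency.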

Putting it all together

There’s little question that these three trends have improved the way we build and deliver apps. But the real value is in using all three together as part of the same strategic aim: to deliver better applications faster.

Innovation or maintenance: The right choice to save your business

              Innovation and simplification have become the mantra of high-performing enterprises, yet maintenance and administration costs consume the bulk of most organizations’ IT budgets, hampering innovation and increasing risk. Here’s how IT leaders can get out of the maintenance-spending trap.

On average, a typical IT organization’s spending on maintenance and administration is likely to consume at least 70 percent of its annual budget, but higher percentages are increasingly common. While the percentage itself is not that important, the result is. According to a recent survey, only 34 percent of global CIOs think they have achieved anything close to their innovation potential. The root cause? It’s likely that, with flat or reduced budgets, IT leaders’ innovation projects have been forced to make do with the scraps left over after they’re done keeping the lights on.

If this sounds familiar, you’re not alone. In January 2011, InformationWeek’s Global CIO Top 10 CIO Issues for 2011 noted failure to address the 80/20 spending trap as No. 2 on CIOs’ priority list.
[Figure: Total IT Budget as Percentage of Revenue]



Fortunately, some companies have proven that it is possible to drive better innovation performance and “flip the mix” between maintenance and innovation. In 2006, the HP IT team sent shockwaves through the industry by announcing that HP would embark on an ambitious project to transform IT with the goal of delivering improved innovation while delivering increased quality and dramatically lower cost as a percentage of revenue.

HP has applied the experience gained as part of its own transformation to help others achieve similar performance, with the Flemish government and Italian Ministry of Education among the ranks of HP customers that have significantly reduced their maintenance costs while delivering game-changing innovation.

“Doing nothing” is not an option

HP realized that maintenance costs were growing but, to maintain competitiveness, IT’s budget could not. Without an IT transformation to address the maintenance imbalance, innovation would grind to a halt.

The drive to innovate is not limited to technology leaders like HP; it’s an imperative for any enterprise that needs to perform better, one that can translate to the top and bottom line. For example, if your only competitors are organizations with IT maintenance costs of 80 percent to 90 percent, then shifting your own ratio even a little can create significant competitive advantage. However, you should also consider that startups or companies in emerging markets often jump immediately to modern IT solutions such as converged infrastructure, cloud and SaaS, bypassing ownership and systems-building altogether. What happens as these emerging players, with near-zero maintenance costs, become your competitors?

And it’s not just the private sector that’s under pressure to rethink IT delivery. The U.S. federal government also recognizes the imperative to cut maintenance costs. In a recent interview with Fortune magazine, departing U.S. CIO Vivek Kundra remarked that “you would get laughed out of the room” if you went to your board and asked for millions of dollars to build out systems like email, finance and a data center for web site hosting. And yet, he said, that’s essentially what the U.S. government, like many companies, is doing.

Running the business of IT

Making true shifts in spending requires a new mindset in IT.

William Dupley, chief solutions manager in HP Canada’s Office of the CTO, likens this moment in IT to the state of the auto industry in the 1980s, when Japanese management techniques shook up American manufacturing. Out of that painful episode came an improved approach to quality management that IT leaders would do well to adopt.

“The three key strategies that need to be applied to IT are Six Sigma, Lean and Theory of Constraints,” Dupley says. These eliminate defects or flaws, non-value-add work and bottlenecks, respectively.

HP accomplished its reduction in maintenance costs by building an integrated system management architecture and data warehouse. HP IT analyzed labor and eliminated root causes (many of them due to old systems), just as if IT were a manufacturing process. Over three years, HP IT eliminated 75 percent of its application portfolio and standardized much of the rest. The team modernized data centers, infrastructure and system management technologies, and made extensive use of automation. The result: HP was able to bring its IT spending from about 4 percent of revenue down to less than 2 percent. By 2008, 70 percent of HP IT employees’ time was being spent on new development, with just 30 percent going to IT support.

Visibility first

Effectively leading such far-reaching changes requires visibility across all IT resources and assets. Without an understanding of how all the interconnected parts work on one another, it’s likely that you’ll end up pulling levers without knowing what the outcome will be, jeopardizing not just IT’s transformation, but the day-to-day operations of your enterprise.

The key to a successful change-acceleration program is understanding where you are currently—your maturity level, the business challenges ahead—and then using that holistic understanding to move toward your enterprise’s goals. (Start by taking HP’s CIO assessment.)

“The real issue,” says Piet Loubser, senior director responsible for HP’s IT Performance Suite, “is the ability of CIOs to understand all the influences and drivers on their entire organization.”

Instead of pursuing an arbitrary benchmark, Loubser counsels IT leaders to focus on understanding the interconnected nature of IT and the business they’re in. “The problem with chasing ‘best-in-class’ is that it may not be the right number for your company at this point in time,” Loubser says. “And without the tools to measure and test it, you don’t know if it’s working.”

The important thing is to understand where value is derived. Says Dupley, “If you’ve got old applications that are critical to your business, that took years to create and are difficult to duplicate, you may have no choice but to have higher maintenance costs.”
Finding your best-in-class spending mix

Visibility into the ramifications of change lets you start adjusting your spending mix. “Best-in-class” typically varies by industry, but HP’s research has found that average maintenance and administration spending is roughly twice that of best-in-class. So how do you move your spending closer to best-in-class?
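The leverage in moving toward best-in-class is easy to see with rough numbers. The budget and percentages below are illustrative only, anchored to the two figures in the text: a 70 percent average maintenance share, and the finding that average is roughly twice best-in-class:

```python
# Illustrative arithmetic: with a flat budget, shifting maintenance spend
# from the 'average' level toward best-in-class multiplies innovation funds.
budget = 100.0           # total IT budget, in arbitrary units
avg_maintenance = 0.70   # average maintenance/administration share (per the text)
best_maintenance = 0.35  # roughly half, per the "twice best-in-class" finding

innovation_now = budget * (1 - avg_maintenance)    # 30 units for innovation
innovation_best = budget * (1 - best_maintenance)  # 65 units for innovation

print(f"Innovation funds now:  {innovation_now:.0f}")
print(f"Innovation funds best: {innovation_best:.0f}")
print(f"Multiplier: {innovation_best / innovation_now:.2f}x")
```

With the budget held flat, halving the maintenance share more than doubles the funds available for innovation – which is why the percentage shift, not the absolute budget, is the lever that matters.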

Start measuring: Remembering that “you can’t manage what you can’t measure,” look for line of sight into everything that affects your business. Start looking at KPIs within four broad buckets: business value created by IT, customer satisfaction, operational excellence and future orientation. HP created the HP Executive Scorecard and Financial Planning and Analysis solutions to give IT leaders an integrated perspective on the overall performance and efficiency of their organizations.

Set targets: Armed with fact-based actuals and benchmarks, identify bottlenecks or poor performing areas and set targets to drive improvements. Says Loubser, “HP is able to control maintenance spend because we’re continually monitoring our own performance against those of our peers and competitors.”

Compare to where you want to be: Work with strategic advisors to assess relevant best-in-class metrics for your market conditions and the steps you’ll take to get there. Discuss your efforts with a community of peers.

You get what you inspect, not what you expect

Once you have a holistic understanding of all the drivers within your organization, you can use scorecard metrics to effect change.

The most critical aspect, Dupley says, is that “IT must move to a model using metrics to influence the future.” He adds, “You want to build a metrics system to create behavior, not just to report.”

With reporting systems such as an executive scorecard in place, IT leaders can combine lagging metrics (the past) with leading metrics (the future) to build insight that helps them make informed decisions sooner. Dupley counsels senior execs (VP and above) to use leading metrics to manage a year out.

Unleashing innovation

Combine a quality management mindset with visibility and measurement tools and you unleash tremendous potential for innovation within your organization.

For instance, HP customer Delta reduced applications testing times by 52 percent and allowed its testers to devote 90 percent of their time to business innovation instead of verifying legacy applications. Seagate found it more cost-effective to transition its internal email system to a third-party cloud, protecting outsourcing ROI as well as enforcing cloud service SLAs by using an application performance management system from HP. Another of HP’s customers, a Canadian university, transitioned 32 legacy email systems to the cloud, freeing up related staff.

By significantly reducing maintenance and administration costs, these and other IT organizations make IT transformation more than just a technology initiative—it’s a business strategy.