IT Trends and Comments for 2012

January EventSource Newsletter

By: Rich Ptak, Managing Partner, Ptak, Noel & Associates LLC

The beginning of a new year marks a time of reflection on the past and anticipation of the future. The result for analysts, pundits and authors is a near-irresistible urge to identify important trends in their areas of expertise (real or imagined). I am no exception, so here are my thoughts on what we'll see in the coming year in the application and evolution of Information Technology.

The past few years have been marked by a significant maturing in the understanding of the capabilities, demands and expectations of educated consumers applying IT in their business and personal lives. The evolution in capability, ease of use, and ubiquity of availability and access accelerated dramatically. This resulted from the combination of past trends, industry economics and general IT maturation driving IT's application into new areas while speeding and facilitating benefit realization.

These effects will continue into 2012 as a result of the following trends:

  1. Customers buy solutions, not technologies. IT solution providers, regardless of size or product form (hardware, software, services), have become more sensitive and responsive to the needs of their target markets. Business buyers want immediate solutions to their problems with minimal complexity in application. They do not want 'tool kits' or 75-percent-complete products. The best and most successful vendors recognize and respond to this demand for comprehensive solutions that meet their customers' expectations. The emergence of affordable, fully integrated, modular and comprehensive solutions that address identifiable business and operational problems out-of-the-box will continue, and the market will become more competitive as more intelligence and power are embedded in IT solutions. The Prism Microsystems EventTracker product family provides a good example of how vendors are creating solutions in this model. It is true that some solutions will stand and operate on their own. However, an increasingly complex and evolving environment requires that solutions be able to co-exist and interoperate with data, products and services from many sources.
  2. Private, public and hybrid Clouds continue to grow in number and application, spreading across all market segments. Service providers and vendors are in a race to make Clouds more accessible, secure and functional. Consumers of Cloud services will become even more selective and careful as they choose their providers/suppliers/partners. High on their list will be concerns about stability, security and interoperability. The issue of stability tips the preference toward private and hybrid solutions. (We have already seen very public and dramatic failures from big-vendor Cloud suppliers; there will be more.) However, a combination of improved architectures and customer interest in achieving very real Cloud/IaaS/PaaS/SaaS financial, operational and competitive benefits will maintain adoption rates. These forces also drive the following trend.
  3. Standards and reference architectures will become more important as Clouds (public, private, and hybrid) proliferate. As business and IT consumers pursue the potential benefits of Cloud/IaaS/PaaS/SaaS, etc., it is becoming increasingly obvious that the link between applications/services and the underlying infrastructure must be broken. The big advantage, as well as the fundamental challenge, is assuring easy portability and access to any and all Cloud services. But this must be done in a way that allows Cloud solution systems to interoperate and co-exist with traditional structures. A structure is needed that allows for the creation, publication, access, use and release of assets in all environments. Vendors must cooperate to create multi-vendor standards and architectures to meet these expectations. This is a natural evolution of the pursuit of standards and techniques that disconnect the implementation of a service from its operational underpinnings. The effort goes back to the earliest days of creating machine-independent languages (Cobol, Fortran, etc.) and all Open Systems and architectures (e.g. Unix). This new degree of structural independence is simply implemented at a higher level of abstraction. The Cloud Standards Customer Council acts as an advocacy group for end-users interested in accelerating successful Clouds. It is addressing the standards, security and interoperability issues surrounding the transition to a Cloud operating environment. One example of a service implementation architecture we see as particularly worthy of note is the OASIS-sponsored Topology and Orchestration Specification for Cloud Applications (TOSCA). (A minimal sketch of this decoupling idea appears after this list.)
  4. Use of sophisticated analytics as a business and competitive tool spreads far and wide. The application of analytics to data to solve tough business and operational problems will accelerate as vendors compete to make sophisticated analytics engines easier to access and use, more flexible in application, and their results easier to understand and act on. IT has provided the rest of the enterprise with mountains of data; the challenge has been extracting useful information and insight from it. Operations Research, simulation and analytics have been around and in use for decades (even centuries), yet their use has been limited to very large companies. Today's more powerful computers, the ability to collect and process big streams of live data, and concentrated efforts by vendors to wrap accessible user interfaces around the analytics will put these tools in the hands of a wider audience. The power of IT servers shields the user from the underlying complexities, and will do so even more over time.
  5. Increasingly integrated, intelligent, real-time end-to-end management solutions enable high-end, high-value services. Think of Cisco Prime™ Collaboration Manager, which provides proactive monitoring and corrective action based on the potential impact on the end-user. Predictive analysis is applied down to the event level (data logs provide significant insight), with analytics used to identify problem correlation and/or causation. The primary goal is prediction in order to avoid problems. Identifying correlated events can be as effective as, or even more effective than, recognizing cause in providing an early warning. The fact is that while knowledge of causation is necessary for repair, both correlation and causation work for predictive problem avoidance. (A simple illustration of correlation-based early warning appears after this list.)
  6. APM (Application Performance Management) converges on BPM (Business Process Management). The definition of APM is expanding to include end-user-to-infrastructure performance optimization as a prime motivator for corrective action. Business managers care about infrastructure performance only to the extent it negatively impacts the service experience; they want high-quality services, guaranteed. BPM focuses on getting processes right so things are done correctly and efficiently. IT cares about infrastructure, so traditionally this is where APM has focused. The emphasis will continue shifting toward the consumer, blurring the lines between APM and BPM. BMC provides one example of the impact by adding analytic and functional capabilities to Application Performance Management to speed root-cause as well as impact analysis. Enhanced real-time predictive analytics are specifically used to improve the user's interactive experience by more quickly alerting IT staff to infrastructure behaviors that can disrupt service delivery.
  7. The impact of the consumerization of IT will continue to become more significant. Consumers of services are increasingly intolerant of making any concessions to the idiosyncrasies of their access devices (iPad, iPod, smartphone, etc.). They expect a consistent experience regardless of what is used to access data. Such expectations increase the pressure for service, software and platform standards, as well as drive the evolution of device capabilities and design. Previous efforts generally focused on the 'mechanics' and 'ergonomics' of the interface. Today the focus is increasingly on consistency of access, 'look-and-feel' and performance. One example is the growing interest in, and ability to deliver, what AppSense is calling 'user-centric IT', where the user has consistent access to all of their desktop resources wherever they are and on whatever device or platform they use. Technology will increasingly and automatically detect, adapt to and serve the user. This goes beyond the existing concept of 'application-aware' devices to one that associates and binds the user to a consistent, cross-platform experience.
  8. Virtualization acts as a 'gateway' step to the Cloud and to a fully service-based infrastructure. Virtualization will continue to be subsumed by the Cloud. It is now recognized as an enabling technology and a necessary building block for Cloud implementations. It is the first step toward achieving a truly adaptive infrastructure that operates with the flexibility, reliability and robustness to respond to the evolving and changing needs of the business and of consumers of IT services. Storage, servers and networks have been virtualized; the focus is now shifting to providing applications and services as fully virtualized resources. The increasingly complex nature of ever more sophisticated services acts to accelerate and reinforce this trend.
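
To make the decoupling idea in trend 3 concrete, here is a minimal sketch in Python of a provider-neutral interface. It is not TOSCA itself (TOSCA defines portable service topology descriptions), and every class, method and field name here is hypothetical; the point is only that an application written against the abstraction can move between infrastructures without change.

```python
from abc import ABC, abstractmethod


class CloudProvider(ABC):
    """Provider-neutral interface. The application depends only on this
    abstraction, never on any one vendor's API (all names here are
    illustrative, not taken from any real product)."""

    @abstractmethod
    def provision(self, template: dict) -> str:
        """Create the resources a topology template describes; return an ID."""

    @abstractmethod
    def release(self, deployment_id: str) -> None:
        """Tear the deployment back down."""


class PrivateCloud(CloudProvider):
    """Stand-in for an in-house environment. A public-cloud adapter would
    implement the same two methods against its vendor's API, so the
    application code below would not change."""

    def __init__(self):
        self._deployments = {}

    def provision(self, template):
        deployment_id = "dep-%d" % (len(self._deployments) + 1)
        self._deployments[deployment_id] = template
        print("provisioned %s as %s" % (template["name"], deployment_id))
        return deployment_id

    def release(self, deployment_id):
        self._deployments.pop(deployment_id, None)
        print("released %s" % deployment_id)


if __name__ == "__main__":
    # The same portable template could be handed to any conforming provider.
    template = {"name": "web-tier", "nodes": 2}
    provider = PrivateCloud()
    deployment = provider.provision(template)
    provider.release(deployment)
```

A TOSCA-style topology document plays the role of the `template` dictionary in this sketch: a portable description of the service that any conforming environment, public or private, can instantiate and later release.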
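
Trends 4 and 5 both turn on mining event data for predictive signals. The sketch below, with an invented log format, event names and time window, shows the core idea: count how often pairs of event types co-occur within a short window and score them with a simple 'lift' ratio. A high-scoring pair can serve as an early-warning rule long before the causal chain is understood.

```python
from collections import Counter

# Hypothetical event log: (timestamp in seconds, event type) pairs.
EVENTS = [
    (0,    "disk_latency_high"),
    (30,   "db_timeout"),
    (600,  "disk_latency_high"),
    (640,  "db_timeout"),
    (1200, "cpu_spike"),
    (1800, "disk_latency_high"),
    (1825, "db_timeout"),
]

WINDOW = 120  # seconds; events closer than this count as co-occurring


def correlated_pairs(events, window=WINDOW):
    """Score each pair of event types by how often they co-occur.

    The score is a simple 'lift': co-occurrence count relative to the
    individual frequencies, so frequent-but-unrelated events score low.
    """
    singles = Counter(etype for _, etype in events)
    pairs = Counter()
    ordered = sorted(events)
    for i, (t1, e1) in enumerate(ordered):
        for t2, e2 in ordered[i + 1:]:
            if t2 - t1 > window:
                break  # the log is time-sorted, so nothing later is closer
            if e1 != e2:
                pairs[tuple(sorted((e1, e2)))] += 1
    total = len(events)
    return {pair: count * total / (singles[pair[0]] * singles[pair[1]])
            for pair, count in pairs.items()}


if __name__ == "__main__":
    for (a, b), lift in sorted(correlated_pairs(EVENTS).items(),
                               key=lambda kv: -kv[1]):
        print("%s <-> %s: lift %.2f" % (a, b, lift))
```

In practice a monitoring product would feed this kind of calculation with live log streams and far richer statistics, but the correlation-before-causation principle is the same: the pairing is actionable for early warning even before the root cause is known.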

There you have it: eight trends and influences IT will have to deal with in 2012. I expect to be commenting more on these efforts this year. Your comments, questions and discussion around any of these are welcome. I can be reached at