The short answer is, “yes,” but I believe it is conditional on certain changes that have to happen first. Let me focus on what prevents mashups from evolving into the enterprise space right now. According to Gartner, corporate developers spend approximately 65 percent of their effort building bridges between applications. Think about the information stored in a typical enterprise and how it interacts with information on the Web. The problem each enterprise faces is how to let customers, partners, vendors, and everyone else with an interest in a company use the company’s data and services within their own enterprise applications. The sweet spot is a reliable, easy way for information to flow both inside and outside the enterprise. If you look closely, there is a disconnect between the huge amount of infrastructure being built for the future of “cloud computing” and the ability of today’s companies to effectively link data between on-premise and on-demand applications.
But there are three problems with existing middleware products in the enterprise market:
a) The tools are expensive.
b) The tools are complex to install.
c) The tools are difficult to implement and maintain, and most require a serious work effort (architecture, configuration, custom coding, and testing) before delivering any value to the end user.
What kind of middleware does Enterprise 2.0 need in the area of information integration?
I wrote about mashup-building software tools in this post. Today these tools lack enterprise data integration capabilities, especially in the area of batch-type data integration, where large chunks of data (tens of megabytes, or even gigabytes) have to be moved in and out. What we see now is mostly a web scraping/RSS/lightweight XML approach to creating situational applications, or mashups. Enterprise 2.0 needs middleware that can access data both on a local network (such as MS SQL/MySQL/Oracle databases and files) and in on-demand applications and systems in “the cloud.” The tools should be much easier to implement and cost five to ten times less than the traditional information integration tools available to the enterprise market from companies like Informatica, TIBCO, etc.
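To make the “lightweight XML” style concrete, here is a minimal sketch of what today’s mashup tools do under the hood: pull an RSS feed and reshape its items for display. The feed content below is a hypothetical inline sample; a real mashup would fetch it over HTTP from a partner or vendor site.

```python
# A minimal sketch of the lightweight RSS/XML mashup approach.
# SAMPLE_RSS is an illustrative stand-in for a fetched feed.
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<rss version="2.0"><channel>
  <title>Partner News</title>
  <item><title>Q2 results posted</title><link>http://example.com/q2</link></item>
  <item><title>New API released</title><link>http://example.com/api</link></item>
</channel></rss>"""

def mashup_items(rss_text):
    """Extract (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

print(mashup_items(SAMPLE_RSS))
```

This works well for a handful of feed items, which is exactly the point: the approach does not scale to the gigabyte-range batch movements an enterprise needs.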
I separate all information integration into two distinct categories: batch and real-time. These categories have been powered by what we know as Extract-Transform-Load (ETL) and Enterprise Service Bus (ESB) respectively, where ESB inherited EII/EAI with the emergence of SOA. Designed for cloud computing, Middleware 2.0 across both of the above categories should come in as many as three packages:
b) Built into mashup-building tools, such as Dappit and Teqlo.
c) Built into emerging enterprise software available on-demand, such as DabbleDB, Blist, and Swivel.
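The batch (ETL) category above can be sketched in a few lines: Extract rows from a source system, Transform them, and Load them into a target store in bulk. The schema and table names (`src_orders`, `dw_orders`) are illustrative assumptions, and an in-memory SQLite database stands in for the enterprise databases the post mentions.

```python
# A minimal batch ETL sketch: extract, transform, and bulk-load rows.
# Table names and schema are hypothetical; SQLite stands in for an
# enterprise database such as MySQL or Oracle.
import sqlite3

def run_etl(conn):
    cur = conn.cursor()
    # Extract: pull raw order rows from the source system.
    rows = cur.execute("SELECT id, amount_cents FROM src_orders").fetchall()
    # Transform: convert cents to dollars.
    cleaned = [(oid, cents / 100.0) for oid, cents in rows]
    # Load: bulk-insert into the target reporting table.
    cur.executemany(
        "INSERT INTO dw_orders (id, amount_usd) VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount_cents INTEGER);
    CREATE TABLE dw_orders  (id INTEGER, amount_usd REAL);
    INSERT INTO src_orders VALUES (1, 1999), (2, 500);
""")
print(run_etl(conn))  # 2 rows moved
```

The real-time (ESB) category follows the same extract/transform/load logic, but event by event over a message bus rather than in scheduled bulk runs.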
During the next two to five years, Middleware 2.0 will evolve to quickly and inexpensively link “the cloud” with existing on-demand and on-premise applications and databases across the enterprise.
My next post will be about the long tail of information integration. I will look at what adoption of data integration tools looks like today, and how it will look in the near future.