CSA Explains… XML – Presentation Synopsis


Tim J. Smith, PhD
Founder and CEO, Wiglaf Pricing

Published August 22, 2002

XML has been lauded as the language that will enable networked computing to become a full reality. It is the extensible markup language of the Web and of machine-to-machine communication, embedding meta-data that describes the informational content of its own message package. Any language that is self-describing and therefore platform-independent, that creates a platform for low-cost business-to-business communication, and that is backed by the largest names in software should have our attention.

Carl Franklin of Triton-Tek, Mike Bushman of Kanbay, and Jonathan King of Epigraph provided a roadmap for technologists and entrepreneurs to understand XML and the potential it unlocks on Friday, August 9th, at the CSA Explains … XML event hosted by Kanbay in Rosemont.

Mr. Franklin initiated the presentation with a technical review of XML. A brief description of XML cannot be given without a discussion of meta-data. Meta-data is data about data: it describes the data contained in a message sent from one application to another. For instance, given “123 Main Street” as the data in a message, the meta-data indicates whether this is a ship-to address, a bill-to address, or some other piece of information. Without meta-data, the informational value of a message is often difficult to ascertain.
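As a small illustration (not drawn from the presentation itself, and with element names invented for the example), the address above might be marked up so that the surrounding tags supply the meta-data:

    <!-- illustrative only: tag names describe what each value means -->
    <order>
      <shipTo>
        <street>123 Main Street</street>
      </shipTo>
      <billTo>
        <street>77 Oak Avenue</street>
      </billTo>
    </order>

A receiving application can read the tag names to determine what each value represents, rather than relying on field position or a separately agreed record layout.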

Mr. Franklin highlighted two of the downsides of XML. First, XML is bulky. It is a verbose language. It requires fat pipes. To communicate XML messages between applications, the data throughput mechanism needs to be able to handle high-speed data transmissions. For many business applications, the fat-pipe requirement does not represent a difficulty. However, for business-to-consumer applications, this fat-pipe requirement can make XML an unacceptable solution. Second, software application-to-application communication, such as back-office legacy integration, may require other tools to be used in conjunction with XML. WebLogic, MQSeries, WebSphere, or other third-party message handlers are often required to manage message queuing and other issues in creating robust solutions and ensuring that no data is dropped in the communication.
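To picture the verbosity concern (the record and tags below are illustrative, not from the presentation), compare a flat delimited record with its XML equivalent:

    123 Main Street,Chicago,IL,60601

    <shipTo>
      <street>123 Main Street</street>
      <city>Chicago</city>
      <state>IL</state>
      <zip>60601</zip>
    </shipTo>

The same meta-data that makes the XML message self-describing also multiplies its size, which is why high-volume or low-bandwidth applications may find the format too heavy.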

Mr. Bushman followed Mr. Franklin with a case study of a business-to-business application Kanbay created for Household International, a major financial company in the Chicago area. The difficulty the client faced was the need to integrate with new merchant customers two to three times per month. Household International wanted an XML solution that would give clients using Internet or intranet applications a level of control over their environments, including the look and feel of their client service applications and the ability to manage customer service without maintaining a direct customer connection. Kanbay created a solution to these needs within four months for approximately $300,000, using domain objects to host the message information. The result is estimated to save Household International $1.75 million per year.

Given such strong business ROI, one would expect rapid adoption of XML technology. However, adoption has been slow. Addressing this issue, Jonathan King of Epigraph discussed some of the trends and visionary possibilities of XML. He noted that XML, like other technologies, has been over-estimated in the short run but is likely to be under-estimated in the long run. He described a world where machine-to-machine communication enables the vision of Minority Report, wherein Tom Cruise walks through the city and every billboard creates a special message just for him. Yet Mr. King also noted that the industry faces a significant barrier to achieving this vision: before XML can be fully deployed, businesses must adopt meta-data standards to reduce the risk of developing an XML application that becomes an isolated island in the sea of communication. Unfortunately, the discussion of standards wars and the returns captured by companies that control standards creation was left for another discussion.
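The standards problem is easy to picture with a small illustrative example (element names invented here, not taken from any actual standard): two trading partners can both send well-formed XML and still fail to understand each other if they describe the same fact differently.

    <!-- Partner A -->
    <shipTo>
      <street>123 Main Street</street>
    </shipTo>

    <!-- Partner B -->
    <DeliveryAddress>
      <AddressLine1>123 Main Street</AddressLine1>
    </DeliveryAddress>

Until both sides agree on a shared vocabulary, or map between vocabularies, each XML application risks becoming exactly the isolated island Mr. King warned about.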

The May Report, TECH BUSINESS BRIEFS, Aug. 22, 2002

CSA Source Code, Sept. 5, 2002


About The Author

Tim J. Smith, PhD, is the founder and CEO of Wiglaf Pricing, an Adjunct Professor of Marketing and Economics at DePaul University, and the author of Pricing Done Right (Wiley 2016) and Pricing Strategy (Cengage 2012). At Wiglaf Pricing, Tim leads client engagements. Smith’s popular business book, Pricing Done Right: The Pricing Framework Proven Successful by the World’s Most Profitable Companies, was noted by Dennis Stone, CEO of Overhead Door Corp, as "Essential reading… While many books cover the concepts of pricing, Pricing Done Right goes the additional step of applying the concepts in the real world." Tim’s textbook, Pricing Strategy: Setting Price Levels, Managing Price Discounts, & Establishing Price Structures, has been described by independent reviewers as “the most comprehensive pricing strategy book” on the market. As well as serving as the Academic Advisor to the Professional Pricing Society’s Certified Pricing Professional program, Tim is a member of the American Marketing Association and American Physical Society. He holds a BS in Physics and Chemistry from Southern Methodist University, a BA in Mathematics from Southern Methodist University, a PhD in Physical Chemistry from the University of Chicago, and an MBA with high honors in Strategy and Marketing from the University of Chicago GSB.