The Internet spins a vast web of information across the globe. Data and information flow freely, available to anyone for learning, understanding and analysis. Organizations can cooperate across departments, regions and countries. ERP II and ECM herald the era of intra- and inter-business collaboration. Sounds wonderful, so what is the problem? The problem is as old as mainframe vs. PC and Windows vs. Macintosh. Data can move freely, but they are not standardized. Data streams carry no universal meaning; consequently, data cannot be understood by every system, analyzed easily, translated across different languages or read by humans. Specialized hardware and software are needed to decode the data, and if the required tools are not available, you are out of luck.

This problem is not confined to the Internet. A great deal of money (by one estimate, almost 20% of the U.S. gross national product) is spent on generating new information, and more than 90% of that information resides in documents, not in databases. Businesses in the U.S. produce approximately 100 billion documents per year, stored in a variety of formats across a range of computer systems. These disparate storage formats cause severe problems in accessing, searching and distributing the information.

Any solution (a combination of information technology products and services) that manages information across diverse software and hardware platforms must address a few key requirements. First, the solution should be transparent to users; technical details should be handled behind the scenes, not by users. Second, users should be able to save data and information in the format they choose, for example, databases, text files or proprietary formats. Third, the solution must retrieve data and information intelligently, which means it must understand the meaning of the information itself. Finally, such a solution should be capable of producing output in the desired form: print, screen, Web or CD/DVD.
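The last three requirements hinge on data that describe themselves. As a minimal sketch, the snippet below uses Python's standard-library XML parser to read a self-describing record and render it for two different outputs; the `<invoice>` element names are hypothetical, chosen for illustration, and stand in for whatever vocabulary a real solution would define.

```python
import xml.etree.ElementTree as ET

# A self-describing record: any standard parser can read it, with no
# proprietary decoder required.
record = """
<invoice>
  <customer>Acme Corp</customer>
  <amount currency="USD">1250.00</amount>
</invoice>
"""

root = ET.fromstring(record)

# Because element names carry meaning, retrieval is by concept
# ("customer", "amount"), not by byte offset in a proprietary file.
customer = root.findtext("customer")
amount = root.find("amount")

# The same data can target different outputs: plain text for the screen,
# HTML for the Web.
as_text = f"{customer}: {amount.text} {amount.get('currency')}"
as_html = f"<p><b>{customer}</b>: {amount.text} {amount.get('currency')}</p>"

print(as_text)   # Acme Corp: 1250.00 USD
print(as_html)
```

The point is not this particular vocabulary but the pattern: once meaning travels with the data, intelligent retrieval and multi-format output become transformations of one source rather than separate decoding problems.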