
Data Warehousing, Electronic Commerce and Technological Learning: Successes and Failures from Government and Private Industry and Lessons Learned for 21st Century Electronic Government

By Elias G. Carayannis, PhD, MBA, BScEE
Program on the Management of Science, Technology and Innovation
Department of Management Science
School of Business and Public Management
The George Washington University, Washington, DC 20052
Email: caraye@gwis2.circ.gwu.edu


Current Key Issues of Information Technology (IT) in Government & Industry

The current key issues for utilizing information technology in implementing electronic government policies and practices are:

a. Systems Integration

b. Business Process Reengineering

c. Data Warehousing

d. Data Mining

e. Client-Server Architecture, Internet, Intranets, Extranets, and Electronic Commerce

f. Virtual / Knowledge Enterprise, Technological Learning and Organizational Intelligence

a. Systems Integration

Systems integration -- and by extension network management -- is a major problem for all agencies with systems at more than one site (Bob Deller, Principal Analyst, Global Systems and Strategies, Vienna, VA). In the process, government network administrators are being asked to maintain security and eliminate redundancy while simultaneously extending internetwork access to more power users.

For systems integrators, solving this problem represents a multibillion-dollar opportunity, especially for high-end projects, such as the Joint Computer-Aided Acquisition and Logistics Support program, which requires integration of multimedia applications (voice, data and graphics), as well as secure multivendor systems and interoperability between agencies and private sector contractors.

Systems integrators can help customers understand the software tools and hardware platforms available today to solve their problem. It is not enough to walk the customer through the process of sizing the problem by asking the right questions. You also need to recommend the right pieces. There is no one vendor today that solves the whole problem.

Service agencies with many offices throughout the country, such as the Internal Revenue Service and the Social Security Administration, are under excruciating pressure to do a better job of sharing data and maintaining up-to-date information on distributed databases on new client/server architectures.

In the Pentagon, efforts to harmonize disparate networks within the armed services and between purchasing units and trading partners are creating similar requirements.

A larger percentage of the federal government's systems integration budget, which will grow from $3.6 billion in 1996 to $4.6 billion in 2001, is thus expected to be allocated to network management.

Every agency appears to place some emphasis on modernization, which really means responding to the mandate from the National Performance Review to perform better at lower cost. That means developing common ways of processing, doing something once instead of 40 or 50 times, and hence integrating all of your networks: "We think the issue of network and systems management is increasing dramatically in importance" (Jack Haggerty, Operations Manager, Computer Sciences Corporation).

As government agencies move to client/server architectures, they are exposed to distributed environments. When they start dealing with distributed data and replication, and with keeping everything in sync and accurate, network management becomes key.

Larger professional services companies, including CSC, expect to get a significant amount of work over the next few years, building centralized network management systems that will allow government administrators to monitor highly heterogeneous and geographically dispersed networks from centralized locations.

In these post-Cold War days, even the growing federal information technology budget has its limits, and government agencies are finding that customized engineering solutions -- commonly implemented 10 years ago -- are too expensive today. As a result, commercial network management solutions are finding a new, vibrant market in the federal government: "The main problem with the government [in terms of systems integration and network management] is that there is no single engine driving the government train. Each agency appears to be pursuing its own strategy," says GSS' Deller.

You see this reflected in the IRS' tax modernization program today. If you pick up any recent GAO [General Accounting Office] report on the tax system modernization, it will tell you that the IRS has still not corrected management and technical weaknesses that revolve around systems integration. The Veterans Administration's Veterans Benefits Program is suffering as well.

b. Business Process Reengineering

Business Process Reengineering (BPR) is "the fundamental rethinking and radical redesign of business processes to achieve dramatic improvements in critical contemporary measures of performance, such as cost, quality, service, and speed" (M. Hammer and J. Champy, Reengineering the Corporation: A Manifesto for Business Revolution (New York: Harper Business, 1993), p. 32).

Reengineering has been around in many different forms of business transformation aimed at improving performance for a long time. The common underlying thread of these different instances is an emphasis on facilitating and enhancing intra- and inter-organizational learning: "... reengineering is not new. Organizations have done zero-based redesigning of certain operations for many years. Hammer and Champy say they observed reengineering already happening at companies like Ford Motor, Hallmark, and Taco Bell. Their contribution was to corral collective business experience on process redesign and give it a name" (Chemical Week, 24 November 1993).

Reengineering is a current management consulting fad. It has at its core, however, the classical concept of the radical transformation of a business (business reinvention).

This transformation takes place through learning focused on business processes, each one being "a collection of activities that takes one or more kinds of input and creates an output that is of value to the customer". Thus reengineering should focus on a small number of core processes, namely on the maximum leverage points ". . . to make the greatest impact and make the largest improvement". On the concept of leverage points, see P. Senge, The Fifth Discipline: The Art and Practice of the Learning Organization (New York: Doubleday, 1990).

c. Data Warehousing

A Data Warehouse is a subject-oriented corporate database which addresses the problem of having multiple data models implemented on multiple platforms and architectures in the enterprise. The Data Warehouse is differentiated from other corporate database systems in terms of six key features:

a) It is separate from the operational systems in the enterprise and populated by data in these systems

b) It is dedicated entirely to the task of making data available for interrogation by business users

c) It is integrated on the basis of a standard enterprise model

d) It is timestamped and associated with defined periods of time

e) It is subject-oriented, mostly on the basis of the "Customer"

f) It is accessible to users who have limited knowledge of computer systems or data structures (Kelly, 1994)

The goal of the Data Warehouse is to transcend the internally focused control culture by using internal and external data patterns to identify events which trigger strategic decisions.
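
To make these six features concrete, the following minimal sketch (using Python's built-in sqlite3 module, with hypothetical table names and data) shows a small warehouse table that is kept separate from the operational order-entry system, populated from it, organized around the customer as its subject, and timestamped by period so that business users can interrogate it directly.

# Minimal sketch (hypothetical schema and data): a warehouse table kept separate
# from the operational order-entry system, organized around the "Customer" subject
# and stamped with the period each row summarizes.
import sqlite3

operational = sqlite3.connect("orders_oltp.db")   # operational system (source)
warehouse = sqlite3.connect("warehouse.db")       # separate, read-oriented store

operational.execute("CREATE TABLE IF NOT EXISTS orders (customer_id, amount, order_date)")
operational.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 120.0, "1996-03-04"), (1, 80.0, "1996-03-20"), (2, 45.0, "1996-03-11")],
)

# Subject-oriented, timestamped summary: one row per customer per month.
warehouse.execute(
    "CREATE TABLE IF NOT EXISTS customer_monthly (customer_id, period, total_spend)"
)
rows = operational.execute(
    "SELECT customer_id, substr(order_date, 1, 7), SUM(amount) "
    "FROM orders GROUP BY customer_id, substr(order_date, 1, 7)"
).fetchall()
warehouse.executemany("INSERT INTO customer_monthly VALUES (?, ?, ?)", rows)
warehouse.commit()

# Business users interrogate the warehouse, never the operational system.
for row in warehouse.execute("SELECT * FROM customer_monthly ORDER BY customer_id"):
    print(row)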

Organizations are increasingly viewing their data as a valuable asset that can be used as a competitive weapon. Using data this way, that is, proactively, is a paradigm shift in the use of IT and specifically in the use of database management.

One path along which Abbey thinks the data warehouse and the development of data mining tools (used to uncover previously hidden or undiscovered data patterns or relationships in the warehouse) will evolve is temporary data marts, organized collections of data that are project specific. He gives the example of companies that put advertisements in the Sunday newspapers in a particular market for a specific product.

Companies, whether they are Sears or Procter & Gamble, want to figure out how that ad will impact sales. The data generated could seed a data mart for a specific purpose, but it would be a temporary data mart, just for one project and for one particular time and market: "Someone in the organization must ensure that power users play a part in the process. The user voice is critical," explains Abbey. "Since you're looking at a 24- to 72-month project, you also need to find subject specialists, especially IT professionals".
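
As a rough illustration of such a temporary data mart, the sketch below (plain Python, with hypothetical markets, dates, and sales figures) carves the rows relevant to one Sunday advertisement in one market out of the warehouse, compares sales before and after the ad ran, and can simply be discarded when the project ends.

# Minimal sketch (hypothetical data): carving a temporary, project-specific data
# mart out of warehouse rows to study one Sunday ad in one market.
from datetime import date

warehouse_sales = [
    {"market": "Chicago", "product": "detergent", "day": date(1996, 5, 5), "units": 310},
    {"market": "Chicago", "product": "detergent", "day": date(1996, 5, 12), "units": 495},
    {"market": "Dallas",  "product": "detergent", "day": date(1996, 5, 12), "units": 260},
]

AD_MARKET, AD_DATE = "Chicago", date(1996, 5, 12)   # the Sunday the ad ran (assumed)

# The "data mart": only the rows relevant to this one campaign and market.
ad_mart = [r for r in warehouse_sales
           if r["market"] == AD_MARKET and r["product"] == "detergent"]

before = sum(r["units"] for r in ad_mart if r["day"] < AD_DATE)
after = sum(r["units"] for r in ad_mart if r["day"] >= AD_DATE)
print(f"Units before ad: {before}, units from ad week on: {after}")
# Once the project ends, the mart is simply discarded.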

If the membership on the data warehousing team changes as the project moves ahead, management needs to make sure that team members toe the line, that they have corporate direction to stay on task and corporate commitment.

It is very important to clearly state the roles and responsibilities of the data warehousing team, especially in having infotech staff reorient themselves to meet user needs. An example is the high level of customer service that Hertz offers based on "knowledge, care and trust," phrased in the questions: "Do they know what they are talking about? Do they care about me? And do I trust them?" Hertz has collapsed a 40-plus minute transaction into a few minutes. They have taken the information they have learned about each individual as a customer and customized their services to meet that individual's needs. Many customers are willing to pay a premium for this level of service.

d. Data Mining

Beyond the standard steps in building a data warehouse (forming a team; finding, gathering, cleaning, and transforming the data; organizing it; and automating the process of moving the data to areas for quick access) lies the task of uncovering previously overlooked information nuggets. This is the role of data mining tools. It is an area that most experts agree is currently up for grabs, and it represents the next major development for the data warehousing industry.

Most products have come to market too early, and it is really unclear whether they do much good. For the most part, the tools look for pre-defined relationships, for example, between a person's age and the products they buy. In such cases, they offer nothing special that you could not get from statistical analysis.

The critical issue is to look at the database as a whole and find different, if not surprising, relationships. Among the fruitful areas identified are consumer marketing using scanner data and the financial industry, which analyzes stock market data, especially in trying to uncover time patterns and sequences.
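
A minimal sketch of this idea, using hypothetical scanner (market-basket) data, is shown below: instead of testing a single pre-defined relationship, it scans every pair of items and ranks the pairs by "lift", the ratio of observed co-occurrence to what independence would predict, so that unexpected associations surface on their own.

# Minimal sketch (hypothetical scanner data): rather than testing one pre-defined
# relationship, scan all item pairs and rank them by "lift" to surface
# co-purchase patterns nobody thought to ask about.
from itertools import combinations
from collections import Counter

baskets = [
    {"diapers", "beer", "chips"},
    {"diapers", "beer"},
    {"bread", "milk"},
    {"diapers", "milk", "beer"},
    {"bread", "chips"},
]

n = len(baskets)
item_counts = Counter(item for b in baskets for item in b)
pair_counts = Counter(pair for b in baskets for pair in combinations(sorted(b), 2))

def lift(a, b):
    # How much more often a and b co-occur than independence would predict.
    joint = pair_counts[(a, b)] / n
    return joint / ((item_counts[a] / n) * (item_counts[b] / n))

ranked = sorted(pair_counts, key=lambda p: lift(*p), reverse=True)
for a, b in ranked[:3]:
    print(f"{a} + {b}: lift {lift(a, b):.2f}")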

We need to allow the data to retain its imprecision. Not everything can be reduced to computerized algorithms; the binary universe does not correspond to reality. We need to make computers deal with imprecision.

e. Client-Server Architecture, Internet, Intranets, Extranets, and Electronic Commerce

Client-Server Architecture used to be so simple, a systems administrator at one of the largest federal bureaucracies nostalgically explains. In the not-so-old days, before the client/server revolution changed everything, data-processing jockeys at the major government agencies gathered, stored and doled out mission-critical information in an orderly manner from huge, imposing mainframes.

In those days, users were mostly gray-collar technicians rather than white-collar knowledge workers. Their modest data acquisition and processing requests -- from "dumb terminals" no less -- were consistent with their clerical status. Managing the limited number of users was a straightforward process. Application requirements rarely changed. And securing these closed proprietary systems was a cinch.

Today, spurred by many modernization initiatives, the entire federal government is moving mission-critical applications away from legacy systems toward a client/server environment that has been embraced by the commercial marketplace for years. The orderly hierarchical management schemes that prevailed in the mainframe age are collapsing into chaos, as local area networks manufactured by a variety of vendors pop up like mushrooms in even the most sensitive government agencies.

Internet, Intranets, Extranets

The Internet, the global network of computer networks, is increasingly being compared and contrasted with intranets, internal corporate computer networks, in terms of business relevance, competitive import, and long term impact on helping re-engineer the work of firms around the world.

An emerging, "co-opetitive" worldview sees the two communication system architectures as complementary to each other: intranets pick up where the Internet leaves off.

Spanning the Real and Virtual Commerce Universe:

Internet plus Intranets = Extranets

The rise of extranets is a natural result of the quest to achieve greater inter-organizational coordination through information sharing. These extranets will evolve naturally into the "virtual corporation" model envisioned by numerous previous works. The challenge, however, lies and will increasingly lie in constructively designing and regulating inter- and intranet commercial activities.

Federal Express allows its customers to access its internal database through public Internet gateways, an arrangement called an "extranet", when they want to track package delivery. It does so via an "information bridge" between the Internet and its proprietary intranet.
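
The sketch below illustrates the general idea of such an information bridge (it is not FedEx's actual system; the port, endpoint, and tracking data are hypothetical): a small gateway built on Python's standard http.server module exposes a single narrow tracking lookup to the public Internet, while the internal database behind it is never exposed directly.

# Minimal sketch (hypothetical data and port): a public gateway that exposes only
# a narrow package-tracking lookup, acting as an "information bridge" between the
# public Internet and an internal database it never exposes directly.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

INTERNAL_TRACKING_DB = {  # stands in for the proprietary intranet database
    "123456": "Out for delivery",
    "654321": "Delivered 10:42 AM",
}

class TrackingGateway(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        number = query.get("tracking", [""])[0]
        status = INTERNAL_TRACKING_DB.get(number)  # only this one view is exposed
        body = (status or "Unknown tracking number").encode()
        self.send_response(200 if status else 404)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # A client on the public Internet would request e.g. /track?tracking=123456
    HTTPServer(("0.0.0.0", 8080), TrackingGateway).serve_forever()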

Electronic Commerce

Inter- and intra-net information infrastructures will integrate and leverage the value added across the value chain of electronic and physical commerce. The implementation of intranets supports new principles in information systems management in corporations, including the democratization of data access and the empowerment of employees through information.

E-commerce Critical Issues:

The critical issues that will determine the long term viability of electronic commerce are:

a) feasibility: the extent to which technology -- bandwidth availability and information reliability, tractability, and security -- will be able to sustain exponentially increasing demands worldwide.

b) acceptability: the extent to which different cultures and ways of doing business worldwide will accommodate this new mode of transacting in terms of its nature (not face-to-face), speed, asynchronicity, and unidimensionality of transactions.

c) profitability: the extent to which this new way of doing business will allow for profit margins to exist at all: no intermediaries, instant access to sellers, global reach of buyers. It is too early to pass judgment on any of these points: the technology is very young and also expanding so quickly that technology maturation is proceeding very slowly.

The significance of privacy, security, and intellectual property rights protection as predicates for the successful diffusion, adoption, and commercial success of internet-related technologies will grow, both for this country and worldwide, especially in places with less democratic political institutions and free-market economies.

The differentiation between the internet, a global network of computer networks, and intranets, corporate-based computer networks that are protected by "firewalls" and involve well-defined communities, as potentially more promising technology platforms for fostering internet-related commerce, will intensify.

Intranet commerce has surpassed Internet commerce in terms of revenue and already more than half of the Web sites worldwide are commercial in nature.

Strategic alliances, such as the recent alliance of Cybercash with Netscape to post Cybercash's electronic wallet on the internet for using its electronic money or "cheets" to facilitate electronic commerce, are multiplying. More than 100 banks are supporting this technology, and its users can have monies deducted from their accounts automatically while the security of the transactions is guaranteed.

Entry of new competitors, such as the recent cyber market set up by Federal Express to allow commercial cybertransactions, is becoming increasingly frequent: FedEx believes it is in the business of hauling information, not just traditional mail packages.

f. Virtual / Knowledge Enterprise, Technological Learning and Organizational Intelligence

What makes for a truly robust and resilient technology-driven organization over the long and the very long run, is an organization-wide learning strategy of simultaneity with three components: strategic, tactical, and operational (Carayannis, 1992, 1993, 1994, 1995, 1995, 1996, 1997). Such a strategy encourages and facilitates continuous learning and unlearning from experience (operational component), learning how-to-learn (tactical component), and learning new paradigms of thought (strategic component).

Intelligence is the possession of knowledge and the capacity for the processing and creation of knowledge, whereas tacit knowledge is personal knowledge that is hard to formulate or encode. The organizational intelligence/learning process is a continuous cycle of activities that include environmental sensing/scanning, remembering, perceiving, interpreting, and adapting (Wei-Choo, 1995).

That knowledge has become the resource rather than a resource is what makes our society "post-capitalist". This fact changes, fundamentally, the structure of society and creates new social and economic dynamics (Drucker, 1993). The "post-capitalist" organization is a virtual, networked, intelligent organization. In the intelligent organization, the three groups of domain experts form a knowledge pyramid that supports learning.

Focus on the Government Performance and Results Act (GPRA) of 1993 and the Information Technology Management Reform Act (ITMRA) of 1996

The federal government's track record in delivering high value information technology solutions at acceptable cost is not a good one. Federal agencies lack adequate processes and reliable data to manage investments in information technology (GAO, 1996).

The Government Performance and Results Act (GPRA) focuses attention on defining mission goals and objectives, measuring and evaluating performance, and reporting on results. Budgets based on performance information provided under GPRA should include clear treatment of IT capital expenditures and their impact on agency operations.

The Information Technology Management Reform Act (ITMRA) supports GPRA by requiring that performance measures be used to indicate how technology effectively supports mission goals, objectives, and outcomes. ITMRA requires significant changes to the way government agencies manage and acquire information technology.

ITMRA emphasizes:

• senior executive involvement in information management decisions, the establishment of Chief Information Officers (CIOs) as members of executive management teams,

• investment control and capital planning, business process reengineering, and

• the use of performance measures to ensure accountability for IT spending results.

In addition, ITMRA makes important changes designed to streamline the IT acquisition process, such as eliminating GSA's central acquisition authority. ITMRA empowers agencies to spend money more wisely, not faster.

Other related reform acts are:

• The Federal Acquisition Reform Act (FARA) and the Federal Acquisition Streamlining Act (FASA), which are focused on removing barriers to agencies obtaining products and services from outside sources in a timely, cost-efficient manner, and

• The Paperwork Reduction Act (PRA) which focuses on the need for an overall information resources management strategic planning framework, with IT decisions linked directly to mission needs and priorities.

Case Studies of IT Successes and Failures from Government and Private Industry: Compare and Contrast

Re-Engineering BASF: Project Vision '96 and ISO 9000 Certification.

BASF Corporation, the U.S. component of BASF AG, embarked in mid-1993 on a corporate-wide, $60 million business process reengineering project called Vision '96, which involves about 100 people. Along with this project underway in the U.S., similar projects have been going on in other countries where BASF AG is present, such as Canada.

In the case of BASF, eight core business processes have been identified. The entire reengineering effort revolves around teams of employees (Process Expert Groups, or PEGs) charged with designing and implementing the necessary changes in each of the core business processes. In their quest to reengineer the firm, the PEGs ignore "what is" and focus instead on "what should be" in a business, thus facilitating the radical core business process redesign from "AS-IS" to "TO-BE" processes.

The philosophy behind core process redesign is exemplified by a description of such a Formula Management "TO-BE" process at BASF: Formula Management "TO-BE" is considered a communication process rather than an approval process.

This paradigm shift enables us to focus on information flow and disregard for the moment who provides what approval and the possible loopholes involved . . . The approval process employs sequential, parallel, and feedback communication structures for distribution of data.

Feedback structures are of utmost importance as they provide the capability for error correction and the ability to engage in a learning process. The significance of the PEGs in making Vision '96 a success is brought forth by Mike Bushell, manager of production control at BASF Corporation and also a member of the Replenishment PEG: "We are studying how the work of each individual PEG affects the other 'to-be' processes . . . My deliverable, for example, is an input to another process. When you put them all together, is the complete process efficient?".

Such attention to reengineering communication and quality systems within BASF has also benefited the firm in its efforts to gain ISO 9000 certification, which is the International Standard for Quality Management systems granted when a company demonstrates that it has a system which serves to ensure consistent quality in its products and services.

The ISO 9000 certification programs serve as "tools" for institutionalizing within the organization a culture of introspection and root cause analysis. These programs include ongoing corrective action to encourage individuals and organizations to understand better the root causes of their "failure" to conform to a target norm. They can thus develop corrective strategies and learn both how to solve problems better and how to diagnose better the strengths and weaknesses of their own problem solving.

As part of its commitment to continuous quality improvement through employee empowerment and learning, the management of BASF's Geismar, Louisiana, plant obtained in September 1993 the ISO 9002 Quality Management System Certificate of Compliance after two years of preparation.

For BASF AG, the vocabulary of reengineering, the electronic computing and telecommunication "tools" used to carry it out, and the fact that this large reengineering effort is taking place in a very large U.S. subsidiary are of recent vintage.

It is clear, however, that such programs also carry on long-standing traditions within the firm of strategic learning, flexibility, and rapid response to crises, in this case, brought on by very intense competition, and an ever-increasing rate of change in markets, technologies, and political-economic conditions.

Vision '96, the complete replacement of BASF Corporation's business infrastructure, is probably one of the most complex projects ever undertaken by the company. It will be implemented in stages over the next several years.

The PEG members at BASF Canada, the first BASF AG subsidiary to start implementing Vision '96, said this about reengineering: "Vision '96 will give us the tools that make information available - on line, direct, up-to-date and accurate - that will enable us to improve substantially the service we give to our customers" (BASF Information, October 1993).

Data Mining Case Studies: Benefits and Lessons Learned

A Japanese automobile manufacturer was trying to reduce warranty expenditures. There is room in the warranty process for unscrupulous dealers to charge both the company and the customer for repairs while the vehicle is under warranty, so the manufacturer was interested in identifying the likelihood of fraudulent claims and uncovering them. Using data mining tools, it found relationships, in cases of fraud, among the type of work performed on the vehicles, the hours of labor, and the type of problem.

Automobile insurance companies are continually trying to determine how best to set rates and how to attract customers from their competitors. One approach is to separate the good drivers from the bad, for example, those with strong safety records. There are already very well-known criteria for this exercise. However, using data mining tools, one insurance company found that people with good credit ratings have fewer accidents, a relationship previously unknown.
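
The kind of cross-tabulation that can surface such a relationship is easy to sketch; the example below uses hypothetical policyholder records to compare accident rates across credit-rating bands.

# Minimal sketch (hypothetical policyholder records): grouping drivers by credit
# band and comparing accident rates, the sort of cross-tabulation that surfaced
# the credit-rating / accident relationship described above.
from collections import defaultdict

policyholders = [
    {"credit_band": "good", "accidents": 0},
    {"credit_band": "good", "accidents": 1},
    {"credit_band": "good", "accidents": 0},
    {"credit_band": "poor", "accidents": 2},
    {"credit_band": "poor", "accidents": 1},
]

totals = defaultdict(lambda: {"drivers": 0, "accidents": 0})
for p in policyholders:
    totals[p["credit_band"]]["drivers"] += 1
    totals[p["credit_band"]]["accidents"] += p["accidents"]

for band, t in totals.items():
    rate = t["accidents"] / t["drivers"]
    print(f"{band} credit: {rate:.2f} accidents per driver")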

DataMind Corporation, Redwood City, California, with its Agent Network Technology, offers two products, the DataMind DataCruncher, which is a server-based data mining engine for Unix and Microsoft Windows-NT platforms, and the DataMind Professional Edition, for end users to define data mining studies and view data mining results. According to AJ Brown, vice president of marketing, the company's product strategy is to offer a client/server solution: "We offer an easy-to-use tool. We take a single algorithm approach in the belief that it will satisfy 90 percent of data mining queries that end users have. It's an open algorithm in the sense that it can be modified".

"With products like ours, systems integrators need to start getting expertise in specific industry segments. Since we can modify our product, we need to hear from them that this is what we need, for example, for the financial market. Frankly, at this point, we do not know where the best applications will be with data mining.", says Brown. DataMind's products started shipping in May and they have been working on pilots with a number of companies, mainly in the telecommunications, retail and health-care industries.

On a completely different plane in data warehousing and data mining is NCR Corp., Dayton, Ohio, which recently was given a 1996 Best Practices Award by the Data Warehousing Institute, Bethesda, Md., for its data warehouse application for retailing giant Wal-Mart Stores Inc.

The company focuses its computer product and service efforts on the retail, financial and communications industries and could easily claim to have started the data warehousing industry in 1985, according to Boston-based Patricia Seybold Group.

Among NCR's recent initiatives are the Scalable Data Warehouse framework, a blueprint for building a data warehouse solution through one-stop shopping, and an agreement with Knowledge Discovery One, Richardson, Texas, to deliver a data mining consulting program, called Voyager.

For Joe Wenig, general manager of NCR's Data Warehousing Professional Services, data mining is a new process for answering questions, not a solution by itself: "Obviously, the more details you have, the more likely you are to pick up patterns. Equally important to detailed data is understanding the nature of the problem you are looking to solve. You are unlikely to find one tool that runs the gamut of the data mining space," says Wenig.

The typical engagement for NCR moves through four phases:

• First is the basic, high-level problem identification.

• Second is detailed analysis and design.

• Third is data collection.

• Fourth is the data mining, using sophisticated querying techniques.

The tools will mature to the extent that they will be as simple as using Excel to find an average. Driving that development is end user awareness. Customers today are coming to the realization that data mining is the future. To a large degree, most of the tools are leading edge. It's not clear what types of problems they are best used to solve.

There will also be enormous opportunity for systems integrators as users try to get a handle on how data mining tools can be used in their company and industry.

Lessons Learned and Recommendations for 21st Century Government

The lessons learned for 21st century electronic government, based on this survey of government policies and industry practices in information technology implementation, indicate that the main challenges on the road to electronic government are the following:

• Get top agency executives, not just IRM officials, involved.

• Strong, consistent direction is needed in CIO appointments.

• Attention is needed to building new IT skills, not just leveraging existing ones.

• Besides government-wide efforts, OMB and agencies must keep focused on internal implementation steps.

• Continue to emphasize an integrated, not selective, management approach.

• Congressional support and oversight are essential.

• Closely monitor the caliber and organizational placement of CIO candidates for departments and agencies.

• Focus on the evaluation of results.

• Monitor how well agencies are institutionalizing processes and regularly validating cost, return, and risk data used to support IT investment decisions.

• Get the right people asking and answering the right questions.
