Business Intelligence Best Practices for Success
- josephmwoodside
- Jan 1, 2011
- 12 min read
Business Intelligence (BI) is a high-adoption, high-growth area, as users quickly come to value its capabilities and increasingly demand more features to compete in today’s economic climate. However, from a return on investment (ROI) standpoint, BI is similar to ERP and CRM in that it has a poor risk/reward profile, regularly running into cost overruns due to scope creep and limitless requests for support from end-users (Bernard 2009). Unlike operational systems, which often have specific requirements and implementation completion timelines, BI environments are constantly evolving to meet business and information requirements (Moss 2007). Given the complexity of most system implementations, no single measure exists for Business Intelligence success. To evaluate BI success effectively, measures are developed to identify critical implementation factors based on the research objectives and investigation (Wixom 2001). As an organization progresses in BI maturity, the value of its activities expands. Successful organizations increasingly utilize analytical approaches to identify and enact modest improvements that increase profitability and return on business intelligence investments. This paper presents several key findings, lessons learned, success evaluation methods, and best practices identified through prior literature review and a formal empirical study, which extends and enhances prior literature and understanding of BI.
Road to Business Intelligence
In the late 1960s, experiments began with Decision Support Systems (DSS), utilizing computers to analyze data and offer decision-making support. DSS were typically used for narrowly focused activities such as production planning, investment management, and transportation applications, and several inputs were required to prepare the analysis (Leidner 1993). With the introduction of software applications such as SAS and SPSS in the 1970s, statistical software became more available and accessible to end users. Despite this introduction, DSS did not prosper and evolved into Executive Support Systems (ESS) (Davenport 2007; Ranjan 2008).
ESS were utilized by executives for viewing firm performance and focused less on decision-making support. The feature found in most ESS was access to a single database containing current organizational information, presented in an easy-to-access manner. ESS usage was also found to be positively related to problem identification, decision making, and analysis. Other ESS features included a non-keyboard interface, an organizational database, drill-down capabilities, trend analysis, exception reports, graphics, and critical information monitoring. The focus of an ESS was on the organization's day-to-day activities as well as marketplace indicators; a DSS, by contrast, was intended to allow on-demand decisions and routine analysis (Leidner 1993). ESS were referred to as high-risk/high-return systems, as they serve executives whose information needs are complex but who also have greater influence. ESS provided executives easy-to-use information that supported their critical success objectives. ESS also failed to enjoy widespread usage due to resistance by executives to hands-on usage (Rainer 1995).
Firms have since made major investments in systems such as enterprise resource planning (ERP), supply chain management (SCM), and customer relationship management (CRM), yet struggle to achieve competitive advantage. Firms need streamlined access to and analysis of the underlying information in order to make operational decisions. Strategic organizations sought to improve efficiency through faster and better-informed decision making, and looked to technology to enhance strategic and tactical results and to improve time to market, connectivity, integration, and visibility into their business. Unbeknownst to users at the time, the data from these systems was a significant organizational asset, which would later be leveraged for success and competitive advantage. To realize these benefits, however, the data must be developed into an enterprise-wide unified view. Construction and integration of knowledge is a key to succeeding in the competitive global market. Information Technology moved to support day-to-day operations and all aspects of decision making, with differentiation through technology becoming increasingly important. New generations of technology-savvy users and executives were finding ways to utilize previously untapped information. The field that emerged is Business Intelligence (BI), encompassing the collection, management, and reporting of decision-making data and information. BI capabilities have consistently been identified as the number one technology priority for organizations according to current industry surveys (Davenport 2007; Ranjan 2008).
Business Intelligence
BI is not a single product, application, program, user, area, or system, but rather an architecture of integrated systems that provide users with easy access to and storage of information for decision making and learning. Competitive pressures cause organizations to continually improve and adapt in order to be successful in the ever-changing business environment, and information is required by employees at all levels of the organization for ongoing decision making (Ranjan 2008; Saha 2007). Business Intelligence refers to applications and technologies used to gather, capture, access, consolidate, and analyze information to improve decision making by various horizontal and vertical levels of stakeholders. These systems capture important metrics on business operations as well as provide a mechanism for improved decision making. At the various levels, these information items may include documents, calendars, wikis, links, reports, dashboards, scorecards, search, databases, lists, user knowledge, and much more. For example, these technologies can help coordinate projects, calendars, and schedules; support discussing ideas, reviewing documents, sharing information, and keeping in touch with others; utilize Key Performance Indicators (KPIs) to gauge operational status; and generate reporting information on demand. The BI process brings large amounts of disparate data together into a single repository and turns that data into meaningful information for decision support processes. BI can include various forms of analysis, data mining, scorecards, dashboards, metrics, reporting, portals, data warehousing, OLAP, decision support, knowledge management, and more. This information is available to all levels of the organization and associated stakeholders, on demand, and in an easy-to-use fashion (Moss 2007; Ranjan 2008; Saha 2007).
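To make the consolidation step concrete, the sketch below shows, in minimal form, how records from two hypothetical source systems might be merged into a single repository and rolled up into a simple KPI. All table and column names are illustrative assumptions for this example, not part of the study.

```python
# Minimal sketch (illustrative only): consolidate data from two hypothetical
# source systems into one repository and compute a simple KPI.
# The column names (order_date, region, amount) are assumptions for this example.
import pandas as pd

# Extracts from two separate operational systems (e.g., ERP and CRM exports).
erp_orders = pd.DataFrame({
    "order_date": ["2011-01-05", "2011-01-20", "2011-02-03"],
    "region": ["East", "West", "East"],
    "amount": [1200.0, 800.0, 950.0],
})
crm_orders = pd.DataFrame({
    "order_date": ["2011-01-12", "2011-02-15"],
    "region": ["West", "East"],
    "amount": [400.0, 600.0],
})

# Consolidate the disparate sources into a single repository (here, one DataFrame).
repository = pd.concat([erp_orders, crm_orders], ignore_index=True)
repository["order_date"] = pd.to_datetime(repository["order_date"])

# KPI: monthly revenue by region, the kind of metric a dashboard or scorecard surfaces.
monthly_revenue = (
    repository
    .groupby([repository["order_date"].dt.to_period("M"), "region"])["amount"]
    .sum()
)
print(monthly_revenue)
```

In a production BI environment the repository would be a data warehouse fed by ETL processes rather than an in-memory table, but the consolidate-then-aggregate pattern is the same.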
Research Model
Based on interviews with executives, managers, and professional staff before, during, and after the BI implementation, and on a review of prior literature, a path-analytic BI implementation success model was proposed. The information systems literature was used to develop the research model and the relationships among its constructs. Prior literature has studied information systems success using multiple methods, and researchers are advised to utilize appropriate success measures based on the objectives of the study and investigation (Wixom 2001).
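As a compact illustration of the structural portion of such a path model (a sketch of the hypothesized relationships, not the exact SmartPLS specification), the eight implementation factors developed in the sections below can be written as direct antecedents of a single implementation-success construct, with each coefficient corresponding to one of hypotheses H1-H8:

```latex
% Structural form of the hypothesized path model (illustrative sketch):
% eight exogenous implementation factors predicting implementation success.
\mathrm{Success} = \beta_1\,\mathrm{CollaborativeCulture}
  + \beta_2\,\mathrm{Customization}
  + \beta_3\,\mathrm{Communication}
  + \beta_4\,\mathrm{ProjectManagement}
  + \beta_5\,\mathrm{Resources}
  + \beta_6\,\mathrm{TopManagementSupport}
  + \beta_7\,\mathrm{Training}
  + \beta_8\,\mathrm{VerticalIntegration}
  + \zeta
```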
Collaborative Culture
Teamwork is an important aspect of any implementation project. It requires close cooperation among all department areas and business and technical teams, including top management, consultants, end-users, and vendors (Bhatti 2005). Aligning resources with business strategies is also important, as there is typically limited alignment between BI strategy and business strategy (Williams 2007). Organizational learning through shared visions and commitment to learning is an important aspect of collaborative culture and can be achieved through teamwork and alignment between units in support of a common objective and set of goals.
H1: There is a positive relationship between collaborative culture and implementation success
Customization
Business Process Re-engineering (BPR) involves rethinking and redesigning business processes to improve key performance measures such as cost and quality of service. Most organizations are required to modify their existing business processes to fit the application software as a way to limit customizations (Bhatti 2005). Arnott describes this similarly as the degree of fit between the organization and the software and hardware (Arnott 2008). An implementation may often fail to meet expectations due to underestimating change management complexity and encountering resistance to change (Bhatti 2005). In many cases, end users must be trained in the new paradigm during and after BI implementation. Users must also be trained to understand and adjust to changes in business processes (Williams 2007). BI platforms should be selected that allow customizations to occur dynamically through the user interface; this resolves issues that previously arose when upgrading customized applications or modules. The flexibility of most BI platforms allows for customizations that were previously unavailable in the scope of an enterprise application.
H2: There is a positive relationship between customization and implementation success
Communication
Wide information sharing and understanding must occur among all stakeholders throughout the implementation stages and beyond. Communication should start as early as possible to gain organizational understanding and acceptance (Bhatti 2005). Beyond development of BI and user training, the vision must be marketed and communicated; the BI applications must be viewed as mission-critical, and all users must share that vision (Williams 2007). Early communication can take the form of announcements and organizational newsletters delivered via email, meetings, and the intranet. Kick-off meetings with key personnel and staff resources should also take place, along with regularly recurring meetings.
H3: There is a positive relationship between communication and implementation success
Project Management
Organizations should use a structured and formal approach for BI projects. Many projects fail to adequately account for the organizational requirements, resources, and funding necessary to support a successful BI implementation (Williams 2007). Project management (PM) includes coordinating, scheduling, scoping, and monitoring activities and resources in line with the project objectives. PM is also responsible for the overall implementation process and for developing organizational support. DW/BI systems should be developed iteratively, building toward a complete application set (Arnott 2008; Bhatti 2005). In the studied implementation, an agile methodology was adopted; having a formal project methodology and a formal project management office for oversight and project tracking is critical to the project's success. It is important to establish critical success indicators and metrics from project inception to ensure expectations are met and exceeded. To improve deployment speed, an iterative methodology with rapid prototyping should be employed. Parallel user sub-groups should be established to allow continuous feedback from rapid prototyping and to reduce periods of inactivity.
H4: There is a positive relationship between project management and implementation success
Resources
Resources can include financial, people, hardware, software, and time for project completion. It is also important to fund new activities required as a result of BI implementation, such as metadata management. Resource issues often have a negative impact on implementation success (Arnott 2008; Williams 2007; Wixom 2001). Consultants are often required due to a knowledge gap and the complexity of new systems. User involvement can occur through requirements gathering, implementation participation, and use after go-live (Bhatti 2005). Dedicated resources should be assigned to avoid inevitable competing projects and priorities. Dedicated consultant resources should be allocated to improve the timeline and knowledge transfer for new technology areas; consultants should be used only as a temporary measure, however, as knowledge loss occurs with continued reliance on them. Dedicated department- or area-based resources should be assigned to serve as local subject matter experts and support knowledge diffusion.
H5: There is a positive relationship between resources and implementation success
Top Management Support
Top management provides the required resources, directly or indirectly through financing, as well as power and support. Top management is also responsible for setting a clear direction, overall project objectives, project guidance, and representation, and for establishing these throughout the organization (Arnott 2008; Bhatti 2005). Sponsorship across the entire management team allows others in the organization to support the project, reduces political resistance, and facilitates participation. Top management support must include top management champions, and the two are viewed similarly (Wixom 2001). All top management should be advised of the project by the sponsors, and any issues or concerns addressed initially. It is vital for the sponsors to continually update top management early in the project and as components are released to end-users.
H6: There is a positive relationship between top management support and implementation success
Training
Training end-users is important to improve knowledge and appropriate use of the system. General BI concepts, components, demonstration, and use are key training areas. Training should also include process changes and the overall flow of information and integration (Bhatti 2005). Training also covers the standards and policies that must be followed for the new BI applications, to optimize use of BI by end-users (Williams 2007). Training modules and materials should be developed prior to the initial go-live, along with governance plans, such as best practices, content and technical standards, and policies and procedures, for training purposes. In addition, training issues should be proactively monitored and immediately addressed to avoid entrenching poor practices in the early stages. New users should be required to complete the established training, existing users should repeat it on an annual basis, and all users should be directed to the training materials as questions arise.
H7: There is a positive relationship between training and implementation success
Vertical Integration
In prior BI implementations, organizations had to purchase solutions from multiple vendors because no single vendor provided a fully integrated solution; other companies chose a best-of-breed approach based on vendor offerings. Today, several vendors offer completely integrated solutions with comparable offerings across services. In one survey, organizations utilized an average of 3.2 vendors with 8-13 tools (Howson 2008). For the small-medium business scenario, utilizing a vertical architecture with a single vendor was identified as a critical success factor. Due to resource and funding constraints, matching expert skill sets to a best-of-breed approach is not feasible. The use of a single vendor also improves delivery time through ease of installation and avoids the integration issues that commonly arise when utilizing multiple vendors' solutions. Here, a vertical architecture is defined as a single-vendor designed and developed BI platform, which includes Knowledge Management, Content Management, Performance Management, End-User Tools, Querying/Reporting, Analysis, and a Database Management System.
H8: There is a positive relationship between vertical architecture and implementation success
Success Factors
BI project implementation success is measured through perceived success, whether the project was completed on time and on budget, and overall satisfaction with the BI solution (Howson 2008; Wixom 2001).
Research Methodology
A survey was developed following Moore and Benbasat's (1991) stages of item creation, scale development, and testing: item creation, in which existing items were drawn from prior literature and additional items were added to the components they fit; scale development, in which similar categories of items were created and refined as needed; and testing, in which sample surveys were conducted, followed by revisions and larger distribution (Moore 1991). The final survey was randomized to reduce order effects. The survey uses a seven-point Likert scale measuring the level of agreement with each statement, with 1 indicating strong disagreement and 7 strong agreement. The survey was administered to a national healthcare organization that had recently completed a BI implementation. The survey was reviewed by subject matter experts in and across organizational levels as part of a pre-test to resolve any concerns, address identified discrepancies, re-word identified items, and remove non-weighting or duplicative manifest variables. Overall, 148 responses were received; 7 responses that were incomplete past general user information were removed, leaving 141 usable responses, of which 105 (75%) were from non-managerial users and 36 (25%) from users holding supervisory through executive positions. Respondents represented multiple geographic office locations, staff functions, and department areas, as well as both short- and long-term tenure at the firm.
Data Analysis
A path analysis was employed using SmartPLS 2.0 software to analyze the results and determine model fit (Ringle 2005). A model with significant loadings was developed, and the significance of relationships between implementation factors and success factors was determined. The final model specification includes the supported paths; all paths were supported for significance at the p = 0.01 level. Results showed that 73.8% of the variability in implementation success is explained by the model.
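For readers unfamiliar with the variance-explained figure, the sketch below illustrates, on synthetic data rather than the study's data, how an R-square of this kind is computed for a success measure regressed on eight factor scores. It uses ordinary least squares as a stand-in and is not the SmartPLS estimation itself.

```python
# Illustrative sketch only: computing variance explained (R-square) for a
# success score regressed on eight factor scores, using synthetic data and
# ordinary least squares as a stand-in for the PLS structural estimate.
import numpy as np

rng = np.random.default_rng(0)
n_respondents, n_factors = 141, 8

# Synthetic standardized factor scores (collaborative culture, ..., vertical integration).
X = rng.normal(size=(n_respondents, n_factors))
true_betas = np.full(n_factors, 0.3)
success = X @ true_betas + rng.normal(scale=0.5, size=n_respondents)

# Fit OLS with an intercept and compute R-square = 1 - SS_residual / SS_total.
X_design = np.column_stack([np.ones(n_respondents), X])
coef, *_ = np.linalg.lstsq(X_design, success, rcond=None)
predicted = X_design @ coef
ss_res = np.sum((success - predicted) ** 2)
ss_tot = np.sum((success - success.mean()) ** 2)
r_square = 1 - ss_res / ss_tot
print(f"R-square: {r_square:.3f}")
```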
The average variance extracted (AVE) is the average communality for the latent factors in the model. AVE is utilized for convergent validity and should be greater than or equal to 0.5, which the latent factors exceed. Composite reliability is also utilized, as Cronbach's alpha commonly underestimates or overestimates reliability. Composite reliability is interpreted similarly to Cronbach's alpha, with 0.80 considered good, 0.70 acceptable, and 0.60 acceptable for exploratory work. Composite reliability exceeds 0.90 for this model. R-square serves as the effect size measure and is not shown for exogenous constructs. An R-square of 0.67 is considered substantial, 0.33 moderate, and 0.19 weak; the R-square for this model is 0.73. Cronbach's alpha should be at least 0.80 to be considered good, 0.70 acceptable, and 0.60 acceptable for exploratory work. For short scales, Cronbach's alpha may be biased; this study utilized seven-point scales for all measured items, and Cronbach's alpha exceeded 0.80 (Garson 2010; Ringle 2005). Hypotheses 1-8 are supported.
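For reference, the standard definitions of these measures, stated here for a latent construct with k standardized indicators, loadings λ, indicator error variances Var(ε), item variances σ², and total score variance σ_t² (the paper itself reports only the resulting values and thresholds), are:

```latex
% Standard definitions of the validity and reliability measures discussed above.
\mathrm{AVE} = \frac{\sum_{i=1}^{k} \lambda_i^{2}}{k}
\qquad
\mathrm{CR} = \frac{\left(\sum_{i=1}^{k} \lambda_i\right)^{2}}
                   {\left(\sum_{i=1}^{k} \lambda_i\right)^{2} + \sum_{i=1}^{k} \operatorname{Var}(\varepsilon_i)}
\qquad
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_t^{2}}\right)
```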
Conclusion and Future Directions
This paper presents several key findings identified through a formal study and improves the power of the explanatory success model, which extends and enhances prior literature and understanding of BI. The first key finding is the addition of a vertical architecture as an implementation construct, particularly for the small-medium business (SMB) scenario; a vertically-integrated architecture improves both the implementation timeline and the resource base required for implementation success. The second key finding concerns establishing a collaborative culture to promote organizational learning capabilities; this extends previous notions of teamwork and business-IT alignment and is necessary to support adoption and use of BI. The third involves implementation success outcomes when democratization, or universal user adoption, of BI has been achieved: whereas past studies measuring BI success gave only a small portion of users access to BI capabilities, successful outcomes can be realized while extending BI benefits to all users.
Limitations include the use of a single-organization study; additional organizations should be reviewed to increase the sample size in and across various industries and firm sizes. It is important to identify universally applicable critical success factors while retaining the ability to tailor those factors to an individual organization or implementation; however, it is the adaptability of the BI capabilities and the overall project that will ensure successful completion. Other identified areas of study and importance beyond implementation include establishing a competency center to ensure continued usage of business intelligence, stakeholder satisfaction, and decreased costs. Another area is establishing an architecture roadmap for future iterations and system updates, including enhancements to key capabilities and features, bug fixes, security improvements, and ensured vendor support. A governance plan is also important to establish technical roles, support service level agreements, backup and recovery, database and data standards, metadata standards, content branding, life cycle policies, and training.
References
Arnott, D. "Success Factors for Data Warehouse and Business Intelligence Systems," Australasian Conference on Information Systems, 2008, pp. 55-65.
Bernard, A. "Four Technology Best (and Not So Best) Bets for 2009," CIO Update, 2009.
Bhatti, T.R. "Critical Success Factors for the Implementation of Enterprise Resource Planning (ERP): Empirical Validation," The Second International Conference on Innovation in Information Technology, 2005.
Davenport, T.H., and Harris, J.G. Competing on Analytics, Harvard Business School Publishing Corporation, Boston, 2007.
Garson, G.D. "Partial Least Squares Regression (PLS)," North Carolina State University, 2010.
Howson, C. Successful Business Intelligence: Secrets to Making BI a Killer App, 2008.
Leidner, D., and Elam, J. "Executive Information Systems: Their Impact on Executive Decision Making," Journal of Management Information Systems (10:3), 1993, pp. 139-155.
Moore, G.C., and Benbasat, I. "Development of an Instrument to Measure the Perceptions of Adopting an Information Technology Innovation," Information Systems Research (2:3), 1991, pp. 192-222.
Moss, L.T., and Atre, S. Business Intelligence Roadmap, Boston, 2007.
Rainer, R.K., and Watson, H. "The Keys to Executive Information System Success," Journal of Management Information Systems (12:2), 1995, pp. 83-98.
Ranjan, J. "Business Justification with Business Intelligence," The Journal of Information and Knowledge Management Systems (38:4), 2008, pp. 461-475.
Ringle, C.M., Wende, S., and Will, S. "SmartPLS 2.0 (M3) Beta," Hamburg, 2005.
Saha, G.K. "Business Intelligence Computing Issues," ACM Ubiquity (8:25), 2007.
Williams, S., and Williams, N. The Profit Impact of Business Intelligence, 2007.
Wixom, B.H., and Watson, H.J. "An Empirical Investigation of the Factors Affecting Data Warehousing Success," MIS Quarterly (25:1), 2001, pp. 17-41.
Definitive Source and Citation:
Woodside, Joseph M. (2011). Business Intelligence Best Practices for Success. International Conference on Information Management and Evaluation. Available at: http://connection.ebscohost.com/c/articles/60168280/business-intelligence-best-practices-success