
BUSINESS INTELLIGENCE AND DATA ANALYTICS
By Vaibhav Jindal, Stevens Institute of Technology, 2016

ABSTRACT - In this paper we briefly discuss Business Intelligence and Data Analytics: what they are, why they were developed, what their functions are, and what role they play in an organization. The paper is broadly divided into two major parts: the first is an analysis of the topic, and the second covers applications in an industry or a particular organization.

I. INTRODUCTION

Figure 1: General Overview of BI and Data Analytics

For a long time, Business Intelligence and Data Analytics have been an important part of how companies and organizations operate.

II. BUSINESS INTELLIGENCE (BI)

Business intelligence (BI) is defined as "a set of techniques and tools for the acquisition and transformation of raw data into meaningful and useful information for business analysis purposes". BI technologies are capable of handling large amounts of structured and sometimes unstructured data to help identify, develop and otherwise create new strategic business opportunities. The goal of BI is to allow for the easy interpretation of these large volumes of data. Identifying new opportunities and implementing an effective strategy based on insights can provide businesses with a competitive market advantage and long-term stability.

Figure 2: BI Overview

BI is made up of an increasing number of components, including:
- De-normalization, tagging and standardization
- Real-time reporting with analytical alerts
- Open item management
- Statistical inference and probabilistic simulation
- Key performance indicator optimization
- Version control and process management
- Multidimensional aggregation and allocation
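As a concrete illustration of one of these components, the short Python sketch below shows a minimal form of multidimensional aggregation and allocation using pandas: sales are rolled up along two dimensions (region and product) and a shared overhead cost is allocated to each cell in proportion to its revenue. The column names and the overhead figure are hypothetical, chosen only for the example.

```python
import pandas as pd

# Hypothetical fact table: one row per transaction.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "West"],
    "product": ["A", "B", "A", "A", "B"],
    "revenue": [100.0, 150.0, 200.0, 50.0, 300.0],
})

# Multidimensional aggregation: roll revenue up along two dimensions.
cube = sales.groupby(["region", "product"], as_index=False)["revenue"].sum()

# Allocation: spread a shared overhead cost across the cells
# in proportion to each cell's share of total revenue.
overhead = 80.0  # hypothetical shared cost
cube["allocated_overhead"] = overhead * cube["revenue"] / cube["revenue"].sum()

print(cube)
```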

A. BUSINESS INTELLIGENCE PROCESSES
- Understand Business Intelligence Information Needs
- Define and Maintain the DW/BI Architecture
- Implement Data Warehouses and Data Marts
- Implement BI Tools and User Interfaces
- Process Data for Business Intelligence
- Monitor and Tune Data Warehousing Processes
- Monitor and Tune BI Activity and Performance

B. PRIORITIZATION OF PROJECTS

It is difficult to provide a positive business case for business intelligence initiatives, so projects must often be prioritized through strategic initiatives. BI projects can attain higher prioritization within the organization if managers consider the following:
- The BI manager must determine the tangible benefits, such as the eliminated cost of producing legacy reports.
- Data access for the entire organization must be enforced. In this way even a small benefit, such as a few minutes saved, makes a difference when multiplied by the number of employees in the entire organization.
- Managers should consider letting the BI project be driven by other business initiatives with excellent business cases. To support this approach, the organization must have enterprise architects who can identify suitable business projects.
- A structured and quantitative methodology, such as a weighted decision matrix, should be used to create defensible prioritization in line with the actual needs of the organization (see the sketch after this list).
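A weighted decision matrix of the kind mentioned above can be expressed in a few lines of code. The Python sketch below, with made-up criteria, weights, and scores, ranks candidate BI projects by their weighted total; it illustrates the method only and does not prescribe a particular set of criteria.

```python
# Hypothetical criteria and weights for prioritizing BI projects.
criteria_weights = {"tangible_benefit": 0.4, "strategic_fit": 0.35, "data_readiness": 0.25}

# Illustrative scores on a 1-5 scale for each candidate project.
projects = {
    "Legacy report replacement": {"tangible_benefit": 5, "strategic_fit": 3, "data_readiness": 4},
    "Sales dashboard":           {"tangible_benefit": 3, "strategic_fit": 5, "data_readiness": 3},
    "Customer churn analytics":  {"tangible_benefit": 4, "strategic_fit": 4, "data_readiness": 2},
}

def weighted_score(scores):
    """Sum of each criterion score times its weight."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank projects from highest to lowest weighted score.
ranked = sorted(projects.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {weighted_score(scores):.2f}")
```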

C. SUCCESS FACTORS OF IMPLEMENTATION

There are three critical areas that organizations should assess before starting a BI project:
- The level of commitment and sponsorship of the project from senior management.
- The level of business need for creating a BI implementation.
- The amount and quality of business data available.

BUSINESS SPONSORSHIP

The commitment and sponsorship of senior management is the most important criterion for assessment, because strong management backing helps overcome shortcomings elsewhere in the project. However, even the most elegantly designed BI system cannot overcome a lack of business (management) sponsorship. The best business sponsors have organizational clout and are well connected within the organization. It is also ideal that the business sponsor is demanding but able to be realistic and supportive if the implementation runs into delays or drawbacks.

BUSINESS NEEDS

The needs and benefits of the implementation are sometimes driven by competition and the need to gain an advantage in the market. Another reason for a business-driven approach to BI implementation is the acquisition of other organizations that enlarge the original organization; it can sometimes be beneficial to implement DW or BI in order to create more oversight. Companies that implement BI are often large, multinational organizations with diverse subsidiaries. A well-designed BI solution provides a consolidated view of key business data not available anywhere else in the organization, giving management visibility and control over measures that otherwise would not exist.

AMOUNT AND QUALITY OF AVAILABLE DATA

Without proper data, or with too little quality data, any BI implementation fails; it does not matter how good the management sponsorship or business-driven motivation is. Before implementation it is a good idea to do data profiling. This analysis identifies the content, consistency and structure of the data. It should be done as early as possible in the process, and if the analysis shows that data is lacking, the project should be put on hold temporarily while the IT department figures out how to properly collect the data.

When planning for business data and business intelligence requirements, it is always advisable to consider the specific scenarios that apply to a particular organization, and then select the business intelligence features best suited to those scenarios. Often, scenarios revolve around distinct business processes, each built on one or more data sources. These sources are used by features that present that data as information to knowledge workers, who subsequently act on that information. The business needs of the organization for each business process correspond to the essential steps of business intelligence, which include but are not limited to:
1. Going through business data sources in order to collect the needed data
2. Converting business data to information and presenting it appropriately
3. Querying and analyzing the data
4. Acting on the collected data

The quality aspect in business intelligence should cover the whole process from the source data to the final reporting. At each step, the quality gates are different:
1. Source data: data standardization (make data comparable: same unit, same pattern) and master data management (a unique referential).
2. Operational Data Store (ODS): data cleansing (detect and correct inaccurate data) and data profiling (check for inappropriate, null or empty values).
3. Data warehouse: completeness (check that all expected data are loaded), referential integrity (a unique and existing referential over all sources), and consistency between sources (check consolidated data against the sources).
4. Reporting: uniqueness of indicators (only one shared dictionary of indicators) and formula accuracy (local reporting formulas should be avoided or checked).

A small sketch of what such profiling and quality-gate checks can look like in practice follows.
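The Python sketch below, assuming a pandas DataFrame of source records with hypothetical column names, illustrates a few of the gates described above: null/empty profiling, duplicate detection, completeness against an expected row count, and a basic referential-integrity check. It is a minimal illustration, not a complete data quality framework.

```python
import pandas as pd

def profile(df: pd.DataFrame, expected_rows: int, reference_ids: set) -> dict:
    """Minimal data-profiling / quality-gate checks (illustrative only)."""
    return {
        # Source data / ODS gates: missing values and duplicate rows.
        "null_counts": df.isna().sum().to_dict(),
        "duplicate_rows": int(df.duplicated().sum()),
        # Data warehouse gates: completeness and referential integrity.
        "completeness_ok": len(df) >= expected_rows,
        "orphan_keys": sorted(set(df["customer_id"]) - reference_ids),
    }

# Hypothetical extract with a duplicate row, a missing value, and an orphan key.
extract = pd.DataFrame({
    "customer_id": [1, 2, 2, 99],
    "amount": [10.0, 5.0, 5.0, None],
})
print(profile(extract, expected_rows=4, reference_ids={1, 2, 3}))
```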

D. BI PORTALS

A BI portal is the primary access interface for BI applications, and it is the user's first impression of the BI system. It is typically a browser application from which the user has access to all the individual services of the BI system, reports and other analytical functionality. The BI portal must be implemented in such a way that it is easy for the users of the BI application to call on the functionality of the application. The portal's main purpose is to provide a navigation system for the DW/BI application, which means it has to be implemented so that the user has access to all the functions of the DW/BI application. The following is a list of features for web portals in general and BI portals in particular:
- Usable: users easily find what they need in the BI tool.
- Content rich: the portal is not just a report printing tool; it contains more functionality such as advice, help, support information and documentation.

Figure 3: BI Interface

- Clean: the portal is designed so that it is easily understandable and not so complex that it confuses the users.
- Current: it is important that the portal is updated regularly.
- Interactive: the portal is implemented in a way that makes it easy for the user to use its functionality and encourages them to use the portal. Scalability and customization provide the means to fit the portal to each user.
- Value oriented: it is important that the user has the feeling that the BI application is a valuable resource that is worth working with.

E. BI IN ENTREPRENEURSHIP

Figure 4: BI in Entrepreneurship

BI in entrepreneurship is based on current market trends in the industry, the efficiency of people, performance, speed and reliability, and most importantly knowledge. A number of BI vendors have entered this market.

Figure 5: Data Analytics Process

BI is mostly industry specific. Specific considerations for BI have to be taken into account in some sectors such as healthcare. The information collected by healthcare institutions and analyzed with BI software must be protected from some groups or individuals while being fully available to other groups or individuals. BI solutions must therefore be sensitive to those needs and be flexible enough to adapt to new regulations and changes to existing laws.

III. DATA ANALYTICS

Analytics is the "extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions." It is a subset of business intelligence, which is a set of technologies and processes that use data to understand and analyze business performance. Data analytics involves a process of inspecting, cleaning, transforming, and modeling data with the goal of discovering useful information, suggesting conclusions, and supporting decision-making. Data analytics has multiple facets and approaches, encompassing diverse techniques under a variety of names, in different business, science, and social science domains. The term data analysis is sometimes used as a synonym for data analytics. Data mining is a particular data analysis technique that focuses on modeling and knowledge discovery for predictive rather than purely descriptive purposes. Business intelligence covers data analysis that relies heavily on aggregation, focusing on business information. In statistical applications, some people divide data analysis into descriptive statistics, exploratory data analysis (EDA), and confirmatory data analysis (CDA). EDA focuses on discovering new features in the data, while CDA focuses on confirming or falsifying existing hypotheses. Predictive analytics focuses on the application of statistical models for predictive forecasting or classification, while text analytics applies statistical, linguistic, and structural techniques to extract and classify information from textual sources, a species of unstructured data. All are varieties of data analysis.

Figure 6: Data Analytics Overview

Data integration is a precursor to data analysis, and data analysis is closely linked to data visualization and data dissemination.

A. THE PROCESSES OF DATA ANALYSIS

Analysis refers to breaking a whole into its separate components for individual examination. It is basically a process of obtaining raw data and converting it into information useful for decision making. Several phases of data analysis can be distinguished:
- Data requirements: the data necessary as inputs to the analysis are specified based upon the requirements of those directing the analysis or the customers who will use the finished product of the analysis. The general type of entity upon which the data will be collected is referred to as an experimental unit. Data may be numerical or categorical.
- Data collection: data is collected from a variety of sources. The requirements may be communicated by analysts to custodians of the data, such as information technology personnel within an organization. The data may also be collected from sensors in the environment, such as traffic cameras, satellites and recording devices, or obtained through interviews, downloads from online sources, or reading documentation.
- Data processing: data initially obtained must be processed or organized for analysis. For instance, this may involve placing data into rows and columns in a table format for further analysis, such as within a spreadsheet or statistical software.
- Data cleaning: once processed and organized, the data may be incomplete, contain duplicates, or contain errors. The need for data cleaning arises from problems in the way that data is entered and stored.

Figure 7: Steps for Obtaining Intelligence

Data cleaning is the process of preventing and correcting these errors. Common tasks include record matching, identifying inaccuracy of data, assessing the overall quality of existing data, deduplication, and column segmentation.
- Exploratory data analysis: once the data is cleaned, it can be analyzed. Analysts may apply a variety of techniques referred to as exploratory data analysis to begin understanding the messages contained in the data. The process of exploration may result in additional data cleaning or additional requests for data, so these activities may be iterative in nature. Descriptive statistics such as the average or median may be generated to help understand the data. Data visualization may also be used to examine the data in graphical form and obtain additional insight into the messages within the data.
- Modeling and algorithms: mathematical formulas or models called algorithms may be applied to the data to identify relationships among the variables, such as correlation or causation. In general terms, models may be developed to evaluate a particular variable in the data based on other variable(s) in the data, with some residual error depending on model accuracy (i.e., Data = Model + Error); see the sketch after this list.
- Data product: a data product is a computer application that takes data inputs and generates outputs, feeding them back into the environment. It may be based on a model or algorithm. An example is an application that analyzes data about customer purchasing history and recommends other purchases the customer might enjoy.
- Communication: once the data is analyzed, it may be reported in many formats to the users of the analysis to support their requirements. The users may have feedback, which results in additional analysis; as such, much of the analytical cycle is iterative. When determining how to communicate the results, the analyst may consider data visualization techniques to help clearly and efficiently communicate the message to the audience. Data visualization uses information displays such as tables and charts to help communicate key messages contained in the data.
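To make the exploratory-analysis and modeling steps above concrete, the Python sketch below uses an invented set of observations to compute simple descriptive statistics, fit a one-variable linear model, and report the residual error, following the Data = Model + Error framing. The variables and values are hypothetical.

```python
import numpy as np

# Hypothetical observations: advertising spend vs. sales.
spend = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
sales = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Exploratory step: descriptive statistics and correlation.
print("mean sales:", sales.mean(), "median sales:", np.median(sales))
print("correlation:", np.corrcoef(spend, sales)[0, 1])

# Modeling step: least-squares fit of sales = a * spend + b.
a, b = np.polyfit(spend, sales, deg=1)
model = a * spend + b

# Data = Model + Error: the residuals are what the model fails to explain.
error = sales - model
print("slope:", a, "intercept:", b, "residual std:", error.std())
```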

Figure 8: The Analytics Lifecycle

B. TECHNIQUES FOR ANALYZING QUANTITATIVE DATA

There are a number of practices for understanding quantitative data, several of which are illustrated in the sketch below:
- Check raw data for anomalies prior to performing the analysis.
- Re-perform important calculations, such as verifying columns of data that are formula driven.
- Confirm that main totals are the sum of subtotals.
- Check relationships between numbers that should be related in a predictable way, such as ratios over time.
- Normalize numbers to make comparisons easier, such as analyzing amounts per person, relative to GDP, or as an index value relative to a base year.
- Break problems into component parts by analyzing the factors that led to the results.
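Several of these practices are mechanical enough to script. The Python sketch below, over a made-up set of regional figures, re-checks that subtotals sum to the reported total, normalizes amounts per person, and indexes values to a base year. The figures are purely illustrative.

```python
# Hypothetical subtotals reported by region, plus a reported grand total.
subtotals = {"East": 120.0, "West": 200.0, "South": 80.0}
reported_total = 400.0

# Check: the main total should equal the sum of the subtotals.
assert abs(sum(subtotals.values()) - reported_total) < 1e-9, "totals do not reconcile"

# Normalize: amount per person makes regions of different size comparable.
population = {"East": 60, "West": 80, "South": 40}
per_capita = {r: subtotals[r] / population[r] for r in subtotals}

# Index to a base year: express each year's value relative to the first year.
yearly = [400.0, 440.0, 484.0]
index = [100.0 * v / yearly[0] for v in yearly]

print(per_capita)  # {'East': 2.0, 'West': 2.5, 'South': 2.0}
print(index)       # [100.0, 110.0, 121.0]
```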

C. BARRIERS TO EFFECTIVE ANALYSIS

Barriers to effective analysis exist among the analysts performing the data analysis and among the audience. Distinguishing fact from opinion, cognitive biases, and innumeracy are all challenges to sound data analysis.
- Confusing fact and opinion: effective analysis requires obtaining relevant facts to answer questions, support a conclusion or formal opinion, or test hypotheses. Facts by definition are irrefutable, meaning that any person involved in the analysis should be able to agree upon them.
- Cognitive biases: there are a variety of cognitive biases that can adversely affect analysis. For example, confirmation bias is the tendency to search for or interpret information in a way that confirms one's preconceptions. In addition, individuals may discredit information that does not support their views.
- Innumeracy: effective analysts are generally adept with a variety of numerical techniques. However, audiences may not have such literacy with numbers, or numeracy; they are said to be innumerate. Persons communicating the data may also be attempting to mislead or misinform, deliberately using bad numerical techniques.

IV. BUSINESS INTELLIGENCE AND DATA ANALYTICS IN THE HEALTHCARE SYSTEM

Healthcare business intelligence (BI) is the answer that hospitals are looking for as they move to data-driven healthcare improvements and cost reductions, provided it is built on the foundation of a data warehouse. Healthcare is changing rapidly, and so is the industry's need for analytics and business intelligence, which brings up a problem: what exactly is healthcare business intelligence? The term itself has multiple meanings and can be difficult to define, which leaves organizations that know they need a solution wondering exactly where to turn. The trouble stems from the overuse of the term business intelligence. Sometimes business intelligence refers to a broad category of analytics, data warehousing and visualization tools, all of which are must-haves for any long-term and sustainable analytics foundation. Other times, business intelligence tools are linked to the visualization layer only: those tools that take the data and return a visual representation of it.

Figure 9: Basic Steps for Implementation of BI and Data Analytics

A. NEED FOR BI AND DATA ANALYTICS

Today, healthcare organizations and their partners outside of healthcare need to tightly control access to data in order to manage:
- Tougher government requirements
- Tougher patient data management regulations
- The risk of data breaches
- The increase in legal actions

B. MAJOR COMPONENTS USING BI AND DATA ANALYTICS IN THE HEALTHCARE SYSTEM
- Appointment Module
- Admission Module
- Emergency Module
- Testing Module
- Payment Module
- Pharmacy
- Cafeteria

Each major component has the following parameters: high-level diagrams, pre- and post-conditions, contracts, an ERD, and an information model.

C. HEALTHCARE ANALYTICS ADOPTION MODEL

Healthcare worldwide has slowly been progressing through three waves of data management: data collection, data sharing and data analytics. The data collection and sharing waves, characterized by the urgent deployment of electronic health records and health information exchanges, have failed to significantly impact the quality and cost of healthcare. And despite the current hype about big data being the next big thing in other industries, the reality in healthcare is that we are just beginning to have the necessary analytics capabilities that enable system-wide quality improvement and cost reduction efforts. The real promise of analytics lies in its ability to transform healthcare into a truly data-driven culture. Healthcare analytics can be confusing, even overwhelming, without a systematic framework for guiding your approach and priorities. The following framework, called the Healthcare Analytics Adoption Model, was developed by a cross-industry group of healthcare industry veterans as a guide to classifying groups of analytics capabilities and providing a systematic sequencing for adopting analytics within health system organizations. A successful and sustainable analytics strategy requires building the foundational elements of the model first in order to support the upper levels of the model in later years. The Healthcare Analytics Adoption Model provides:
- A framework for evaluating the industry's adoption of analytics
- A roadmap for organizations to measure their own progress toward analytics adoption
- A framework for evaluating vendor products

LEVELS OF THE ANALYTICS ADOPTION MODEL

Level 0 - Fragmented Point Solutions (inefficient, inconsistent versions of the truth): Vendor-based and internally developed applications are used to address specific analytic needs as they arise. The fragmented point solutions are neither co-located in a data warehouse nor otherwise architecturally integrated with one another. Overlapping data content leads to multiple versions of analytic truth. Reports are labor intensive and inconsistent. Data governance is non-existent.

Level 1 - Enterprise Data Warehouse (foundation of data and technology): At a minimum, the following data are co-located in a single data warehouse, locally or hosted: HIMSS EMR Stage 3 data, revenue cycle, financial, costing, supply chain, and patient experience. A searchable metadata repository is available across the enterprise. Data content includes insurance claims, if possible. The data warehouse is updated within one month of source system changes. Data governance is forming around the data quality of source systems. The EDW reports organizationally to the CIO.

Level 2 - Standardized Vocabulary & Patient Registries (relating and organizing the core data): Master vocabulary and reference data are identified and standardized across disparate source system content in the data warehouse. Naming, definitions, and data types are consistent with local standards. Patient registries are defined solely on ICD billing data. Data governance forms around the definition and evolution of patient registries and master data management.

Figure 10: Healthcare Analytics Adoption Model

Level 3 - Automated Internal Reporting (efficient, consistent production): Analytic motive is focused on consistent, efficient production of reports supporting the basic management and operation of the healthcare organization. Key performance indicators are easily accessible from the executive level to the front-line manager. Corporate and business unit data analysts meet regularly to collaborate and steer the EDW. Data governance expands to raise the data literacy of the organization and to develop a data acquisition strategy for Levels 4 and above.

Level 4 - Automated External Reporting (efficient, consistent production and agility): Analytic motive is focused on consistent, efficient production of reports required for regulatory and accreditation requirements (e.g. CMS, Joint Commission, tumor registry, communicable diseases), payer incentives (e.g. MU, PQRS, VBP, readmission reduction), and specialty society databases (e.g. STS, NRMI, Vermont-Oxford). Adherence to industry-standard vocabularies is required. Clinical text data content is available for simple keyword searches. Centralized data governance exists for the review and approval of externally released data.

Level 5 - Clinical Effectiveness & Accountable Care (measuring and managing evidence-based care): Analytic motive is focused on measuring adherence to clinical best practices, minimizing waste, and reducing variability. Data governance expands to support care management teams that are focused on improving the health of patient populations. Population-based analytics are used to suggest improvements to individual patient care. Permanent multidisciplinary teams are in place that continuously monitor opportunities to improve quality and reduce risk and cost across acute care processes, chronic diseases, patient safety scenarios, and internal workflows. The precision of registries is improved by including data from lab, pharmacy, and clinical observations in the definition of the patient cohorts. EDW content is organized into evidence-based, standardized data marts that combine clinical and cost data associated with patient registries. Data content expands to include insurance claims (if not already included) and HIE data feeds. On average, the EDW is updated within one week of source system changes.

Level 6 - Population Health Management & Suggestive Analytics (taking financial risk and preparing your culture for the next levels of analytics): The accountable care organization shares in the financial risk and reward that is tied to clinical outcomes. At least 50% of acute care cases are managed under bundled payments. Analytics are available at the point of care to support the Triple Aim of maximizing the quality of individual patient care, population management, and the economics of care. Data content expands to include bedside devices, home monitoring data, external pharmacy data, and detailed activity-based costing. Data governance plays a major role in the accuracy of metrics supporting quality-based compensation plans for clinicians and executives. On average, the EDW is updated within one day of source system changes. The EDW reports organizationally to a C-level executive who is accountable for balancing cost of care and quality of care.

Level 7 - Clinical Risk Intervention & Predictive Analytics (taking more financial risk and managing it proactively): Analytic motive expands to address diagnosis-based, fixed-fee per capita reimbursement models. Focus expands from management of cases to collaboration with clinician and payer partners to manage episodes of care, using predictive modeling, forecasting, and risk stratification to support outreach, triage, escalation and referrals. Physicians, hospitals, employers, payers and members/patients collaborate to share risk and reward (e.g., financial rewards to patients for healthy behavior).

Figure 11: BI Implementation Solution

Patients who are unable or unwilling to participate in care protocols are flagged in registries. Data content expands to include home monitoring data, long-term care facility data, and protocol-specific patient-reported outcomes. On average, the EDW is updated within one hour or less of source system changes.

Level 8 - Personalized Medicine & Prescriptive Analytics (contracting for and managing health): Analytic motive expands to wellness management, physical and behavioral functional health, and mass customization of care. Analytics expands to include natural language processing of text, prescriptive analytics, and interventional decision support. Prescriptive analytics are available at the point of care to improve patient-specific outcomes based upon population outcomes. Data content expands to include 7x24 biometrics data, genomic data and familial data. The EDW is updated within a few minutes of changes in the source systems.

D. BUSINESS INTELLIGENCE FOR HEALTHCARE

Organizations shopping for healthcare business intelligence (BI) solutions will find a multitude of available healthcare analytics options. These tend to fall into three primary categories:

a. Single-solution apps, a.k.a. one-solution wonders: Often called best-of-breed or point solutions, these apps focus on a single goal and a single slicing of the data. Ministry Health Care's CIO, Will Weider, describes them as "extract-ware" because they receive an extract of your data with the promise of allowing you to analyze and monitor that data. The good part: these single-solution apps are very focused on solving a single problem, and can be straightforward to implement because they focus narrowly on a limited slice of data. Investing in one of these apps can have an affordability appeal as well. The downside to single-solution apps is that each solves for just one thing, for example controlling supply costs. The functionality provided by these types of solutions is limited to the out-of-the-box capabilities of the app. In healthcare, problems and solutions are often multifaceted, which exceeds the capabilities of most single-solution apps. More complex concerns and analyses aren't possible because one-solution wonders don't integrate with other one-solution wonders. So if you want to know how the size of your staffing and patient loads affects the hospital's supply consumption, you'll likely need an app that focuses on supply consumption and another one that looks at staffing and patients. Then you'll also need to bring in an analyst to take the information from both apps, determine how it relates, digest it, and offer his or her best guess as to what it all means to your business.

b. Build-it-yourself solutions: Organizations looking to solve very specific problems with the tools they have on hand often opt to build solutions themselves. The benefit: the organization knows exactly what it has to work with and, with the right people, can sometimes create a solution that enlists available data and produces answers to previously identified questions. Internally developed solutions can come with a low upfront cost commitment when created by teams already employed by the organization. And rather than waiting for the solution to be finished before testing it out, internal teams have the benefit of running it by key decision makers for feedback and then implementing that feedback mid-development, before the solution is ready for roll-out. Concerns with taking the DIY route, however, revolve around expertise and total cost of ownership; while the teams building these solutions may be intimately connected with a hospital's IT system and/or existing data and needs, developing complex systems that take all of this into consideration may prove to be a costly, time- and resource-intensive process. If a DIY approach is initially successful, scaling that team to meet enterprise-wide demand can be a real challenge. IT teams that traditionally take a strict waterfall project management approach may also lack the agility necessary to adapt to rapidly changing vocabularies, standards and later healthcare use cases. Inability to deliver continuing value can lead to project delays, unmet expectations and overall frustration with the costs that have gone into an internal development effort.

c. Enterprise data warehouse platform solutions: Designed for enterprise-level operation, platform solutions offer the greatest amount of analytic flexibility and adaptability.
A healthcare enterprise data warehouse (EDW) platform serves as the foundation to run analytics applications and execute an analytics strategy for years to come. For large healthcare organizations, it is impossible to provide reliable and repeatable reporting and analysis without aggregating data, typically found in hundreds of different technology solutions from various vendors, into a single source of organizational truth. An EDW serves as an analytic foundation on which organizations can build registries, drive reporting, enable population health, and model clinical and financial risk. This approach is best suited to a data-driven culture that values analytics as a business differentiator. Organizations with a commitment to a higher degree of data literacy and data management skills are very successful with a data warehouse. Across all industries, the average ROI from successful EDW projects is 431 percent. With the right architecture in place, healthcare EDW implementations can generate similarly impressive returns. Challenges associated with adopting an EDW often include the perception that EDWs come with a slow initial time to value. Note, however, that not all EDWs are plagued by this; EDWs using a Late-Binding architecture, for example, have a short time to value, often spanning weeks rather than months or years. A vendor's implementation approach can also greatly affect time to value. For example, a vendor that takes an iterative route will often produce results faster than a vendor who wants to tackle multiple projects and data sets right off the block.

E. THE EDW ARCHITECTURE AND BI FOR HEALTHCARE

First, a well-implemented data warehouse should support a large number of business intelligence use cases. For example, if an organization's quality data warehouse, research data warehouse and operational data warehouse each loads data in its own way, the result will be multiple versions of the truth, frustration and a waste of valuable resources. The underlying data should be consistent for quality, operations, research or any other analytic needs, and the flexibility to mix and match data sources and data elements from different feeds for different purposes should be a design goal of any healthcare data warehouse.

Figure 12: EDW and BI for Healthcare - implementation process for the short and long term

Second, a good data warehouse should ensure appropriate data access and security. There are valid organizational reasons to be wary of granting research users access to an enterprise data warehouse, for example. The data warehouse should adhere to HIPAA's strict rules regarding the use of medical data for research. In order to support a diverse and thriving user base, a data warehouse must have a good governance structure and an architecture that supports sophisticated security rules.

One type of EDW architecture is early binding. Prevalent in several industries, these traditional data warehouses extract data from source systems and bind it to business rules. This platform architecture applies business rules or data-cleansing routines very early in the data warehouse development lifecycle. Early-binding approaches, utilizing enterprise data models, are appropriate for business rules or vocabularies that change infrequently, or in cases where the organization needs to lock down data for consistent analytics. In practice, however, the decision to bind early can have a significant, often negative impact on the success of data warehousing projects, particularly in healthcare.

Because of the unique requirements of healthcare analytics, a particularly effective warehousing approach for the healthcare industry is the Late-Binding architecture. This platform architecture delays the application of business rules (such as data cleansing, normalization and aggregation) to data for as long as possible, so organizations have time to review and revise data, form hypotheses and determine optimal analytic uses. Late binding is especially well suited to what-if scenario analysis, ever-changing healthcare data and evolving use cases. Ideally suited to healthcare, this approach delivers:
- Data modeling flexibility: Late-Binding data warehouse architecture leverages the natural data models of the source systems by reflecting much of the same data modeling in the data warehouse.
- Data flexibility: Because the EDW does not bind data from the outset into a comprehensive enterprise model, an organization can use that data as needed to create analytic applications with the platform. For example, an analytic application for quality improvement might contain clinical data, patient satisfaction data and costing data. An application for operational purposes might contain staffing levels and clinical data. An analytic application for research might combine clinical outcomes data with a research registry. The more data an organization feeds into the warehouse, the more options it has for using the data. The organization loads the most useful sources of data first and then follows a road map to incorporate more data sources over time.
- Changes saved: Late-Binding architecture retains a record of the changes to vocabulary and rule bindings in the data models of the data warehouse. This provides a self-contained configuration control history that can be invaluable for conducting retrospective analysis and powering forecasting and predictive analytics.
- Iterative approach: Late-Binding architecture allows for the delivery of project-based value more rapidly by making it easier to break what is often detailed, high-intensity technical work into manageable chunks.
Successful iterations early on in the project allow users to build momentum, celebrate success sooner, and realize success before committing additional resources and/or embarking on the next project.
- Granular security: The Late-Binding architecture's security infrastructure keeps data secure while enabling appropriate access for different types of users. For example, the system could be set up to grant a researcher access only to data marts that have been de-identified. Additionally, researchers approved for access to patient data could be granted access only to the specific data about patients in their study. Late-Binding architectures also support a flexible security model for adding users to or removing users from security groups that map to particular data marts. Researchers might be limited to de-identified data only, finance users could have limited access to clinical data, and clinical departments may have only limited access to financial data.

F. WHERE HEALTHCARE BUSINESS INTELLIGENCE AND ANALYTICS STAND TODAY

The lack of a BI strategy is one of nine fatal flaws in business operations improvement (BOI) in healthcare. The biggest flaw of all is the lack of a documented BI strategy, or the use of a poorly developed or poorly socialized one.

G. DIFFERENT SORTS OF INPUTS IN A HOSPITAL
- Doctor
- Patient
- Subscriber
- Symptom
- Appointment
- Bill
- Payment
- Insurance Company

Figure 13: Hospital Information Model

[Figure 13 shows these entities and their relationships: Patient (Subscriber or Non-Subscriber), Doctor, Symptoms, Diagnosis, Appointment, Bill, Payments 1..n, Insurance Company.]
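The entities in the hospital information model above can be sketched as simple record types. The Python dataclasses below are a hypothetical rendering of Figure 13, not the paper's actual data model: a Patient may or may not be a Subscriber to an insurance company, an Appointment links a patient to a doctor, symptoms and a diagnosis, and a Bill is settled by one or more Payments.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class InsuranceCompany:
    name: str

@dataclass
class Patient:
    name: str
    # A subscriber (insured) patient is linked to an insurance company;
    # a non-subscriber has no insurer and pays out of pocket.
    insurer: Optional[InsuranceCompany] = None

@dataclass
class Doctor:
    name: str
    specialty: str

@dataclass
class Appointment:
    patient: Patient
    doctor: Doctor
    symptoms: List[str]
    diagnosis: Optional[str] = None

@dataclass
class Payment:
    amount: float

@dataclass
class Bill:
    appointment: Appointment
    total: float
    # A bill may be settled by several payments (Payment 1 .. Payment n).
    payments: List[Payment] = field(default_factory=list)

    def outstanding(self) -> float:
        return self.total - sum(p.amount for p in self.payments)
```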

V. CONCLUSION

Let us finish with the following three points. First, we found that Business Intelligence in general is very important for any organization, because it comprises the set of techniques and tools for the acquisition and transformation of raw data into information useful to the organization. Next, the BI process is followed by Data Analytics, which is actually a subset of BI and uses the processed data from BI for statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive the process of decision making. Finally, we observed that BI and Data Analytics in healthcare is the answer that hospitals are looking for as they move to data-driven healthcare improvements and cost reductions built on a data warehouse.

ACKNOWLEDGEMENT

I would like to express my deepest gratitude to my professor, Joseph Morabito, for his full support, expert guidance, understanding and encouragement throughout my study and research. Without his incredible patience and timely wisdom and counsel, my thesis work would have been a frustrating and overwhelming pursuit.

REFERENCES
- https://en.wikipedia.org/wiki/Business_intelligence
- https://en.wikipedia.org/wiki/Data_analysis
- https://www.healthcatalyst.com/healthcare-analytics-adoption-model/
- https://www.healthcatalyst.com/healthcare-business-intelligence-data-warehouse
- http://searchbusinessanalytics.techtarget.com/report/Business-intelligence-in-healthcare-Special-report