Thursday, January 30, 2020

Summary The Health Care Quality Book Essay Example for Free

Chapter 1: Science and knowledge foundation
Two notable contributions to the industry from the Journal of the American Medical Association:
1. An assessment of the state of quality → serious and widespread quality problems.
2. A categorization of three defects:
   a. Underuse: many scientifically sound practices are not used as often as they should be.
   b. Overuse: seen in areas such as imaging studies for diagnosis of acute asymptomatic low back pain, or prescription of antibiotics when not indicated for an infection.
   c. Misuse: the proper clinical care process is not executed appropriately, such as giving the wrong drug to a patient.
To Err Is Human: a publication that showed the severity of the quality problems in a way that captured the attention of all key stakeholders for the first time → this report spoke about the negative side, not about how care should be improved.
Crossing the Quality Chasm: provided a blueprint for the future that classified and unified the components of quality through six aims for improvement, the chain of effect, and simple rules for the redesign of health care.
Six dimensions of quality (Berwick; IOM = Institute of Medicine), each with example outcome measures and goals:
- Safe: measured by overall mortality rates or the percentage of patients experiencing adverse events or harm.
- Effective: science and evidence should be applied and serve as the standard for the delivery of care. How well are evidence-based practices followed? For example, the percentage of time diabetic patients receive all recommended care at each doctor visit.
- Efficient: care and service should be cost-effective, and waste should be removed. Analyzing the costs of care by patient, organization, provider, or community.
- Timely: no waits or delays in receiving care. Measured by waits and delays in receiving needed care, services, and test results.
- Patient centered: the system should revolve around the patient, respect the patient's preferences, and put the patient in control. Measured by patient or family satisfaction with care and service.
- Equitable: disparities should be eradicated. Examining differences in quality measures by race, gender, income, or other factors.
The underlying framework for achieving these aims depicts the health care system in four levels:
- Level A: what happens with the patient.
- Level B: the microsystem where care is delivered by small provider teams.
- Level C: the organizational level: the macrosystem, or aggregation of the microsystems and supporting functions.
- Level D: the external environment, where payment mechanisms, policy, and regulatory factors reside.

Chapter 2: Basic concepts of health care quality
The following attributes are relevant to the definition of quality of care:
- Technical performance → how well current scientific medical knowledge and technology are applied in a given situation (usually assessed in terms of the timeliness and accuracy of the diagnosis and the appropriateness of therapy).
- Management of the interpersonal relationship → how well the clinician relates to the patient on a human level.
The quality of this relationship is important for two reasons: by establishing a good relationship with the patient, the clinician is able to fully address the patient's concerns, reassure the patient, and relieve the patient's suffering; and it can affect technical performance, because the clinician is better able to elicit from the patient a more complete and accurate medical history, which can result in a better diagnosis.
- Amenities → the characteristics of the setting in which the encounter between patient and clinician takes place, such as comfort, convenience, and privacy. Amenities are valued both in their own right and for their effect on the technical and interpersonal aspects of care; they can also yield benefits that are more indirect.
- Access → the degree to which individuals and groups are able to obtain needed services.
- Responsiveness to patient preferences → respect for patients' values, preferences, and expressed needs affects quality of care as a factor in its own right.
- Equity → findings that the amount, type, or quality of health care provided can be related systematically to an individual's characteristics, particularly race and ethnicity, rather than to the individual's need for care or health care preferences, have heightened concern about equity in health care. Medicine does not fulfill its function adequately until the same perfection is within the reach of all individuals.
- Efficiency → how well resources are used in achieving a given result.
- Cost-effectiveness → how much benefit, typically measured in terms of improvement in health status, an intervention yields for a particular level of expenditure.
For each stakeholder in health care, quality can be defined differently (pages 30-31). These definitions have a great deal in common; each emphasizes different aspects of care, and they conflict only in relation to cost-effectiveness.
All evaluations of quality of care can be classified in terms of one of three aspects of caregiving they measure:
- Structure: when quality is measured in terms of structure, the focus is on the relatively static characteristics of the individuals who provide care and of the settings where the care is delivered. These characteristics include the education, training, and certification of professionals.
- Process: what takes place during the delivery of care can also be the basis for evaluating quality of care.
- Outcomes: outcome measures, which capture whether health care goals were achieved, are another way of assessing quality of care. Outcome measures have to include the costs of care as well as patients' satisfaction with care.
Which one is better to use? → None of them; it all depends on the circumstances. To assess quality using structure, process, or outcome measures, we need to know what constitutes good structure, good process, and good outcomes. We need criteria and standards we can apply to those measures of care:
- Criteria = the specific attributes that are the basis for assessing quality.
- Standards = express quantitatively what level the attributes must reach to satisfy preexisting expectations about quality.
For example (type of measure: structure; focus: primary care group practice) → Criterion: percentage of physicians board certified in internal or family medicine. Standard: 100% of physicians in the practice must be board certified in internal or family medicine.
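To make the criterion/standard distinction concrete, here is a minimal sketch (not from the book) in Python. The roster, field names, and numbers are hypothetical and used only for illustration: the criterion is the attribute being measured (a percentage), while the standard is the threshold it must reach.

```python
# Minimal sketch of a criterion vs. a standard for a structure measure.
# The roster data and field names below are invented for illustration only.

def percent_board_certified(physicians):
    """Criterion: share of physicians board certified in internal or family medicine."""
    certified = sum(1 for p in physicians if p["board_certified"])
    return 100.0 * certified / len(physicians)

def meets_standard(observed_percent, standard_percent=100.0):
    """Standard: the quantitative level the criterion must reach (here, 100%)."""
    return observed_percent >= standard_percent

# Hypothetical practice roster
roster = [
    {"name": "A", "board_certified": True},
    {"name": "B", "board_certified": True},
    {"name": "C", "board_certified": False},
]

observed = percent_board_certified(roster)  # criterion value: about 66.7
print(f"{observed:.1f}% certified -> meets standard: {meets_standard(observed)}")
```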
Optimal standards denote the level of quality that can be reached under the best conditions, typically conditions similar to those under which efficacy is determined → useful as a reference point.
Structural measures are well suited to detecting a lack of capacity to deliver care of acceptable quality. They are also only as good and useful as the strength of their relation to desired processes and outcomes. To evaluate structure, process, and outcome measures, criteria and standards are essential. Whereas the formulation of criteria is expected to be evidence driven (efficacy), the setting of standards is not similarly tied to the scientific literature. The decision to set standards at a minimal, ideal, or achievable level is most meaningful if driven by the goals behind the specific quality-of-care evaluation for which the standards are to be used.

Chapter 3: Variation in medical practice and implications for quality
Variation → the difference between an observed event and a standard or norm. Without this standard, or best practice, measurement of variation offers little beyond a description of the observations.
Random variation = a physical attribute of the event or process; it adheres to the laws of probability and cannot be traced to a root cause. It is not worth studying in detail.
Assignable variation = arises from a single cause or a small set of causes that are not part of the event or process and therefore can be traced, identified, and eliminated → subject to potential misunderstanding because of the complexity of design and interpretation.
1. Process variation = the difference in procedure throughout an organization (e.g., the use of various screening methods for colorectal cancer). Technique → the multitude of ways in which a procedure can be performed within the realm of acceptable medical practice.
2. Outcome variation = the difference in the result of a single process (most measurement focuses on this). Identifying the process yielding optimal results → outcomes research.
3. Performance variation = the difference between any given result and the optimal or ideal result. This threshold, or best practice, is the standard against which all other measurements of variation are compared. Performance variation tells us where we are and how far we are from where we want to be, and suggests ways to achieve the desired goal.
Can variation be desirable? → A successful procedure that differs from other, less successful procedures is by definition variation. The objective for quality improvement is therefore not simply to identify variation but to determine its value. How can variation be eliminated or reduced in ways that focus on the variation rather than on the people involved? Understanding the implications of variation in medical practice for quality is thus not simply learning how to eliminate variation but learning how to improve performance by identifying and accommodating good or suboptimal variation from a predefined best practice.
Variability plays a role in identifying, measuring, and reporting quality indicators (effective, efficient, equitable, and so on) and process-of-care improvements. Some hospitals are reluctant to use quality improvement measures because they perceive them as biased toward academic medical research centers or large health care organizations → untrue! Quality improvement efforts can be, and have been, successfully applied to small organizations and practices.
The size of an organization also affects its ability to disseminate best practices. Large organizations tend to have rigid frameworks or bureaucracies; change is slow and requires perseverance and the ability to make the value of the new procedure clear to skeptics and enthusiasts, both in their group and across the system. An organization's commitment to paying for quality improvement studies and implementation is equally affected by its size and infrastructure, but there are minimum standard levels of quality, and reimbursement schemes linked to achieving goals, established by the Joint Commission, CMS, and Medicare → all organizations are obligated to meet these standards.
Quality improvement efforts must consider the organizational mind-set, administrative and physician worldviews, and patient knowledge and expectations. Physician buy-in is critical to reducing undesired variation and creating new and successful preventive systems of clinical care; therefore, train physician champions and encourage them to serve as models, mentors, and motivators, which also reduces the risk of alienating the key participants in quality improvement efforts. Patient education in quality of care is equally subject to variation; patients are aware of the status of health care providers in terms of national rankings, public news of quality successes, and so on. Educating patients about a health care organization and its commitment to quality makes variation and process-of-care measures available to the public.
Organizational mind-set → organizational infrastructure is an essential component in minimizing variation, disseminating best practices, and supporting a research agenda associated with quality improvement. Economic incentives may be effective in addressing variation in health care by awarding financial bonuses to physicians and administrators who meet quality targets, or by withholding bonuses from those who do not. The goals of incentives are to help people understand that their organization is serious about implementing quality changes and minimizing unwanted variation, to ensure alignment with national standards and directions in quality of care, and to encourage them to use the resources of the organization to achieve this alignment.

Chapter 4: Quality improvement: the foundation, processes, tools, and knowledge transfer techniques
Different leaders of quality improvement systems: pages 63-67.
Quality improvement approaches (derivatives and models of the ideas and theories developed by thought leaders): PDCA/PDSA, the Associates for Process Improvement's Model for Improvement, FOCUS-PDCA, the Baldrige criteria, ISO 9000, Lean, and Six Sigma.

PDCA/PDSA cycle
The basis for planning and directing performance improvement efforts.
1. Plan: Objective: what are you trying to accomplish? What is the goal? Questions and predictions: what do you think will happen? Plan to carry out the cycle: who? what? when? where?
2. Do: Educate and train staff. Carry out the plan (try out the change on a small scale). Document the problems and unexpected observations.
Begin analysis of the data.
3. Study/Check: Assess the effect of the change and determine the level of success compared to the goal or objective. Compare results to predictions. Determine what changes need to be made and what actions will be taken next.
4. Act: Act on what you have learned. Determine whether the plan should be repeated with modifications or a new plan should be created. Perform the necessary changes. Identify remaining gaps in process or performance. Carry out additional PDCA/PDSA cycles until the agreed-upon goal or objective is met.

API improvement model
A simple model for improvement based on Deming's PDSA cycle. The model contains three fundamental questions that form the basis of improvement: What are we trying to accomplish? How will we know that a change is an improvement? What change can we make that will result in improvement?

FOCUS-PDCA model
Building on the PDCA cycle, the FOCUS-PDCA model was created as a more specific and defined approach to process improvement. The key feature of this model is the preexistence of a process that needs improvement. The intent of this model is to maximize the performance of a preexisting process, although the inclusion of PDCA provides the option of using the model for a new or redesigned process.
F: FIND a process to improve.
O: ORGANIZE a team that knows the process.
C: CLARIFY current knowledge of the existing or redesigned process.
U: UNDERSTAND the variables and causes of process variation within the chosen process.
S: SELECT the process improvement and identify the potential action for improvement.

Baldrige criteria
The criteria can be used to assess performance on a wide range of key indicators: health care outcomes; patient satisfaction; and operational, staff, and financial indicators. The Baldrige health care criteria are built on a set of interrelated core values and concepts (page 70). The criteria are organized into seven interdependent categories:
- Leadership
- Strategic planning
- Focus on patients, other customers, and markets
- Measurement, analysis, and knowledge management
- Staff focus
- Process management
- Organizational performance results
Baldrige's scoring system is based on a 1,000-point scale. Each of the seven categories is assigned a maximum value ranging from 85 to 450 points. The most heavily weighted category is results (450 points). The weight of this category reflects the emphasis Baldrige places on results and on an organization's ability to demonstrate performance and improvement in the following areas: product and service outcomes, customer-focused outcomes, financial and market outcomes, workforce-focused outcomes, process effectiveness outcomes, and leadership outcomes.

ISO 9000
The International Organization for Standardization (ISO) issued the original 9000 series of voluntary technical standards in 1987 to facilitate the development and maintenance of quality control programs in the manufacturing industry. In 2000, ISO made major changes to the standards to make them more relevant to service and health care settings.
Focused more on quality management systems, the process approach, and the role of top management, the most recent standards include eight common quality management principles:
- Customer-focused organization
- Leadership
- Involvement of people
- Process approach
- System approach to management
- Continual improvement
- Factual approach to decision making
- Mutually beneficial supplier relationships

Lean thinking
Lean → describes production methods and product development that, compared to traditional mass production processes, produce more products, with fewer defects, in a shorter time. The focus of the Lean methodology is a "back to basics" approach that places the needs of the customer first through the following five steps:
1. Define value as determined by the customer, identified by the provider's ability to deliver the right product or service at an appropriate price.
2. Identify the value stream: the set of specific actions required to bring a specific product or service from concept to completion.
3. Make value-added steps flow from beginning to end.
4. Let the customer pull the product from the supplier, rather than pushing products.
5. Pursue perfection of the process.

Six Sigma
The aim of Six Sigma is to reduce variation (eliminate defects) in key business processes. By using a set of statistical tools to understand the fluctuation of a process, management can predict the expected outcome of that process. Six Sigma includes five steps, commonly known as DMAIC:
- Define: Identify the customers and their problems. Determine the key characteristics important to the customer, along with the processes that support those key characteristics. Identify existing output conditions along with process elements.
- Measure: Categorize key characteristics, verify measurement systems, and collect data.
- Analyze: Convert raw data into information that provides insight into the process, including identifying the fundamental and most important causes of the defects or problems.
- Improve: Develop solutions to the problem, and make changes to the process. Measure process changes and judge whether the changes are beneficial or another set of changes is necessary.
- Control: If the process is performing at a desired and predictable level, monitor the process to ensure that no unexpected changes occur.
The primary premise of Six Sigma is that a focus on variation reduction will lead to more uniform process output. Secondary effects include less waste, less throughput time, and less inventory.

Quality tools: three categories (page 74 also distinguishes six categories)
Basic quality tools:
- Control chart: upper and lower control boundaries define the limits of common-cause variation. It is used to monitor and analyze variation from a process to determine whether that process is stable and predictable or unstable and not predictable (a short illustrative sketch appears below).
- Histogram
- Cause-and-effect/fishbone diagram: the problem is stated on the right side of the chart, and likely causes are listed around major headings that lead to the effect. It can help organize the causes contributing to a complex problem.
- Pareto chart: 80% of the variation of any characteristic is caused by only 20% of the possible variables.
Management and planning tools (page 75):
- Affinity diagram: a list of ideas is created, and then individual ideas are written on small note cards. Team members study the cards and group the ideas into common categories. The affinity diagram is a way to create order out of a brainstorming session.
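The following is a minimal Python sketch (not from the book) of the control-chart idea described above: points outside the control limits suggest assignable (special-cause) variation, while points inside reflect common-cause variation. As a simplification it approximates the limits as the mean plus or minus three standard deviations of the data; standard SPC charts (for example, XmR charts) derive their limits from moving ranges or subgroup statistics instead, and the daily error counts below are invented.

```python
# Minimal control-chart sketch: flag points outside approximate 3-sigma limits.
# Data and the limit calculation are illustrative, not a full SPC implementation.
from statistics import mean, pstdev

def control_limits(observations):
    center = mean(observations)
    sigma = pstdev(observations)
    return center - 3 * sigma, center, center + 3 * sigma

def flag_special_causes(observations):
    lcl, _, ucl = control_limits(observations)
    return [(i, x) for i, x in enumerate(observations) if x < lcl or x > ucl]

# Hypothetical daily counts of medication errors caught
daily_errors = [4, 5, 3, 6, 4, 5, 4, 14, 5, 3, 4, 6]
lcl, center, ucl = control_limits(daily_errors)
print(f"LCL={lcl:.1f}  center={center:.1f}  UCL={ucl:.1f}")
print("Out-of-control points:", flag_special_causes(daily_errors))
```

Run against these hypothetical counts, the single spike of 14 errors falls above the upper control limit and would be investigated for a root cause, which ties back to the assignable variation of Chapter 3; the remaining points stay within the limits and reflect common-cause variation.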
- Matrix diagram: helps to answer two important questions when sets of data are compared: Are the data related? How strong is the relationship?
- Priorities matrix: uses a series of planning tools built around the matrix chart.
Other quality tools:
- Benchmarking: compares the processes and successes of your competitors or of similar top-performing organizations to your current processes in order to define, through gap analysis, process variation and organizational opportunities for improvement. Benchmarking identifies not only organizations that perform better but also how they perform better.
- Failure mode and effect analysis (FMEA): examines potential problems and their causes and predicts undesired results. FMEA is normally used to predict product failure from past part failure, but it can also be used to analyze future system failures → both uses belong in the patient safety toolbox.
- 5S: a systematic program that helps workers take control of their workspace so that it actually works for them instead of being a neutral or, as is quite common, competing factor. Sort: keep only necessary items. Straighten: arrange and identify items so they can be easily retrieved when needed. Shine: keep items and workspaces clean and in working order. Standardize: use best practices consistently. Sustain: maintain the gains and make a commitment to continue the first four S's.
Theory of Transfer of Learning → page 77.
Rapid cycle testing/improvement: developed by IHI, rapid cycle testing/improvement was designed to run many small tests involving small sample sizes, using multiple PDSA cycles that build on the lessons learned over a short period while gaining buy-in from the staff involved in the change. It is designed to reduce the cycle time of new process implementation from months to days. Read pages 78-81.

Chapter 5: Milestones in the quality measurement journey
Many health care providers struggle to address the measurement mandate proactively, which leads organizations to assume a defensive posture when external organizations release the data. In such cases, the provider usually responds in one of the following ways: the data are old; the data are not stratified and do not represent appropriate comparisons; our patients are sicker than those in other hospitals. A more proactive posture would be to develop an organization-wide approach to quality measurement that meets both internal and external demands. This approach is not a task but a journey that has many potential pitfalls and detours. Key milestones exist that mark your progress and chart your direction.
Milestone 1: Develop a measurement philosophy (strategic step). What is, or should be, the role of performance measurement in the organization? Should it be done periodically or as a day-to-day function? The first step toward this milestone should be the creation of an organizational statement on the role of measurement. Three simple questions should be explored when developing a measurement philosophy: 1. Do we know our data better than anyone else does? 2. Do we have a balanced set of measures that encompasses clinical, operational, customer service, and resource allocation? 3. Do we have a plan for using the data to make improvements?
Milestone 2: Identify the concepts to be measured (types and categories of measures) (strategic and operational step). The second milestone consists of deciding which concepts the organization wishes to monitor.
There are three basic categories of measures:
- Structure (S): represents the physical and organizational aspects of the organization.
- Process (P): every activity, every job, is part of a process.
- Outcome (O): structures combine with processes to produce outcomes.
The relationship between these categories is usually shown as S + P = O.
Another, more specific, categorization follows the six aims for improvement: 1 Safe, 2 Effective, 3 Patient centered, 4 Timely, 5 Efficient, 6 Equitable. Regardless of the method used, an organization must decide which concepts, types, or categories of measures it wishes to track.
Milestone 3: Select specific measures. What aspect of (for example) patient safety do we want to measure? What specific measures could we track? Choose a specific indicator. In this step you specify what aspect of patient safety you intend to measure and the actual measures. Within patient safety, you could focus on medication errors, patient falls, wrong-site surgeries, and so on. Within medication errors, you can measure different things: the number of medication orders that had an error, the total number of errors caught each day, the percentage of orders with an error, and so on.
Milestone 4: Develop operational definitions for each measure. An operational definition is a description, in quantifiable terms, of what to measure and the specific steps needed to measure it consistently. A good operational definition: gives communicable meaning to a concept or an idea; is clear and unambiguous; specifies the measurement method, procedures, and equipment; provides decision-making criteria when necessary; and enables consistency in data collection. The problem created by poor operational definitions should be obvious: if you do not use the same operational definition each time you record and plot data on a chart, you will either miss a true change in the data or think a change has occurred when in fact one has not. Using the same operational definition becomes even more critical if you are trying to compare several hospitals or clinics in a system.
Milestone 5: Develop a data collection plan and gather data (giving special consideration to stratification and sampling). Starting directly with data collection may cause teams to collect the wrong data in the wrong amounts. The data collection phase consists of two parts: planning for data collection (what process will be monitored? what specific measures will be collected? what are the operational definitions of the measures?) and the actual data gathering (how will you collect the data? will you conduct a pilot study? who will collect the data? — page 94). Once you have resolved these issues, the data collection should go smoothly. Sometimes improvement teams do not spend enough time on data collection plans. This can lead to the following problems: (1) collecting too much or too little data, (2) collecting the wrong data, or (3) becoming frustrated with the entire measurement journey. The consequences can be that the team tends to (1) distort the data, (2) distort the process that produced the data, or (3) kill the messenger. Two key data collection skills, stratification and sampling, enhance any data collection effort.
Stratification = the separation and classification of data into reasonably homogeneous categories. The objective of stratification is to create strata, or categories, within the data that are mutually exclusive and facilitate discovery of patterns that would not be observed if the data were aggregated.
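As a minimal illustration of stratification (not from the book), the Python sketch below splits invented fall counts by a hypothetical "shift" stratum; the aggregate rate hides a pattern that the strata reveal. All field names and numbers are made up for illustration.

```python
# Minimal stratification sketch: compare an aggregate rate with per-stratum rates.
# Records and the "shift" stratum are hypothetical.
from collections import defaultdict

# Hypothetical records: (shift, number_of_falls, patient_days)
records = [
    ("day",   2, 900), ("day",   3, 950),
    ("night", 9, 400), ("night", 8, 420),
]

def fall_rate(falls, patient_days):
    return 1000.0 * falls / patient_days  # falls per 1,000 patient-days

totals = defaultdict(lambda: [0, 0])
for shift, falls, days in records:
    totals[shift][0] += falls
    totals[shift][1] += days

aggregate_falls = sum(f for _, f, _ in records)
aggregate_days = sum(d for _, _, d in records)
print(f"Aggregate: {fall_rate(aggregate_falls, aggregate_days):.1f} per 1,000 patient-days")
for shift, (falls, days) in totals.items():
    print(f"{shift:>5}: {fall_rate(falls, days):.1f} per 1,000 patient-days")
```

With these invented numbers the aggregate rate looks moderate, while the night-shift stratum is several times higher than the day shift; that is exactly the kind of pattern stratification is meant to surface, and the same strata could later guide stratified sampling.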
Stratification allows understanding of differences in the data caused by different factors (page 95). If you do not think about how these factors could influence your data, you run the risk of drawing incorrect conclusions and having to filter out the stratification effect manually after you have collected the data.
Sampling → the most important thing you can do to reduce the amount of time and resources spent on data collection. There are four conditions for developing a sampling plan: accuracy, reliability, speed, and economy. Sampling consists of a series of compromises and trade-offs. The basic purpose of sampling is to draw a limited number of observations and be reasonably confident that they represent the larger population from which they were drawn. There are two basic approaches to sampling:
- Probability sampling techniques: based on statistical probability (systematic sampling, simple random sampling, stratified random sampling, stratified proportional random sampling).
- Non-probability sampling techniques: should be used when estimating the reliability of the selected sample, or generally applying the results of the sample to a larger population, is not the principal concern. The basic objective of this type of sampling is to select a sample that the researchers believe is typical of the larger population (convenience sampling, quota sampling, and judgment sampling). Pages 99-102.
Milestone 6: Analyze the data using statistical process control methods (especially run and control charts). Translate data into information.
Milestone 7: Use the analytic results to take action (implement cycles of change, test theories, and make improvements).

Chapter 6: Data collection
Quality measurements can be grouped into four categories: clinical quality, financial performance, patient satisfaction, and functional status. To report on each of these categories, several separate data sources may be required. The challenge is to collect as much data as possible from the fewest sources, with the objectives of consistency and continuity in mind.
Retrospective data collection: involves identification and selection of a patient's medical record or group of records after the patient has been discharged.
Prospective data collection: also relies on medical record review, but it is completed during a patient's hospitalization or visit rather than retrospectively. Disadvantages: it is time consuming and can distract nurses from their direct patient care responsibilities, it is expensive, and it usually requires a full-time data analyst.
Sources of data for quality improvement:
- Administrative databases: information collected, processed, and stored in automated information systems. An excellent source of data for reporting on clinical quality, financial performance, and certain patient outcomes. Advantages: a less expensive source of data; they incorporate transaction systems; most of the embedded code sets are standardized; the databases are staffed by skilled individuals; the volume is great; and data reporting tools are available. Disadvantages: some argue that these data are less reliable than data gathered by chart review.
- Patient surveys: especially useful when teams are interested in the perceptions of patients, either in terms of the quality of care or the quality of service provided. A team can design the survey itself, hire an expert to design a survey, or purchase an existing survey or survey service.
- Functional status surveys: functional status is usually measured before and at several points following the treatment or procedure
(for example, a baseline before a knee procedure, with assessments made at regular intervals after the surgery).
- Health plan databases: an excellent source of data for quality improvement projects, particularly projects that have a population health management focus. These databases are valuable because they contain detailed information on all care received by health plan members. They provide a comprehensive record of patient activity and can be used to identify and select patients for enrollment in disease management programs. Used properly, they are a rich source of data for population management, disease management, and quality improvement projects. Limitations: considerations include accuracy, detail, and timeliness; recoding may make some data inaccurate, and they do not contain detailed information on the outcomes of care.
- Patient registries: a powerful source of quality improvement data. Advantages: a rich source of information because they are customized; they can collect all the data that the physician or health system determines are most important; they can be used for quality improvement; they are not subject to the shortcomings of administrative or health plan databases; and collection techniques can be combined to provide a complete picture of the patient experience. They are versatile and flexible.
Example case study in clinical reporting: pages 123-127.
Conclusion: there are many sources and data collection approaches from which to choose. Rarely does one method serve all purposes, so it is important to understand the advantages and disadvantages of each method; a combination is also possible. Knowledge of the different sources and techniques will help you use data more effectively and efficiently in your clinical improvement effort.

Chapter 7: Statistical tools for quality improvement
Three fundamental purposes of performance measurement: assessment of current performance (identify strengths and weaknesses of current processes), demonstration and verification of performance improvement, and control of performance.
Performance measurement benefits organizations in several ways: it provides factual evidence of performance, promotes ongoing organizational self-evaluation and improvement, illustrates improvement, facilitates cost-benefit analysis, helps to meet external requirements and demands for performance evaluation, may facilitate the establishment of long-term relationships with various external stakeholders, may differentiate the organization from competitors, may contribute to the awarding of business contracts, and fosters organizational survival.

Chapter 13: Leadership for quality
Leadership = working with people and systems to produce needed change.
Individual leadership = what people must be, and what they must know how to do, if they are to influence others to bring about needed changes. Both being and doing are needed, especially when the changes required for quality improvement involve reframing core values or remaking professional teams. Many improvements in health care will require these kinds of deep changes in values. These changes are sometimes labeled transformational changes to distinguish them from transactional changes, which do not require changes in values and patterns of behavior.
Organizational leadership = creating a supportive organizational environment in which the work of hundreds of capable individual leaders can thrive.
One way to view this level (the system-of-leadership level) is as a complex set of interrelated activities in five broad categories:
- Set direction: every organization has a sense of direction, a future self-image. A leader should set that direction.
- Establish the foundation: leaders must prepare themselves and their leadership teams with the knowledge and skills necessary to improve systems and lead change (and reframe values).
- Build will: initiating and sustaining change takes will, which seems to be highly sensitive to discord and often grinds to a halt because of one loud voice opposing change → therefore, logical and quantitative links should be made between improvement and key business goals.
- Generate ideas: quality challenges require innovation. (Page 313.)

Implementing quality as the core organizational strategy
Implementing a culture that has quality improvement at its core is an important goal for providers who want to serve patients better, gain the support of health care providers, stay ahead of government regulation, meet consumers' demand for transparent information on quality and costs, and gain a competitive advantage in the marketplace. Recent history: many efforts have not resulted in the sustainable quality improvements that leaders hoped to see. A quality improvement strategy should start with leadership from the board of trustees, the CEO, and the executive team, but this is a challenge for health care organizations because of the many internal competing agendas, the rapidly changing environment, employees, and so on. The first step is to establish an organizational culture that will support the hospital on its journey to quality → the starting point is leadership.
Kaplan and Norton: Balanced Scorecard → this approach includes the perspective of the patient and family, internal processes such as clinical pathways, learning and growth opportunities that focus on employees, and financial performance. Role of leadership: leaders ask financial questions about market share, margins, and quality implications. They raise questions related to the satisfaction of their internal and external customers and the way in which business processes must change to improve and sustain quality. The primary focus is on creating a culture of quality.
Baldrige National Quality Program: creating the change toward quality starts with leadership.
Road map for change: an eight-stage change process, modified from Kotter's seminal work (Leading Change, 1996), serves as a realistic and viable framework to guide leaders who are managing a change to quality:
1. Unfreezing the old culture. This is the most difficult step because of culture's influence on employee behavior and some employees' desire to resist change and impede progress.
2. Forming a powerful guiding coalition.
3. Developing a vision and strategy.
4. Communicating the vision and strategy.
5. Empowering employees to act on the vision and strategy.
6. Generating short-term wins.
7. Consolidating gains and producing more change.
8. Refreezing new approaches in the culture.

Tuesday, January 21, 2020

Antibiotic Usage Essay -- Biology, DNA

This optimism had dissipated long before the end of the 20th century, when the proliferation of antibiotic-resistant bacteria became evident, as Fleming had predicted earlier. The rapid development of infectious diseases associated with antibiotic resistance forced us to change the way we view disease and the way we treat patients. However, antibiotic use has not been without consequence, and several factors have contributed to the development of resistance. Some resistance is due to spontaneous mutation, and these mutations confer resistance to select antibiotics, whilst other bacteria tend to acquire deoxyribonucleic acid (DNA) from counterparts that are already familiar with antibiotics (Mims, 2004; Tenover, 2006). Antibiotics remain effective against most bacterial infections; however, bacterial cells develop resistance and continue to divide, resulting in a resistant population, and some antibiotics are no longer effective against infectious diseases they could kill only a few years ago (Levy, 2000). Antibiotic resistance could lead to more infectious diseases that are hard to treat and could become a global threat, as mutation and evolutionary pressure continue to increase antibiotic resistance (Strelkauskas et al., 2010). Antibiotics are extremely important medicines, but bacterial resistance has become a problem around the world: people continue to travel, and modern technology and sociology have exacerbated the development of resistant strains, which are transferred from infected people and repeat the cycle as they move from one place to the next (WHO, 2001). Increased globalisation is responsible for resistance, in large overpopulat... ...odified penicillin binding proteins. Some resistant bacteria can be dangerous, for example MRSA and vancomycin-resistant Staphylococcus aureus, which are virulent human pathogens (Strelkauskas, 2010). Others resist by reducing the permeability of their membranes as a way of keeping out antibiotics, turning off the production of porins and other proteins (Weston, 2008), for example the multi-drug-resistant Mycobacterium tuberculosis. In pathogens such as Escherichia coli and Staphylococcus aureus, efflux pumps play a major role in multi-drug resistance; likewise, Klebsiella species are becoming resistant, and other bacteria produce extended-spectrum beta-lactamase enzymes (Livermore and Hawley, 2005; Tenover, 2006). Multi-drug-resistant Acinetobacter baumannii and bacteria carrying New Delhi metallo-beta-lactamase-1 (NDM-1) are also causing havoc in health care settings (HPA, 2010).

Monday, January 13, 2020

Impact of Computer on Our Society

Ilorin. Being a paper presented at the closing ceremony of the 1st computer training and issuance of certificates by Ascetic Computer Centre on 15th September 2007.

Introduction
At this closing ceremony of the 1st computer training and issuance of certificates by Ascetic Computer Centre, let us lift our eyes toward the challenges that await us in the years to come. It is our great good fortune, as organisers, that time and chance have put us not only at the edge of a chapter in the lives of these graduands, but on the edge of a bright new prospect in their affairs, a moment that will define their course, and their character, for many years to come. Guided by the ancient vision of a promised future, let us set our sights upon a set of graduands of new promise. However, our march to this new future seems less certain than it did yesterday. We must vow to set a clear course to renew our generation. We should thank the organisers for their vision towards the creation of new Nigerians who will appreciate problems, be exhilarated by challenges, and be strengthened by achievements; Nigerians with better employment opportunities, who will be job creators.
The computer has been gaining vast popularity globally in recent years. Its use, which extends information-processing capabilities, is influencing organizations of all types and sizes, bringing about changes in institutional goals, relations, and operations. A large percentage of the activities in any institution or organisation comprise the processing and communicating of information in the production and distribution processes. In developing countries, computers are becoming part of everyday activities because of the kind of information they generate and their speed of delivery (Award, 1988).

What is a Computer
A computer is an electronic device which accepts and processes data by following a set of instructions (PROGRAM) to produce an accurate and efficient result (INFORMATION). Since the ultimate aim of the computer is to produce information, the art of computing is often referred to as information processing. The value of the computer lies in its high speed (due to its electronic nature), its ability to store large amounts of data, and its unfailing accuracy and precision. These account for its supremacy over manual computation.
The computer industry began in the late forties with a very small initial investment, and has been increasing both in strength and importance. When one looks back with an analytical mind, we can conclude that computer technology keeps on advancing, with remarkable increases in speed, accuracy, and reliability. Computing in whatever field, be it science, business, or industry, is reaching directly or indirectly into various aspects of our society and has shrunk the world into such compactness that no part can afford to lag behind or live in isolation.
The advent of the electronic computer was hailed by the world as a great revolution; like any industrial revolution, it promised to free man from simple routine jobs of a repetitive nature by providing computing power. The first to exploit this facility was the pursuit of scientific enquiry. This was not unconnected with the fact that the designers of these systems were scientists themselves. Several problems had been eluding satisfactory solutions; numerical methods of solution existed, but the work involved was too huge to be accomplished by hand. Not only did the computer make reliable solutions possible, it equally opened up new application methods and areas.
To quote but a few: the optimization techniques of Operations Research (OR), the awe-inspiring field of space research, molecular restructuring in biochemistry, and so on. It would be tedious, incomplete, and inaccurate to try to list all the possible applications of the computer. There is no limit to the uses and applications of the computer; hence there is hardly a branch of science that can resist the computer's invasion.
After science came business. Because of the natural conservatism, oppressive and exploitative nature of this class, coupled with their cautious approach to things, they did not deem the computing machine fit until its worth and capabilities had been proved. What came to be recognized and embraced by this class was the importance of the computer for decision making and data processing; for these reasons, elaborate mathematical tools like Operations Research (OR), the Critical Path Method (CPM), and the Program Evaluation and Review Technique (PERT), which were developed in the early fifties, attained respectable heights in the eyes of this class. Common examples of data processing are payroll, accounting, inventory management, banking, airline seat reservation, and so on. These require a lot of input and output and relatively little computing; hence the costs of computing in such areas tend to be closely bound to inputting the data and outputting the results. Some applications, like airline seat reservation, banking, and inventory control, call for real-time systems, which are dedicated to a particular application. They furnish "immediate" responses to input signals. For example, it is easier today for one to know one's statement of account in a bank the moment the signals for such a request are sent to the computer. Similarly, debited or credited accounts are updated almost immediately for further transactions.

The Need for Computer
Most of our national projects could be better accomplished with the use of computers. Consider the registration of voters for elections, the common entrance examination into Nigerian secondary schools, and the conducting of head counts (censuses), to mention only a few. These involve a huge volume of data and would naturally be unwise to accomplish manually, as it would be tiring, inaccurate, and full of errors. Going through the history and evolution of computers, the search for a realistic head count by the United States census bureau in 1890 led to the development of an electro-mechanical machine that helped greatly during the census by cutting down the man-hours required for processing the census data. The accuracy of the computer cannot be over-emphasized, and it conforms to the objective of using the computer. A cashier in a departmental store would definitely find life boring if all calculations, issuance of receipts, and giving of change had to be done manually for each customer after each transaction. But with the aid of an adding machine, the job is done accurately and he feels relieved. Going from the adding machine to the computer itself, the ability of the computer to perform repetitive tasks makes things easier. Once the computer is programmed with adequate software, the whole job is done with ease. The need for the computer in our banking system cannot be underrated because of the huge volume of transactions and the accuracy desired. Similarly, in the data processing environment, where the bulk of the job is sorting, merging files, updating information, searching for a particular key in a pile of data, and so on, all these the computer does with ease and accuracy.
In a developing economy like ours, the speed of the computer is again one of the distinguishing factors that make it inevitable. From the saying that "time is money", speed is equally synonymous with time, and since computers work at phenomenal speed, coupled with their ability to access records or information directly from remote locations, efforts should be directed towards introducing the computer into every facet of human endeavour.

Computer and Unemployment
The extent to which the computer has come to permeate all levels of our society is immeasurable. In fact, it does not matter what you plan to do for a living; you will encounter the computer. Its impact is analogous to that of the automobile and the television. Things could be quite different with computers, but they have become part of our society. The automobile is largely responsible for the air pollution and congestion of our cities today. The television has been accused of literally "rotting our minds". Technology is usually a bit of a mixed blessing, and the computer is no exception. There is a popular slogan and fear that the computer causes unemployment. From a professional point of view, this is untrue. The problem, therefore, is the ability to distinguish between unemployment and job displacement. With a thorough distinction between the two, the computer should be seen as a saviour from slavery.
Before considering the question of displacement and unemployment, it might be necessary once again to define the computer. The computer, as earlier defined, is an electronic device or machine which accepts data and follows some set of logical instructions to produce the needed results. Therefore we have to ask ourselves: can this so-called computer operate without the full assistance of a human being? Can we just go to the computer with a complaint and have our problems solved without having to call on some pre-written programs, meant for specific assignments and written by programmers, of course? The answer to these questions is NO. It is true that behind every successful man there is a woman; likewise, behind the successful operation of computers there is a brain (the human). A computer on its own is just an empty box, or junk, and can in no way do any intelligent job; but once programs are written, the jobs are done. Therefore, if a computer is given a job with the logical steps to follow, it can do it better, faster, and more accurately than human beings, and these are some of its advantages.
Consider developed countries like Japan, the USA, the UK, and so on, where industries are filled with robots. The word robot means labour. Robots are not human beings but machines that can be programmed to carry out complex and tedious tasks without getting bored or tired. Robots are blind, deaf, and have no sense of touch. Therefore, jobs that are hazardous or tedious, and that might otherwise be left undone, are done by these robots. Consider an assembly plant where the only thing a man does is take the assembled goods out of the plant to give the robots the chance to start assembling another. What a miracle it is to know that if a robot breaks down in the course of its duty, almost immediately a fellow robot (a "doctor" robot) will attend to it to put it back to work. Think of the most dangerous tasks, which for the love of our dear lives we cannot do; these robots do them. Ample examples are blast furnaces, disaster areas (caused by poisonous gas), marshy areas, and so on.
Of course, robots do these jobs without thinking of any relations or parents and, in fact, at a faster speed, enhancing productivity. How does the computer displace people, and who are those displaced? As earlier mentioned, the computer can only do a routine job and cannot think in any form. Now, in some well-structured organizations such as UAC, SCOA, and Leventis, there are skilled personnel and unskilled workers or clerks. The daily jobs of these unskilled workers are mere routine jobs like accounting procedures, loading, and assembling goods, which can be taken over by computers that will do them better, faster, and in fact more reliably, thereby enhancing productivity. The labourers so displaced by the computer can be moved to areas such as sales, since more articles are produced; these workers should go out soliciting for markets. The managers, engineers, and technicians cannot be displaced, because they are skilled in their jobs and do real thinking. Therefore, to complement the greater efficiency and productivity of the computer, these skilled workers should be kept in an air-conditioned office to think of things yet to happen. No wonder the United States of America, seeing nothing left on earth, proceeded to exploit outer space. With these productive forces in operation, the prices of goods and the workers' conditions of service would be improved. Recently, these big companies embarked on agriculture; the unskilled workers could be better utilized there, and more of them are needed for such jobs. In Africa, there are popular terms like laziness, redundancy, unproductiveness, and so on. These arise because there are no challenges to face. What we do mostly are routine jobs, which can be boring, thereby creating unhappiness, and in such a situation we become less productive. The routine jobs should be given to computers, while a conducive atmosphere is created for the skilled workers to think of ways of making the continent self-reliant.

Computer and Job Creation
Let us consider a particular case study of an information system, the effect of the computer, and how it helps create jobs. An information system means the collection and processing of data to yield useful information for decision making. To collect data, enumerators are needed to actually go to the field for data collection; typists are needed too, and other people are employed in the course of recording the data; finally, statisticians and computer operators use the computer to process the data to give useful results (information). Again, think of the case of our consultancy services. Any company going into a venture like agricultural business needs a consultant having in his service agricultural scientists, soil scientists, and a host of other professionals in allied disciplines to perform the feasibility studies. They give useful information to customers as regards the type of crops to plant, the planting seasons, the fertilizers to apply and when, the type of pests attracted by such plants, and so on; all these lead to greater productivity, and this is one of the things to which the attention of our professionals should be directed. Come to think of the perennial problems of cancer, AIDS, and some other deadly diseases to which we have no solutions: our scientists and medical personnel should use most of their time addressing their minds, through intensive research, towards providing a remedy for such ills of society, while the computer is left to do their routine jobs for them.
History has it that the early jobs where computers were employed were in accounting, payroll, ledgers, and the like, all of which had fixed procedures or routines; hence it was easier to computerize these systems to enhance productivity and save workers from boredom. However, it must be recognized that the computer requires fewer personnel, can produce more, and is not likely to go on maternity leave, go on strike, or demand overtime or an old-age pension. The use of the computer also creates more jobs, such as consultancy and the developing and selling of software and hardware, while the displaced workers could be trained as salesmen to market the goods or moved to areas like agriculture where they can serve humanity. Therefore the computer will lead to a re-arrangement of the organizational set-up, and this will lead to greater productivity.

Conclusions
The computer offers innumerable benefits in enriching the quality and quantity of goods and services in any organisation. Despite the prevalence of the computer in virtually every aspect of human endeavour, it has not been widely integrated into the production and distribution processes in Nigeria. Its integration will not only revolutionize the economy; it will engender the development of individuals' innate scientific inquiring minds and their critical thinking abilities. COMPUTER PROVIDES LONGER LIFE, REDUCES WORKING HOURS AND GREATER REMUNERATION IN RETURN.

Sunday, January 5, 2020

How to promote yourself in different Social Media Sites.

As a continuation of the previous post, there are also social media websites which can be very useful for you. Promoting yourself on social media sites like Facebook.com, Twitter.com, Answers.yahoo.com, and others is another good way of gaining more orders. Register yourself on the top social media sites. The most appropriate sites for you, where you have a better chance of meeting targeted visitors, are the following: answers.yahoo.com, freelancer.com, craigslist.org, facebook.com. To read how you should proceed on each of these websites, log in and go to the News section. Best Regards, ThePesnters.com team