A Canadian viewpoint on data, information and uncertainty in the context of prediction in ungauged basins

2012 ◽  
Vol 44 (3) ◽  
pp. 419-429 ◽  
Author(s):  
C. Spence ◽  
D. H. Burn ◽  
B. Davison ◽  
D. Hutchinson ◽  
T. B. M. J. Ouarda ◽  
...  

The quality (i.e. the degree of uncertainty that results from the interpretation and analysis) of information dictates its value for decision making. There has been much progress towards improving information on the water budgets of ungauged basins by improving knowledge, tools and techniques during the Prediction in Ungauged Basins (PUB) initiative. These improvements, at least in Canada, have come through efforts in both hydrological process research and statistical hydrology. This paper reviews some recent Canadian PUB efforts to use data to generate information and reduce uncertainty about the hydrological regimes of ungauged basins. The focus is on the Canadian context and the problems it presents, but the lessons learned are applicable to other countries with similar challenges. With a large land mass that is relatively poorly gauged, novel approaches have had to be developed to extract the most information from the available data. It can be difficult in Canada to find gauged or research basins sufficiently similar to ungauged sites of interest that also contain the data required to force either statistical or deterministic models. Many statistical studies, using innovative regression-based approaches and pooled frequency analysis, have improved information on ungauged basin streamflow regimes, or at least an understanding of the quality of that information. Hydrological process research has reduced knowledge uncertainty, particularly in regard to cold regions processes, and this has led to the development of new algorithms that are reducing predictive uncertainty. There remains much to do. Current progress has created an opportunity to better integrate statistical and deterministic models via data assimilation of regionalization model estimates and those from coupled atmospheric-hydrological models. Aspects of such a modelling system could also provide more robust uncertainty analyses than traditional approaches.
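As an illustration of the regression-based regionalization the abstract describes, the sketch below fits a log-linear model relating streamflow to basin attributes at gauged sites and transfers it to an ungauged site. The attribute names and all numbers are hypothetical, invented for illustration; they are not from the paper.

```python
import numpy as np

# Hypothetical attributes for five gauged basins:
# drainage area (km^2) and mean annual precipitation (mm).
area = np.array([250.0, 1200.0, 480.0, 3100.0, 90.0])
precip = np.array([450.0, 600.0, 520.0, 700.0, 400.0])
# Observed mean annual flow (m^3/s) at those gauges.
flow = np.array([1.8, 14.5, 4.1, 45.0, 0.5])

# Fit a log-linear model: log Q = b0 + b1*log(A) + b2*log(P).
X = np.column_stack([np.ones(len(area)), np.log(area), np.log(precip)])
coeffs, *_ = np.linalg.lstsq(X, np.log(flow), rcond=None)

# Transfer the fitted relationship to an ungauged basin
# whose physiographic attributes are known.
ungauged = np.array([1.0, np.log(800.0), np.log(550.0)])
q_est = np.exp(ungauged @ coeffs)
print(f"Estimated mean annual flow: {q_est:.2f} m^3/s")
```

In practice such regressions are built from many more gauges and attributes, and the residual spread provides the uncertainty estimate the abstract emphasizes.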

Author(s):  
Nimini Wickramasinghe ◽  
Rajeev K. Bali

In a dynamic and complex global environment, traditional approaches to healthcare delivery are becoming more and more inadequate. To address this, von Lubitz and Wickramasinghe (2006e) proffered the need for a network-centric approach that allows free and rapid sharing of information and the effective knowledge building required for the development of coherent objectives and their rapid attainment. However, to realize this vision it is essential to have rich theory and robust approaches for analysing the levels of complexity of modern healthcare delivery. This paper discusses how this might be done by drawing upon the rich analysis tools and techniques of Social Network Analysis combined with Actor Network Theory.
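As a minimal illustration of the Social Network Analysis toolkit the paper draws on, the sketch below computes degree centrality for a toy care-delivery network; the actors and ties are invented for illustration, not taken from the paper.

```python
# Toy care-delivery network: each edge links two actors who
# exchange information directly.
edges = [
    ("clinician", "nurse"), ("clinician", "pharmacist"),
    ("nurse", "patient"), ("pharmacist", "patient"),
    ("clinician", "specialist"),
]

# Degree centrality: a node's tie count divided by the maximum
# possible number of ties (n - 1).
nodes = {n for e in edges for n in e}
degree = {n: 0 for n in nodes}
for a, b in edges:
    degree[a] += 1
    degree[b] += 1
centrality = {n: d / (len(nodes) - 1) for n, d in degree.items()}
print(max(centrality, key=centrality.get))  # the best-connected actor
```

Centrality scores like these are one way SNA quantifies which actors dominate information flow, a first step towards the complexity analysis the paper argues for.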


Author(s):  
Paul White

Purpose – This paper aims to address the increasingly low levels of staff morale found in workplaces and the challenges managers face in addressing them. Employees tend to view employee recognition programs cynically; the reasons for these reactions are explained, along with the negative results which follow. The concept of authentic appreciation is discussed, the core components necessary for employees to feel truly valued are identified, and practical steps that can be taken are outlined. Design/methodology/approach – The paper reports lessons learned through the author’s experiences of applying the concepts to workplaces over the past several years. Findings – Job satisfaction and employee engagement are declining in spite of the proliferation of employee recognition programs. Employees perceive much employee recognition activity as disingenuous, leading to apathy and sarcasm. There are structural issues that need to be corrected for employee recognition to be perceived as authentic – making recognition less generic, more individualized and communicated regularly in the manner that is valued by the recipients. Practical implications – Traditional approaches to employee recognition (awards and rewards) need to be re-evaluated. Continuing these activities may actually increase the negativity within a work environment. Learning what each individual employee values and then communicating appreciation to them in ways that are perceived as authentic is critical to achieving a positive result. Originality/value – The paper challenges the current (and growing) trend of impersonal employee recognition programs and examines the factors that contribute to recognition being perceived as inauthentic. The author then provides an alternative approach and methodology that facilitates the ability to communicate authentic appreciation.


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Shreeranga Bhat ◽  
E.V. Gijo ◽  
Anil Melwyn Rego ◽  
Vinayambika S. Bhat

Purpose – The aim of the article is to ascertain the challenges, lessons learned and managerial implications in the deployment of Lean Six Sigma (LSS) competitiveness to micro, small and medium enterprises (MSMEs) in India, and to establish doctrines to strengthen the initiatives of the government. Design/methodology/approach – The research adopts the action research methodology to develop a case study, carried out in the printing industry in a Tier III city using the LSS DMAIC (Define-Measure-Analyze-Improve-Control) approach. It utilizes LSS tools to deploy the strategy and to unearth the challenges and success factors in improving the printing process of a specific batch of a product. Findings – The root cause of the critical-to-quality (CTQ) characteristic, turn-around time (TAT), is determined and the solutions are deployed through a scientifically proven, data-based approach. As a result of this study, the TAT was reduced from an average of 1541.2 min to 1303.36 min, which in turn improved the sigma level from 0.55 to 2.96, a noteworthy triumph for this MSME. The company realizes annual savings of USD 12,000 due to the success of this project. Top management leadership, data-based validation, technical know-how and an industrial engineering knowledge base are identified as critical success factors (CSFs), while profitability and on-time delivery are the key performance indicators (KPIs) for the MSME. Eventually, the lessons learned and implications indicate that LSS competitiveness can be treated as quality management standards (QMS) and quality tools and techniques (QTT) to ensure competitive advantage, sustainable green practices and growth. Research limitations/implications – Even though the findings and recommendations of this research are based on a single case study, it is worth noting that the case study was executed in a Tier III city with novice users of LSS tools and techniques. This indicates the applicability of LSS in MSMEs and thus, the modality adopted can be further refined to suit the socio-cultural aspects of India. Originality/value – This article illustrates the deployment of LSS from the perspective of novice users, to assist MSMEs and policymakers in reinforcing competitiveness through LSS. Moreover, the government can initiate a scheme in line with LSS competitiveness to complement existing schemes based on the findings of the case study.
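Sigma levels like those quoted in the abstract are conventionally derived from a defect rate (defects per million opportunities, DPMO) together with the customary 1.5-sigma long-term shift. The sketch below shows that standard conversion; the defect counts are hypothetical, not the study's actual data.

```python
from statistics import NormalDist

def sigma_level(defects: int, opportunities: int) -> float:
    """Long-term sigma level using the conventional 1.5-sigma shift."""
    dpmo = defects / opportunities * 1_000_000
    process_yield = 1 - dpmo / 1_000_000
    # z-score of the yield plus the standard 1.5-sigma shift.
    return NormalDist().inv_cdf(process_yield) + 1.5

# Hypothetical counts: 72 defective (late) jobs out of 1,000 opportunities.
print(round(sigma_level(72, 1000), 2))
```

A process at roughly 7% defects sits near sigma level 3, while a near-defect-free process (1 DPMO) sits above 6, which is why the reported jump from 0.55 to 2.96 represents a substantial improvement.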


Author(s):  
Caroline Howard ◽  
Richard Discenza ◽  
Murray Turoff

Colleges and universities around the country are scrambling to keep pace with innovations in technology to engage a generation of students who come to campus with laptops, camera cell phones, and the knowledge and skills to use Google. Some professors make course websites available while others podcast lectures, but these are often considered experimental. Many of these tools and techniques aim to revolutionize the learning process; however, many faculty and students worry that these advances merely distract from the material and from time-tested methods of teaching. Since no one understands the full impact of these teaching tools or their long-range effectiveness, for now, colleges and universities are engaged in a beta test to determine how technologies will co-exist with or replace the traditional approaches. The challenge for each innovation is that it must be carefully measured against the successes of the traditional approaches.


Author(s):  
Larbi Esmahi ◽  
Kristian Williamson ◽  
Elarbi Badidi

Fuzzy logic became the core of a different approach to computing. Whereas traditional approaches to computing were precise, or hard-edged, fuzzy logic allowed for the possibility of a less precise, or softer, approach (Klir et al., 1995, pp. 212-242). An approach where precision is not paramount is not only closer to the way humans think, but may in fact be easier to create as well (Jin, 2000). Thus was born the field of soft computing (Zadeh, 1994). Other techniques were added to this field, such as artificial neural networks (ANNs) and genetic algorithms, both modeled on biological systems. Soon it was realized that these tools could be combined; by mixing them together, they could cover their respective weaknesses while at the same time generating something greater than the sum of its parts, in short, creating synergy. Adaptive neuro-fuzzy is perhaps the most prominent of these admixtures of soft computing technologies (Mitra et al., 2000). The technique was first created when artificial neural networks were modified to work with fuzzy logic, hence the neuro-fuzzy name (Jang et al., 1997, pp. 1-7). This combination provides fuzzy systems with adaptability and the ability to learn. It was later shown that adaptive fuzzy systems could be created with other soft computing techniques, such as genetic algorithms (Yen et al., 1998, pp. 469-490), rough sets (Pal et al., 2003; Jensen et al., 2004; Ang et al., 2005) and Bayesian networks (Muller et al., 1995), but the neuro-fuzzy name was already in wide use, so it stayed. In this chapter we use the most widely adopted terminology in the field. Neuro-fuzzy is a blanket description of a wide variety of tools and techniques used to combine any aspect of fuzzy logic with any aspect of artificial neural networks. For the most part, these combinations are just extensions of one technology or the other. For example, neural networks usually take binary inputs but use weights that vary in value from 0 to 1. Adding fuzzy sets to an ANN to convert a range of input values into values that can be used as weights is considered a neuro-fuzzy solution. This chapter pays particular attention to the sub-field where the fuzzy logic rules are modified by the adaptive aspect of the system. The remainder of this chapter is organized as follows: in Section 1 we examine models and techniques used to combine fuzzy logic and neural networks into neuro-fuzzy systems; Section 2 provides an overview of the main steps involved in the development of adaptive neuro-fuzzy systems; Section 3 concludes the chapter with some recommendations and future developments.
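A fuzzification layer of the kind just described can be sketched as follows. The Gaussian membership functions and the temperature example are illustrative assumptions, not drawn from the chapter.

```python
import math

def gaussian_membership(x: float, center: float, width: float) -> float:
    """Fuzzy membership degree in [0, 1] for a crisp input x."""
    return math.exp(-((x - center) / width) ** 2)

# Fuzzification layer: map a crisp temperature reading onto three
# fuzzy sets; the resulting degrees can feed a network as weights.
temperature = 22.0
memberships = {
    "cold": gaussian_membership(temperature, 5.0, 8.0),
    "mild": gaussian_membership(temperature, 20.0, 8.0),
    "hot": gaussian_membership(temperature, 35.0, 8.0),
}
# In an adaptive neuro-fuzzy system, the centers and widths above are
# exactly the parameters that training adjusts.
print(max(memberships, key=memberships.get))
```

Because each membership function is differentiable, the same gradient-based learning used for neural network weights can tune the fuzzy sets, which is the adaptability the chapter highlights.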


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Ajay Noronha ◽  
Shreeranga Bhat ◽  
E.V. Gijo ◽  
Jiju Antony ◽  
Suma Bhat

Purpose – The article evaluates the obstacles, lessons learned and managerial implications of deploying Lean Six Sigma (LSS) in a dental college hospital in India. Design/methodology/approach – The work adopts the action research (AR) methodology to establish a case study, which is carried out using the LSS define–measure–analyze–improve–control (DMAIC) approach in a dental college. It uses LSS tools to enhance the productivity and performance of the Conservative Dentistry Department of a dental college and to unravel the obstacles and success factors in applying it to the education and healthcare sectors together. Findings – The root cause of high turn-around time (TAT) is ascertained using LSS tools and techniques. The effective deployment of solutions to the root causes of variation helped the dental college reduce the TAT of the Conservative Dentistry process from an average of 63.9 min to 36.5 min (a 42.9% improvement), and the process standard deviation (SD) was reduced from 2.63 min to 2 min. This, in turn, raised the sigma level from 0.48 to 3.23, a noteworthy success story for this dental college. Research limitations/implications – While the results and recommendations of this research are based on a single case study, it is to be noted that the case study was carried out with new users of LSS tools and techniques, especially with the assistance of interns. This indicates the applicability of LSS in dental colleges; thus, the adopted modality can be further refined to fit India's education and hospital sectors together. Originality/value – This article explains the implementation of LSS from an aspiring user's viewpoint to assist dental colleges and policymakers in improving competitiveness. In addition, the medical education sector can introduce an LSS course in existing programmes to leverage the potential of this methodology to bring synergy and collaborative research between data-based thinking and the medical field based on the findings of this study. The most important contribution of this article is the illustration of design of experiments (DOE) in the dental college process.
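A minimal sketch of the design-of-experiments idea mentioned above: a two-level, two-factor full factorial with main-effects estimation. The factors, coding and responses are invented for illustration, not the study's data.

```python
# Two-level full factorial: factors A and B coded -1/+1,
# response = turn-around time in minutes (illustrative values).
runs = [
    (-1, -1, 60.0),
    (+1, -1, 48.0),
    (-1, +1, 55.0),
    (+1, +1, 41.0),
]

def main_effect(runs, idx):
    """Average response at the factor's high level minus at its low level."""
    high = [r[2] for r in runs if r[idx] == +1]
    low = [r[2] for r in runs if r[idx] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

print(main_effect(runs, 0))  # effect of factor A on TAT
print(main_effect(runs, 1))  # effect of factor B on TAT
```

A negative main effect means moving that factor to its high level reduces the turn-around time, which is how DOE points the improvement phase of DMAIC at the settings worth changing.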


2015 ◽  
Vol 1 (1) ◽  
pp. 170-178
Author(s):  
Stelian Brad ◽  
Emilia Brad

Designing courses for emerging areas of study presents clear challenges. If the envisaged courses are paid for directly by the students – as is the case in this research – the students are effectively the customers whose requirements have to be satisfied. Traditional approaches to collecting student requirements are not feasible for building up very novel topics. For such cases, this paper introduces an approach for course unit design that respects the lean philosophy. Lean is about creating more value for students with fewer resources, or maximizing value while minimizing waste. The approach is based on the paradigm that, in highly dynamic and strongly competitive educational markets, top-quality courses must be designed from the very early stages. A hypothesis-based process defines the “content-prototype” of the course, which is then tested via web-based surveys directed to potential students. Results are statistically interpreted and a refined course content is formulated. The prototype for the most delicate module of the course is elaborated to test the level of delight of potential students (also called the WOW effect). Lessons learned are then considered to design the “promoter-prototype” of the course. A focus group is then used to test whether potential students feel a special experience interacting with the course content (also known as the KANDO effect). The methodology was applied to design a master course unit on digital entrepreneurship. Empirical research reveals the viability of the methodology for extracting the appropriate topics of a course in emerging areas of study. The research also shows that a well-piloted strategy for course delivery should be in place to achieve the desired market impact.


2018 ◽  
Vol 2 (3) ◽  
pp. 11
Author(s):  
Asmaa Taher Sallam ◽  
Ali Fathi Eid ◽  
Ali Foad Elfaramawy ◽  
Laila M. Khodier

Knowledge is considered one of the most effective assets controlling the success of organizations, and its effective management is crucial. Although knowledge has existed and has been used throughout all projects, the way it was managed was largely intuitive and highly reliant on in-house systems. As a consequence, knowledge management (KM) was introduced in the late 1990s to help companies create, share, and systematically use knowledge. Knowledge management can be defined as the identification, optimization, and active management of intellectual assets that create value, increase productivity, and gain and sustain competitive advantage. Construction, as one of the most complicated fields, is a project-based field in which investments run into the millions every year. Although knowledge in construction is among the main factors for project success, most of this knowledge lies in the minds of people, which makes it hard to capture and store. Accordingly, effective knowledge management in construction is affected by different factors, including the willingness of people to share their knowledge and the mobilization of the workforce from one project to another without sharing lessons learned and previous knowledge. Here lies the role of the application of KM, which could help prevent “reinventing the wheel” in construction. This paper aims at offering a comprehensive overview of the application of KM in construction through a review of the extant literature. Topics discussed include factors affecting KM, KM tools and techniques, the processes of KM, and the main benefits of and challenges facing KM. There are many factors affecting knowledge management and many tools and techniques for managing knowledge. The findings of this paper take the form of an analysis of the main benefits of and challenges facing the application of KM in construction.


Author(s):  
Pat Armstrong ◽  
Ruth Lowndes

The final chapter identifies some critical lessons learned during an eight-year project. Many in the team had worked on large grants and/or on ethnographic studies. Developing a new version of ethnography, however, required creative teamwork. So did moving beyond narrower forms of interdisciplinary and international research and more traditional approaches to mentoring in order to ensure collective, consultative, reflexive, and continuous knowledge creation and sharing. This chapter argues that such creative teamwork depends on building relationships and on organizing meetings that are intellectually stimulating and move the research forward. Those meetings must also be fun. Creative teamwork also requires significant preparation for the site visits, especially when those visits are intense and involve highly vulnerable populations. It means mentoring through sharing the entire research process in egalitarian ways. Finally, it means thinking about what happens to the team and the data after the funding ends.


Author(s):  
Iain Morrison ◽  
Bryn Lewis ◽  
Sony Nugrahanto

The aim of increasing the quality of healthcare has led to the development of a number of ‘guideline’ systems whereby clinicians receive assistance with decision making in a given care context – for example, in areas such as prescribing or therapeutics. These guidelines range in complexity and functionality from simple textual references through to executable modules which can subsume some of the clinical decision-making process. In the latter case, ensuring consistent and interoperable engagement between the guideline engine, clinical information system and patient record can become problematic. Critical areas include vocabulary and terminology (in differing use contexts) and the interfaces and interaction between different sub-systems, where traditional approaches have focused on tight coupling of sub-systems and on the generation of special-purpose ‘glue’ languages and logic. In this paper, we briefly describe an approach to clinical, information and service modelling. This approach uses tools and techniques gaining increasing acceptance in the e-Commerce domain, which shares many of the technical and interoperability problems present in e-Health.

