Computational Thinking in der Musikwissenschaft: Jupyter Notebook als Umgebung für Lehre und Forschung

2020 ◽  
Author(s):  
Uwe Seifert ◽  
Sebastian Klaßmann ◽  
Timo Varelmann ◽  
Nils Dahmen

We show that, in connection with the digitalization of musicology, a special kind of mathematical and logical thinking is needed: computational thinking/literacy. Computational thinking is characterized by effective procedures, whereas computational literacy includes the implementation of these procedures on machines, i.e. programming. Both are at the core of formalization, model building and computer simulation. Furthermore, we point out that "computation" as a central concept for the sciences of the 21st century, and its use in cognitive science and the computational sciences, make it necessary to reassess the basic assumptions underlying musicological research as a science of mind (Geisteswissenschaft). We propose a digital habitat to integrate computational thinking/literacy into musicology and to become acquainted with model building and computer simulation. Jupyter Notebook provides a basis for such a digital habitat. We describe our use of Jupyter Notebook as a teaching environment for computational thinking/literacy.
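As a brief illustration of the kind of "effective procedure" meant here (a minimal sketch in the spirit of notebook-based exercises; the melody and function name are illustrative and not taken from the authors' materials), a student might implement and run the following cell by cell in Jupyter:

```python
def interval_sequence(midi_pitches):
    """Return the semitone intervals between successive notes of a melody.

    This is an effective procedure in the abstract sense (computational
    thinking); typing it into a notebook and executing it is its
    implementation on a machine (computational literacy).
    """
    return [b - a for a, b in zip(midi_pitches, midi_pitches[1:])]

# Opening of "Frère Jacques" as MIDI pitch numbers (C4 = 60).
melody = [60, 62, 64, 60]
print(interval_sequence(melody))  # → [2, 2, -4]
```

The same procedure can then be varied interactively, e.g. to compare interval profiles of different melodies, which is where the notebook format supports exploratory teaching.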

Processes ◽  
2020 ◽  
Vol 9 (1) ◽  
pp. 67
Author(s):  
Stefanie Hering ◽  
Nico Schäuble ◽  
Thomas M. Buck ◽  
Brigitta Loretz ◽  
Thomas Rillmann ◽  
...  

Increasing regulatory demands are forcing the pharmaceutical industry to invest its available resources carefully. This is especially challenging for small- and middle-sized companies. Computer simulation software like FlexSim allows one to explore variations in production processes without the need to interrupt the running process. Here, we applied a discrete-event simulation to two approved film-coated tablet production processes. The simulations were performed with FlexSim (FlexSim Deutschland—Ingenieurbüro für Simulationsdienstleistung Ralf Gruber, Kirchlengern, Germany). Process visualization was done using Cmap Tools (Florida Institute for Human and Machine Cognition, Pensacola, FL, USA), and statistical analysis used Minitab® (Minitab GmbH, Munich, Germany). The most critical elements identified during model building were the model logic, the operating schedule, and the processing times. These factors were verified graphically and statistically. To optimize the utilization of employees, three different shift systems were simulated, revealing the advantages of two-shift and one-and-a-half-shift systems over a one-shift system. Without the need to interrupt any currently running production processes, we found that changing the shift system could save 50–53% of the campaign duration and 9–14% of the labor costs. In summary, we demonstrated that FlexSim, which is mainly used in logistics, can also be applied advantageously to modeling and optimizing pharmaceutical production processes.
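The idea behind the shift-system comparison can be sketched in a few lines (a deliberately simplified stand-in, not the authors' FlexSim model; all batch counts and durations below are illustrative assumptions): the same campaign of batches is run under different daily working windows and the resulting campaign durations are compared.

```python
def campaign_days(n_batches, hours_per_batch, shift_hours_per_day):
    """Days needed to process n_batches sequentially when work is only
    possible for shift_hours_per_day hours per day (a batch may span
    a day boundary)."""
    total_hours = n_batches * hours_per_batch
    full_days, remainder = divmod(total_hours, shift_hours_per_day)
    return full_days + (1 if remainder else 0)

# Illustrative campaign: 40 batches at 3 h each.
one_shift = campaign_days(40, 3, shift_hours_per_day=8)    # 15 days
two_shift = campaign_days(40, 3, shift_hours_per_day=16)   # 8 days
print(f"saving: {1 - two_shift / one_shift:.0%}")          # → saving: 47%
```

A discrete-event tool such as FlexSim goes far beyond this by modeling queues, resource contention, and stochastic processing times, but the comparison logic (same workload, different operating schedules) is the same.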


1976 ◽  
Vol 4 (4) ◽  
pp. 281-298 ◽  
Author(s):  
Robert T. Grauer

Recent advances in, and acceptance of, computer simulation methodology make direct experimentation possible for the social scientist. This technique can be used to supplement his traditional tools of experimental design, namely regression analysis and factorial designs. In this paper a unified approach to model building is synthesized from these disparate techniques. The capabilities of each are discussed and then combined into a modeling philosophy which can be applied to a variety of educational problems.


2021 ◽  
Vol 8 (1) ◽  
pp. 49-74
Author(s):  
Mona Emara ◽  
Nicole Hutchins ◽  
Shuchi Grover ◽  
Caitlin Snyder ◽  
Gautam Biswas

The integration of computational modelling in science classrooms provides a unique opportunity to promote key 21st century skills including computational thinking (CT) and collaboration. The open-ended, problem-solving nature of the task requires groups to grapple with the combination of two domains (science and computing) as they collaboratively construct computational models. While this approach has produced significant learning gains for students in both science and CT in K–12 settings, the collaborative learning processes students use, including learner regulation, are not well understood. In this paper, we present a systematic analysis framework that combines natural language processing (NLP) of collaborative dialogue, log file analyses of students’ model-building actions, and final model scores. This analysis is used to better understand students’ regulation of collaborative problem solving (CPS) processes over a series of computational modelling tasks of varying complexity. The results suggest that the computational modelling challenges afford opportunities for students to a) move from resource-intensive processes, such as trial and error, to more systematic processes, such as debugging model errors by leveraging data tools, and b) learn from each other using socially shared regulation (SSR) and productive collaboration. The use of such SSR processes correlated positively with their model-building scores. Our paper aims to advance the understanding of collaborative, computational modelling in K–12 science to better inform classroom applications.


1991 ◽  
Vol 18 (1) ◽  
pp. 30-46 ◽  
Author(s):  
SANDRA T. AZAR

This review assesses progress in the development of causal models of physical child abuse by examining the foundations of current theorizing. The sociopolitical forces and methodological problems that have acted to inhibit model building are highlighted. Existing models are then analyzed using five dimensions important to theory construction: definitions, basic assumptions, levels of analysis, complexity, and forms of antecedent-consequence relationships posited. Progress and conceptual problems in current theorizing are discussed, and guidelines for future theory development are outlined.


2019 ◽  
Vol 0 (0) ◽  
Author(s):  
Ulrike Susanne Pompe-Alama

The purpose of this essay is to summarize and critically evaluate the epistemological and pragmatic questions with regard to computer simulations as a new technological-scientific format as put forth in current philosophical debates. Computer simulation practices are situated in the broader context of model-building practices and experimentation; the scope and limits of knowledge generated by computer simulations are considered.


1980 ◽  
Vol 40 ◽  
pp. X91
Author(s):  
P. Barnes ◽  
J.L. Finney ◽  
B.J. Gellatly ◽  
I.C Golton ◽  
J. Goodfellow

Author(s):  
Kiyomichi Nakai ◽  
Yusuke Isobe ◽  
Chiken Kinoshita ◽  
Kazutoshi Shinohara

Induced spinodal decomposition under electron irradiation in a Ni-Au alloy has been investigated with respect to its basic mechanism and confirmed to be caused by the relaxation of the coherent strain associated with the modulated structure. The modulation of the white dots in high-resolution electron microscopy structure images of the modulated structure is reduced with irradiation. In this paper, the atom arrangement of the modulated structure is confirmed by computer simulation of the structure images, and the relaxation of the coherent strain is concluded to be due to the reduction of the phase modulation.

Structure images of the three-dimensional modulated structure along <100> were taken with the JEM-4000EX high-resolution electron microscope at the HVEM Laboratory, Kyushu University. The transmitted beam and four 200 reflections, with their satellites from the modulated structure in an fcc Ni-30.0at%Au alloy, were used to form structure images under illumination with 400 keV electrons, with a spherical aberration constant of the objective lens Cs = 1 mm, beam divergence α = 3 × 10⁻⁴ rad, underfocus Δf ≃ -50 nm, and specimen thickness t ≃ 15 nm. The CIHRTEM code was used for the simulation of the structure images.

