Fourth (Final) Report on a Test of McDougall's Lamarckian Experiment on the Training of Rats

1954 ◽  
Vol 31 (3) ◽  
pp. 307-321
Author(s):  
W. E. AGAR ◽  
F. H. DRUMMOND ◽  
O. W. TIEGS ◽  
M. M. GUNSON

This is the final report of an experiment of 20 years' duration, in which we have repeated, in its essentials, the well-known experiment of William McDougall purporting to reveal a Lamarckian inheritance of the effects of training on rats. The test is one involving light discrimination, and McDougall recorded a steady improvement in the rate of learning on a succession of 32 generations; but he omitted to check the results against a properly conducted control. Our experiment confirms McDougall to the extent that we too have obtained long duration trends of improvement in learning-rate (Figs. 2, 3); but we find that the effect is not sustained, and that it is, moreover, shown also by a control experiment, using animals of untrained ancestry. This forbids a Lamarckian interpretation. Statistical analysis of the data indicates that the ‘condition’ of the rat markedly affects its speed of learning, and that progressive changes in learning-rate, over a succession of generations, are in reality correlated with the health of the laboratory colony, which is subject to periods of decline and recovery.

2006 ◽  
Vol 58 (3) ◽  
pp. 339-377 ◽  
Author(s):  
Benjamin Valentino ◽  
Paul Huth ◽  
Sarah Croco

Do the international laws of war effectively protect civilian populations from deliberate attack? In a statistical analysis of all interstate wars from 1900 to 2003, the authors find no evidence that signatories of The Hague or Geneva Conventions intentionally kill fewer civilians during war than do nonsignatories. This result holds for democratic signatories and for wars in which both sides are parties to the treaty. Nor do they find evidence that a state's regime type or the existence of ethnic or religious differences between combatants explains the variation in civilian targeting. They find strong support, however, for their theoretical framework, which suggests that combatants seek to kill enemy civilians when they believe that doing so will coerce their adversaries into early surrender or undermine their adversaries' war-related domestic production. The authors find that states fighting wars of attrition or counterinsurgency, states fighting for expansive war aims, and states fighting wars of long duration kill significantly more civilians than states in other kinds of wars.


2020 ◽  
Vol 134 (6) ◽  
pp. 497-500
Author(s):  
O Denton ◽  
A Daglish ◽  
L Smallman ◽  
S Fishpool

Abstract

Objective: Rate of learning is often cited as a deterrent to the use of endoscopic ear surgery. This study investigated the learning curves of novice surgeons performing simulated ear surgery using either an endoscope or a microscope.

Methods: A prospective multi-site clinical research study was conducted. Seventy-two medical students were randomly allocated to the endoscope or microscope group, and performed 10 myringotomy and ventilation tube insertions. Trial times were used to produce learning curves, from which slope (learning rate) and asymptote (optimal proficiency) were ascertained.

Results: There was no significant difference between the learning curves (p = 0.41). The learning rate value was 68.62 for the microscope group and 78.71 for the endoscope group. The optimal proficiency (seconds) was 32.83 for the microscope group and 27.87 for the endoscope group.

Conclusion: The absence of a significant difference shows that the learning rates of the two techniques are statistically indistinguishable. This suggests that surgeons are not justified in citing a 'steep learning curve' in arguments against the use of endoscopes in middle-ear surgery.
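One common way to extract a slope (learning rate) and asymptote (optimal proficiency) from trial times is to fit an exponential learning-curve model. The sketch below is an illustration only: the trial times are hypothetical, not study data, and the paper does not state which model it fitted; the fit uses a grid search over the decay rate with linear least squares for the other two parameters.

```python
import numpy as np

# Exponential learning-curve model (a common choice; the study does not
# state its exact model): time(n) = asymptote + amplitude * exp(-rate * n)
trials = np.arange(1, 11)  # 10 simulated tube insertions, as in the study
# Hypothetical trial times in seconds for one trainee (illustrative only)
times = np.array([95.0, 70, 55, 46, 40, 37, 34, 33, 32, 31])

best = None
for rate in np.linspace(0.05, 1.0, 200):  # grid-search the decay rate
    # For a fixed rate the model is linear in (asymptote, amplitude),
    # so those two parameters come from ordinary least squares.
    basis = np.column_stack([np.ones(len(trials)), np.exp(-rate * trials)])
    coeffs, *_ = np.linalg.lstsq(basis, times, rcond=None)
    sse = np.sum((basis @ coeffs - times) ** 2)
    if best is None or sse < best[0]:
        best = (sse, coeffs[0], coeffs[1], rate)

_, asymptote, amplitude, rate = best
print(f"optimal proficiency ~ {asymptote:.1f} s, decay rate ~ {rate:.2f}")
```

The asymptote estimates the time a trainee plateaus at; the decay rate plays the role of the learning-rate parameter.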


1968 ◽  
Vol 23 (1) ◽  
pp. 3-8 ◽  
Author(s):  
John A. Mills

Scaled familiarity values were obtained for nineteen 24-word passages of connected discourse, using the successive-categories scaling method. The passages ranged from an item on the sporting page of a newspaper to a fifth-order approximation to English. Eight passages, lying at approximately equal intervals along the scale, were set on memory drums. Sixty-four Ss learned them by the serial-anticipation method and relearned them 1 wk. later. Learning rate and extent of recall showed a steady decrease as familiarity with normal English sentences fell, whereas a fall in familiarity with abnormal English passages was accompanied by a rise in both learning rate and extent of recall. It is suggested that the relationship between familiarity scale values and extent of recall is an artifact resulting from the fact that differing degrees of associative strength at the end of learning were not controlled.


Author(s):  
Công Nguyễn Chí

Landslides are a complex geo-hazard that impacts sustainable socio-economic development in mountainous areas. The phenomenon results from a combination of critical natural and artificial conditions involving many factors, such as topographic attributes, infrastructure construction, geology, land cover, and rainfall. Estimating the contribution weights of these factors plays a significant role in disaster management activities. This study focuses on three provinces (Hue, Quang Nam, and Quang Ngai) that are frequently and severely impacted by landslides in the central region of Vietnam. Historical events were investigated by statistical analysis and field survey, with support from GIS, to identify the factors significant to landslide occurrence in the study area. The results illustrate that landslide occurrence increases with the development of human activities and long-duration critical rainfall.


1942 ◽  
Vol 19 (2) ◽  
pp. 158-167
Author(s):  
W. E. AGAR ◽  
F. H. DRUMMOND ◽  
O. W. TIEGS

The experiment, devised to test McDougall's claim that the effect of training in rats is inherited, has been carried out for twenty generations. In addition to the trained line, a control line has been maintained parallel with it, from which a number of rats have been trained in each generation, but not used for breeding. For each generation of the trained line there is therefore a corresponding group of trained control rats for comparison, differing from the rats of the trained line only in that they have no trained ancestry. During the first fifteen or sixteen generations there was a progressive, though irregular, decline in the number of errors made in each generation in both lines. In generation 18 both lines showed a marked increase in the number of errors made, with fluctuations in subsequent generations running closely parallel in the two lines. This parallelism of periodic fluctuations in rate of learning in the two lines makes it impossible to attribute a progressive change in the trained line, when it happens to be in the direction of decreasing number of errors, to the inherited effects of ancestral training. Our experiment is being continued, and therefore our conclusions must be regarded as tentative only. The results of the experiment up to the present, together with those of Crew's experiment, show, however, that the progressive decrease in the number of errors in successive generations of McDougall's experiment, in which no control line was maintained, cannot be held to have established the operation of Lamarckian inheritance.


2017 ◽  
Vol 3 (4) ◽  
pp. 5-16 ◽  
Author(s):  
Александр Боровик ◽  
Aleksandr Borovik ◽  
Антон Жданов ◽  
Anton Zhdanov

This paper is a sequel to earlier papers on time parameters of solar flares in the Hα line. Using data from the International Flare Patrol, an electronic database of solar flares for the period 1972–2010 has been created. The statistical analysis of the duration of the main phase has shown that it increases with increasing flare class and brightness. It has been found that the duration of the main phase depends on the type and features of development of solar flares. Flares with one brilliant point have the shortest main phase; flares with several intensity maxima and two-ribbon flares, the longest one. We have identified more than 3000 cases with an ultra-long duration of the main phase (more than 60 minutes). For 90 % of such flares the duration of the main phase is 2–3 hrs, but sometimes it reaches 12 hrs.


2019 ◽  
Vol 3 (2) ◽  
pp. 422
Author(s):  
Jaya Tata Hardinata ◽  
Harly Okprana ◽  
Agus Perdana Windarto ◽  
Widodo Saputra

Backpropagation is an artificial neural network training algorithm whose architecture and parameters must be determined so that similar, but not identical, inputs produce the correct output. One of the parameters that influences a backpropagation architecture is the learning rate: if the learning rate is too high, the network becomes unstable, whereas if it is too low, the network converges slowly and training takes a long time. The data in this research are secondary data sourced from the UCI Machine Learning repository. The best network architecture in this study is 13-10-3, tested with 21 different learning rates: 0.01, 0.03, 0.06, 0.01, 0.13, 0.16, 0.2, 0.23, 0.026, 0.3, 0.35, 0.4, 0.45, 0.5, 0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.9. From these 21 learning rate values on the 13-10-3 architecture, it is found that the choice of learning rate is very important for obtaining an accurate and fast network. This can be seen in the experiments, where a learning rate of 0.65 produced better accuracy than learning rates smaller than 0.65.
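The trade-off the abstract describes (instability when the learning rate is too high, slow convergence when it is too low) can be seen even on a toy problem. The following is a minimal gradient-descent sketch on a single weight, not the paper's 13-10-3 backpropagation network, and the learning-rate values are illustrative only:

```python
# Minimal gradient-descent sketch: minimize loss = w**2 from w = 5.0.
# The gradient of w**2 is 2*w, so each step is w -= lr * 2 * w.
def train(lr, steps=50):
    w = 5.0
    for _ in range(steps):
        w -= lr * 2 * w
    return abs(w)  # distance from the optimum at w = 0

print(train(1.5))    # too high: the update overshoots and |w| diverges
print(train(0.001))  # too low: after 50 steps, barely moved from 5.0
print(train(0.3))    # moderate: converges rapidly toward 0
```

Each step multiplies the weight by (1 - 2*lr), so rates above 1.0 flip the sign and grow the error, which is the instability the authors observe for overly large learning rates.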


2017 ◽  
Vol 1 (S1) ◽  
pp. 18-19
Author(s):  
Ram Gouripeddi ◽  
Mollie Cummins ◽  
Randy Madsen ◽  
Bernie LaSalle ◽  
Andrew Middleton Redd ◽  
...  

OBJECTIVES/SPECIFIC AIMS: Key factors causing irreproducibility of research include those related to inappropriate study design methodologies and statistical analysis. In modern statistical practice, irreproducibility can arise from statistical issues (false discoveries, p-hacking, overuse/misuse of p-values, low power, poor experimental design) and computational issues (data, code, and software management). Addressing these requires understanding the processes and workflows practiced by an organization, and the development and use of metrics to quantify reproducibility. METHODS/STUDY POPULATION: Within the Foundation of Discovery – Population Health Research, Center for Clinical and Translational Science, University of Utah, we are undertaking a project to streamline the study design and statistical analysis workflows and processes. As a first step, we met with key stakeholders to understand current practices by eliciting example statistical projects, and then developed process information models for different types of statistical needs using Lucidchart. We then reviewed these with the Foundation's leadership and the Standards Committee to arrive at ideal workflows and models, and defined key measurement points (such as those around study design, analysis plan, final report, requirements for quality checks, and double coding) for assessing reproducibility. As next steps, we are using our findings to embed analytical and infrastructural approaches within the statisticians' workflows. These will include data and code dissemination platforms such as Box, Bitbucket, and GitHub; documentation platforms such as Confluence; and workflow tracking platforms such as Jira. These tools will simplify and automate the capture of communications as a statistician works through a project. Data-intensive processes will use process-workflow management platforms such as Activiti, Pegasus, and Taverna.
RESULTS/ANTICIPATED RESULTS: The anticipated results are strategies for sharing and publishing study protocols, data, code, and results across the translational spectrum, combined with active collaboration with the research team, automation of key steps, and decision support. DISCUSSION/SIGNIFICANCE OF IMPACT: This analysis of statistical processes, and of the computational methods to automate them, is intended to ensure the quality of statistical methods and the reproducibility of research.


