Simulation of Genetic Systems by Automatic Digital Computers IV. Selection Between Alleles at a Sex-Linked Locus

1958 ◽  
Vol 11 (4) ◽  
pp. 613 ◽  
Author(s):  
JSF Barker

A programme simulating selection between two alleles at a sex-linked locus has been developed for an automatic digital computer (the SILLIAC). It introduces selection and chance effects at four stages of the life cycle.
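
Barker's four-stage SILLIAC programme itself is not reproduced here, but a minimal modern sketch of stochastic selection at a sex-linked locus, with hemizygous males and diploid females, might look as follows. The fitness values, population sizes, and single selection stage are illustrative assumptions, not the original code:

```python
import random

N_FEMALES, N_MALES, GENERATIONS = 100, 100, 50
W_FEMALE = {('A', 'A'): 1.0, ('A', 'a'): 1.0, ('a', 'a'): 0.8}  # genotype fitnesses (assumed)
W_MALE = {'A': 1.0, 'a': 0.8}                                   # hemizygous male fitnesses

females = [('A', 'a')] * N_FEMALES                    # each female carries two X-linked alleles
males = ['A'] * (N_MALES // 2) + ['a'] * (N_MALES // 2)   # each male carries one

for _ in range(GENERATIONS):
    # Viability selection with chance effects: survival is a Bernoulli trial.
    females = [f for f in females if random.random() < W_FEMALE[tuple(sorted(f))]]
    males = [m for m in males if random.random() < W_MALE[m]]
    # Random mating: a daughter gets one maternal allele (random segregation)
    # plus the paternal allele; a son gets a maternal allele only.
    new_females = [(random.choice(random.choice(females)), random.choice(males))
                   for _ in range(N_FEMALES)]
    new_males = [random.choice(random.choice(females)) for _ in range(N_MALES)]
    females, males = new_females, new_males

alleles = [a for f in females for a in f] + males
print("frequency of allele a:", alleles.count('a') / len(alleles))
```

Because a male carries only one X chromosome, a recessive deleterious allele is exposed to selection in every male that carries it, which is what makes the sex-linked case differ dynamically from the autosomal one.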

Author(s):  
Alan Turing

The lecture ‘Can Digital Computers Think?’ was broadcast on BBC Radio on 15 May 1951, and was repeated on 3 July of that year. (Sara Turing relates that Turing did not listen to the first broadcast but did ‘pluck up courage’ to listen to the repeat.) Turing’s was the second lecture in a series with the general title ‘Automatic Calculating Machines’. Other speakers in the series included Newman, D. R. Hartree, M. V. Wilkes, and F. C. Williams. Turing’s principal aim in this lecture is to defend his view that ‘it is not altogether unreasonable to describe digital computers as brains’, and he argues for the proposition that ‘If any machine can appropriately be described as a brain, then any digital computer can be so described’. The lecture casts light upon Turing’s attitude towards talk of machines thinking. In Chapter 11 he says that in his view the question ‘Can machines think?’ is ‘too meaningless to deserve discussion’ (p. 449). However, in the present chapter he makes liberal use of such phrases as ‘programm[ing] a machine . . . to think’ and ‘the attempt to make a thinking machine’. In one passage, Turing says (p. 485): ‘our main problem [is] how to programme a machine to imitate a brain, or as we might say more briefly, if less accurately, to think.’ He shows the same willingness to discuss the question ‘Can machines think?’ in Chapter 14. Turing’s view is that a machine which imitates the intellectual behaviour of a human brain can itself appropriately be described as a brain or as thinking. In Chapter 14, Turing emphasizes that it is only the intellectual behaviour of the brain that need be considered (pp. 494–5): ‘To take an extreme case, we are not interested in the fact that the brain has the consistency of cold porridge. We don’t want to say ‘‘This machine’s quite hard, so it isn’t a brain, and so it can’t think.’’ ’ It is, of course, the ability of the machine to imitate the intellectual behaviour of a human brain that is examined in the Turing test (Chapter 11).


1968 ◽  
Vol 12 ◽  
pp. 391-403 ◽  
Author(s):  
Hung-Chi Chao

Abstract The texture of sheet metal is best described by means of pole figures, which are very expensive and time-consuming to prepare. About 8 to 12 hours of effort by a specially trained and highly skilled technician are needed to prepare each pole figure. Accordingly, pole figures are not used as extensively in research studies as they would be if they could be obtained more easily.

A method has been developed for automatically producing pole figures by printing results directly from a digital computer. This method does not require the use of additional plotting attachments and is therefore less expensive and time-consuming than other methods. With this method, any laboratory with access to a digital computer can produce pole figures automatically.

X-ray diffraction intensities are recorded on punched tape or on punched cards and are fed into the digital computer. A computer program corrects X-ray data obtained by either transmission or reflection X-ray techniques, maps the stereographic projection, and prints pole figures directly. The time required to prepare an accurate pole figure is reduced from 8 to 12 hours to 20 minutes or less, depending on the type of digital computer used.
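
As a rough illustration of the printing approach described, producing a pole figure directly amounts to projecting each pole stereographically onto the unit disc, binning intensities on a character grid, and printing it. The sketch below uses a synthetic intensity model and an assumed grid size and character ramp; it is not the author's program:

```python
import math

GRID = 25                     # size of the printed character grid (assumed)
RAMP = " .:-=+*#%@"           # low-to-high intensity characters

def stereographic(alpha_deg, beta_deg):
    """Project a pole at tilt alpha, azimuth beta onto the unit disc."""
    r = math.tan(math.radians(alpha_deg) / 2.0)
    b = math.radians(beta_deg)
    return r * math.cos(b), r * math.sin(b)

# Synthetic corrected intensities, one (alpha, beta, intensity) triple per
# measured pole position; a real run would read these from tape or cards.
data = [(a, b, math.exp(-((a - 30) ** 2) / 200.0))
        for a in range(0, 90, 5) for b in range(0, 360, 10)]

grid = [[0.0] * GRID for _ in range(GRID)]
for alpha, beta, intensity in data:
    x, y = stereographic(alpha, beta)
    col = int((x + 1) / 2 * (GRID - 1))        # bin onto the character grid
    row = int((y + 1) / 2 * (GRID - 1))
    grid[row][col] = max(grid[row][col], intensity)

peak = max(max(row) for row in grid) or 1.0
for row in grid:                               # "print" the pole figure
    print("".join(RAMP[int(v / peak * (len(RAMP) - 1))] for v in row))
```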


1957 ◽  
Vol 10 (4) ◽  
pp. 484 ◽  
Author(s):  
AS Fraser

Methods of setting automatic digital computers to simulate the algebraic aspects of reproduction, segregation, and selection are discussed. The application of these methods to the problem of the importance of linkage in multifactorial inheritance is illustrated by results from the SILLIAC.
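
A minimal sketch of the segregation step such a simulation needs might represent a chromosome as a bit sequence and build a gamete locus by locus, switching strands with an assumed recombination fraction; tallying gametes from a two-locus double heterozygote recovers the expected recombinant frequency. None of this is the SILLIAC implementation:

```python
import random
from collections import Counter

def gamete(parent, recomb_fraction):
    """Build one gamete from a diploid parent (a pair of chromosomes)."""
    chrom = random.randrange(2)                  # which strand we start on
    alleles = []
    for locus in range(len(parent[0])):
        if locus > 0 and random.random() < recomb_fraction:
            chrom = 1 - chrom                    # crossover: switch strands
        alleles.append(parent[chrom][locus])
    return tuple(alleles)

parent = ((1, 1), (0, 0))                        # two loci, coupling phase: AB/ab
tally = Counter(gamete(parent, 0.1) for _ in range(100_000))
print(tally)   # parentals (1,1),(0,0) near 45% each; recombinants near 5% each
```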


1961 ◽  
Vol 1 (03) ◽  
pp. 184-194
Author(s):  
Walter L. Henson ◽  
Paul L. Wearden ◽  
John D. Rice

Abstract Solutions to the unsteady-state partial water-drive reservoir performance problem can be obtained through the use of analogue computers or high-speed electronic digital computers. The solutions previously developed for digital computers, however, require knowledge of the aquifer parameters. Generally, the analogue computers now in use do not require this knowledge. A solution is presented herein in which the aquifer performance, expressed in terms of difference equations, is related to the reservoir performance as expressed by the modified Schilthuis material-balance equation. A numerical procedure for a medium-sized digital computer also is presented in which a solution to the set of equations defining the aquifer and reservoir performance is obtained and the aquifer parameters (permeability and sand thickness) are automatically optimized while simultaneously matching the known pressure behavior. Predicted pressure behavior can be calculated using rates from any assumed future production practice. The procedure provides an output format which presents the cumulative, incremental, and average rate of natural water influx, the per cent gas-cap expansion, the calculated reservoir pressure, the measured reservoir pressure, and the difference between the measured and calculated pressures. Results of a test problem are presented in comparison with results obtained by the Bruce Analyzer, Ohio Oil Co.'s Pace General Purpose Analogue Computer, and Sun Oil Co.'s Single-Pool Electronic Reservoir Analyzer. These results indicate that unsteady-state reservoir performance for a single-pool system can be adequately simulated by a numerical method employing a digital computer and that the special-purpose analogue computer can be supplanted by this method.
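
As a toy version of the history-matching idea, the sketch below marches a difference-equation pressure model forward in time and exhaustively searches two lumped aquifer parameters for the best fit to measured pressures. The influx model is a crude Schilthuis-style simplification and every number is assumed; it stands in for, rather than reproduces, the paper's equations:

```python
import itertools

P_INIT = 3000.0                                        # initial pressure, psi (assumed)
measured = [3000.0, 2950.0, 2915.0, 2890.0, 2872.0]    # psi, one value per time step
withdrawals = [0.0, 100.0, 100.0, 100.0, 100.0]        # voidage per step (scaled units)

def simulate(influx_const, expansion_const):
    """March pressure forward one step at a time (a simple difference equation)."""
    pressures = [P_INIT]
    for q in withdrawals[1:]:
        p = pressures[-1]
        influx = influx_const * (P_INIT - p)     # water influx driven by drawdown
        net = q - influx                         # voidage not replaced by influx
        pressures.append(p - net / expansion_const)
    return pressures

def mismatch(params):
    return sum((m - s) ** 2 for m, s in zip(measured, simulate(*params)))

# "Automatic optimization" in miniature: exhaustive search over a parameter grid.
best = min(itertools.product([0.1, 0.5, 1.0, 2.0], [1.0, 2.0, 5.0, 10.0]),
           key=mismatch)
print("best (influx const, expansion const):", best)
print("matched pressures:", [round(p, 1) for p in simulate(*best)])
```

The actual procedure solves the aquifer difference equations directly and optimizes permeability and sand thickness; the loop structure of simulate, compare, and adjust is the part this sketch shares.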


1957 ◽  
Vol 10 (4) ◽  
pp. 492 ◽  
Author(s):  
AS Fraser

Rates of progress of single populations under selection pressure have been simulated by an automatic electronic computer. Varying intensities of selection and tightness of linkage are compared, showing that linkage produces no qualitative effect on the rates of advance at values greater than 0.005, i.e. 0.5 per cent recombination.
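
A toy comparison of that kind, truncation selection on an additive polygenic trait under tight versus free linkage, might look like the following; population size, locus count, and selection intensity are all assumed, and the segregation scheme is the same bit-list one sketched earlier:

```python
import random

def gamete(parent, recomb):
    strand = random.randrange(2)
    out = []
    for i in range(len(parent[0])):
        if i > 0 and random.random() < recomb:
            strand = 1 - strand                  # crossover between adjacent loci
        out.append(parent[strand][i])
    return out

def evolve(recomb, loci=20, n=100, generations=30):
    pop = [([random.randrange(2) for _ in range(loci)],
            [random.randrange(2) for _ in range(loci)]) for _ in range(n)]
    for _ in range(generations):
        # Truncation selection on the additive phenotype (count of '1' alleles).
        pop.sort(key=lambda ind: sum(ind[0]) + sum(ind[1]), reverse=True)
        parents = pop[: n // 2]
        pop = [(gamete(random.choice(parents), recomb),
                gamete(random.choice(parents), recomb)) for _ in range(n)]
    return sum(sum(a) + sum(b) for a, b in pop) / n

print("mean phenotype after selection, r = 0.005:", evolve(0.005))
print("mean phenotype after selection, r = 0.5:  ", evolve(0.5))
```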


Author(s):  
A.V. Lobanov ◽  
I.V. Asharina

The paper deals with the organization of recovery processes after admissible failures and faults in a failure- and fault-tolerant multitask distributed multi-machine system with a network structure, performing a set of target functions specified by external users. The system is characterized by parallel execution of a set of interacting target tasks on separate computer subsystems, which are organized sets of digital computers. The specified level of failure and fault tolerance of a task is provided by its replication, i.e. parallel execution of copies of the task on several of the system's computers, with exchange of results and selection of the correct one. The study introduces the characteristics, construction principles, and features of the systems considered, and their "philosophical" essence from the point of view of failure and fault tolerance. Within the research, we determined the factors that complicate the design of failure- and fault-tolerant systems of this class. The most general model of malicious computer failure is adopted, in which a computer's behaviour can be arbitrary, can differ with respect to the other computers interacting with it, and can even be malicious. We focus on the part of the problem concerned with organizing dynamic redundancy in the developed system. The problem arises after an admissible set of faults is detected in some complex (or some set of F complexes) by each of the fault-free digital computers of each such complex, and each such fault is also synchronously and consistently identified, by place of origin and by type, as a software failure of a certain digital computer of that complex. This part of the problem is solved by restoring all necessary information in the digital computer identified as being in a state of software failure; the information is transmitted to this digital computer from the fault-free digital computers of its complex. The list of instructions required for such a recovery, as well as the actions of the complex during the recovery process, is determined.
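
A minimal sketch of the replication-and-voting core of such a scheme, with results exchanged, the correct value chosen by majority, and the state of a diverging machine restored from a fault-free copy, might look like the following. The class and function names and the one-shot fault model are illustrative assumptions; the paper's synchronous, consistent fault identification across a complex is far more involved:

```python
from collections import Counter

class Machine:
    def __init__(self, name, broken=False):
        self.name, self.broken, self.state = name, broken, {"x": 0}

    def execute(self, task):
        result = task(self.state)
        return -result if self.broken else result   # a Byzantine-style wrong answer

def run_replicated(task, machines):
    """Run one replicated task: exchange results, vote, recover divergers."""
    results = {m.name: m.execute(task) for m in machines}
    majority, _ = Counter(results.values()).most_common(1)[0]
    faulty = [m for m in machines if results[m.name] != majority]
    healthy = [m for m in machines if results[m.name] == majority]
    for m in faulty:
        m.state = dict(healthy[0].state)   # recovery: copy state from a fault-free machine
    return majority, [m.name for m in faulty]

machines = [Machine("m1"), Machine("m2", broken=True), Machine("m3")]
value, flagged = run_replicated(lambda s: s["x"] + 1, machines)
print(value, flagged)   # -> 1 ['m2']
```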


1960 ◽  
Vol 13 (3) ◽  
pp. 344 ◽  
Author(s):  
AS Fraser

Simulation by Monte Carlo methods of the effect of selection against phenotypic extremes has shown that selection can produce a degree of genetic canalization which is more restrictive than that indicated by the limits of selection, showing that canalization of a rigid degree can be caused by loose selection.
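
The simplest ingredient of that result can be illustrated in a few lines: stabilizing selection that culls only the phenotypic extremes steadily erodes variance. The sketch below assumes a haploid additive model with free recombination and arbitrary culling thresholds; the canalization analysis proper involved considerably more than this:

```python
import random
import statistics

N, LOCI, GENERATIONS = 200, 20, 60          # all assumed

def phenotype(genome):
    return sum(genome)                      # additive trait: count of '1' alleles

pop = [[random.randrange(2) for _ in range(LOCI)] for _ in range(N)]
for _ in range(GENERATIONS):
    pop.sort(key=phenotype)
    survivors = pop[N // 10 : -(N // 10)]   # cull the two phenotypic extremes
    pop = []
    for _ in range(N):
        mum, dad = random.sample(survivors, 2)
        # haploid model, free recombination: each locus from either parent
        pop.append([random.choice(pair) for pair in zip(mum, dad)])

print("phenotypic variance after stabilizing selection:",
      round(statistics.pvariance(phenotype(g) for g in pop), 2))
```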


Author(s):  
Robert Stufflebeam

Turing's analysis of the concept of computation is indisputably the foundation of computationalism, which is, in turn, the foundation of cognitive science. What is disputed is whether computationalism is explanatorily bankrupt. For Turing, all computers are digital computers and something becomes a (digital) computer just in case its 'behavior' is interpreted as implementing, executing, or satisfying some (mathematical) function 'f'. As 'computer' names a nonnatural kind, almost everyone agrees that a computational interpretation of this sort is necessary for something to be a computer. But because everything in the universe satisfies at least one (mathematical) function, it is the sufficiency of such interpretations that is the problem. If, as anticomputationalists are fond of pointing out, computationalists are wedded to the view that a computational interpretation is sufficient for something to be a computer, then everything becomes a digital computer. This not only renders computer-talk vacuous, it strips computationalism of any empirical or explanatory import. My aim is to defend computationalism against charges that it is explanatorily bankrupt. I reexamine several fundamental questions about computers. One effect of this computation-related soul-searching will be a framework within which 'Is the brain a computer?' will be meaningful. Another effect will be a fracture in the supposed link between computationalism and symbolic-digital processing.


Geophysics ◽  
1954 ◽  
Vol 19 (2) ◽  
pp. 255-269 ◽  
Author(s):  
Stephen M. Simpson

The fitting of an nth-order polynomial in x and y to gravity data by least squares is discussed. A consideration of the normal equations for the general case shows certain simplifications resulting from rectangularity in data distribution. Some sample residual maps are constructed. Density plotting, made possible by the digital computer, is described and illustrated. It is shown that this process can serve as a substitute for contouring when only a qualitative picture is desired.
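
A modern equivalent of the fit is a polynomial design matrix solved by least squares; the sketch below fits a second-order surface to synthetic data on a rectangular grid (the data, noise model, and polynomial order are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = np.meshgrid(np.arange(10.0), np.arange(10.0))   # rectangular station grid
x, y = x.ravel(), y.ravel()
# Synthetic "gravity" values: a smooth regional trend plus noise (assumed).
gravity = 5 + 0.3 * x - 0.2 * y + 0.01 * x * y + rng.normal(0, 0.05, x.size)

# Design matrix for a second-order polynomial in x and y.
A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
coeffs, *_ = np.linalg.lstsq(A, gravity, rcond=None)

residuals = gravity - A @ coeffs                        # values for a residual map
print("coefficients:", np.round(coeffs, 3))
print("RMS residual:", round(float(np.sqrt(np.mean(residuals ** 2))), 4))
```

With a rectangular, evenly spaced grid, many of the cross sums in the normal equations vanish by symmetry, which is the simplification the abstract refers to; np.linalg.lstsq sidesteps the normal equations entirely but yields the same fitted surface.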

