Interfacial behavior between particle and matrix in a composite rubber material

2011 ◽  
Vol 20 (1-3) ◽  
pp. 29-33
Author(s):  
Bartolomeo Trentadue

Electronic holography moiré is applied to the measurement of the interfacial deformation field between an embedded particle and its surrounding matrix in a non-homogeneous rubber material. A tensile specimen is subjected to creep loading. The loads are applied in steps, and measurements are carried out at equal intervals of time. The final output, provided by the Holo-Moiré Strain Analyzer, gives the principal strains and their directions in the region of observation. A definition of adhesion as an experimentally measurable quantity, evaluated through contour integrals, is introduced.

2015 ◽  
Vol 2015 ◽  
pp. 1-17 ◽  
Author(s):  
Armando Maestro ◽  
Eva Santini ◽  
Dominika Zabiegaj ◽  
Sara Llamas ◽  
Francesca Ravera ◽  
...  

We report here a review of particle-laden interfaces. We discuss the importance of particle wettability, quantified by the contact angle, in the attachment of particles to a fluid interface, and how the contact angle is strongly affected by several physicochemical parameters. The different mechanisms of interfacial assembly are also addressed; adsorption and spreading are the most widely used processes, leading to the well-known adsorbed and spread layers, respectively. The steps involved in the adsorption of particles and particle–surfactant mixtures from the bulk to the interface are also discussed. We also include the different equations of state proposed so far to explain the interfacial behavior of nanoparticles. Finally, we discuss the mechanical properties of interfacial particle layers via dilatational and shear rheology. In that section we emphasize the importance of shear rheology for probing the intrinsic morphology of such particulate systems and for understanding how the flow-field-dependent evolution of the interfacial morphology might eventually affect properties of materials such as foams and emulsions. The last section is dedicated to explaining the importance of particulate interfacial systems in the stabilization of foams and emulsions.


Author(s):  
Barbara H. Fried

The essays collected in this book take stock of the nonconsequentialist project over the past fifty years, in two key areas. The first part focuses on the moral “duty not to harm” others. Under a suitably broad definition of harm, that duty encompasses most of the restrictions imposed on individual conduct in the secular, liberal state. It examines how that duty has been cashed out in ostensibly nonaggregative terms in the principal strains of nonconsequentialist thought: tragic choices (trolleyology), libertarian property rights, corrective justice in tort law, and Scanlonian contractualism. Nonconsequentialists have not only failed to articulate a viable alternative to aggregation in this domain; they are doomed to fail, because in a world of scarcity (in the broadest sense) and epistemic uncertainty, everything we do poses some risk of harm to others’ fundamental interests, a conflict that can be resolved only through aggregation. The second part examines the treatment of distributive justice in nonconsequentialist political theory over the past fifty years, focusing on Nozickian libertarianism, Rawlsianism, left-libertarianism, and social contractarianism. It argues that whatever the moral attractiveness of the various distributive schemes proposed, none is logically entailed by the normative premises from which it is ostensibly derived. Unlike the argument in the first part, this is not an argument for consequentialism by logical elimination. Societal wealth need not be, and almost never is, distributed to optimize consequences. Rather, it underscores the relatively weak justifications that have been offered for some very strong conclusions.


PeerJ ◽  
2019 ◽  
Vol 7 ◽  
pp. e6657 ◽  
Author(s):  
Beatriz García-Jiménez ◽  
Mark D. Wilkinson

Analysis of microbiome dynamics would allow elucidation of patterns within microbial community evolution under a variety of biologically or economically important circumstances; however, this is currently hampered in part by the lack of rigorous, formal, yet generally-applicable approaches to discerning distinct configurations of complex microbial populations. Clustering approaches to define microbiome “community state-types” at population scale are widely used, though not yet standardized. Similarly, distinct variations within a state-type are well documented, but there is no rigorous approach to discriminating these more subtle variations in community structure. Finally, intra-individual variations with even fewer differences will likely be found in, for example, longitudinal data, and will correlate with important features such as sickness versus health. We propose an automated, generic, objective, domain-independent, and internally-validating procedure to define statistically distinct microbiome states within datasets containing any degree of phylotypic diversity. Robustness of state identification is objectively established by a combination of diverse techniques for stable cluster verification. To demonstrate the efficacy of our approach in detecting discrete states even in datasets containing highly similar bacterial communities, and to demonstrate the broad applicability of our method, we reuse eight distinct longitudinal microbiome datasets from a variety of ecological niches and species. We also demonstrate our algorithm’s flexibility by providing it distinct taxa subsets as clustering input, demonstrating that it operates on filtered or unfiltered data, and at a range of different taxonomic levels. The final output is a set of robustly defined states which can then be used as general biomarkers for a wide variety of downstream purposes such as association with disease, monitoring response to intervention, or identifying optimally performant populations.
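The abstract does not describe the authors' actual algorithm; as a loose illustration of the general idea only (plain k-means on toy data, with every name and value hypothetical), the following sketch clusters relative-abundance profiles into candidate "states" and applies a crude stability check: the partition should be reproduced across random restarts before the states are trusted.

```python
import math
import random

def dist(a, b):
    """Euclidean distance between two abundance vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's k-means; returns one cluster label per sample."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each sample to its nearest centroid
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: dist(p, centroids[c]))
        # recompute each centroid as the mean of its members
        for c in range(k):
            members = [points[i] for i in range(len(points)) if labels[i] == c]
            if members:
                centroids[c] = [sum(v) / len(members) for v in zip(*members)]
    return labels

def partition(labels):
    """Represent a labeling as a set of index sets (label-permutation invariant)."""
    groups = {}
    for i, lab in enumerate(labels):
        groups.setdefault(lab, set()).add(i)
    return {frozenset(g) for g in groups.values()}

# Toy relative-abundance profiles: two obviously distinct "community states".
samples = [
    [0.90, 0.10], [0.85, 0.15], [0.88, 0.12],  # state A: taxon 1 dominant
    [0.10, 0.90], [0.15, 0.85], [0.12, 0.88],  # state B: taxon 2 dominant
]

states = partition(kmeans(samples, k=2))
# Crude stability check: the same partition should emerge from every restart.
stable = all(partition(kmeans(samples, k=2, seed=s)) == states for s in range(1, 6))
```

In practice the paper combines several verification techniques rather than a single restart check, but the restart comparison conveys what "internally-validating" means: a state assignment that changes under re-initialization is not a robust biomarker.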


1966 ◽  
Vol 24 ◽  
pp. 3-5
Author(s):  
W. W. Morgan

1. The definition of “normal” stars in spectral classification changes with time; at the time of the publication of the Yerkes Spectral Atlas the term “normal” was applied to stars whose spectra could be fitted smoothly into a two-dimensional array. Thus, at that time, weak-lined spectra (RR Lyrae and HD 140283) would have been considered peculiar. At the present time we would tend to classify such spectra as “normal”—in a more complicated classification scheme which would have a parameter varying with metallic-line intensity within a specific spectral subdivision.


1975 ◽  
Vol 26 ◽  
pp. 21-26

An ideal definition of a reference coordinate system should meet the following general requirements:
1. It should be as conceptually simple as possible, so its philosophy is well understood by the users.
2. It should imply as few physical assumptions as possible. Wherever they are necessary, such assumptions should be of a very general character and, in particular, they should not be dependent upon detailed astronomical and geophysical theories.
3. It should suggest a materialization that is dynamically stable and is accessible to observations with the required accuracy.


1979 ◽  
Vol 46 ◽  
pp. 125-149 ◽  
Author(s):  
David A. Allen

No paper of this nature should begin without a definition of symbiotic stars. It was Paul Merrill who, borrowing on his botanical background, coined the term symbiotic to describe apparently single stellar systems which combine the TiO absorption of M giants (temperature regime ≲ 3500 K) with He II emission (temperature regime ≳ 100,000 K). He and Milton Humason had in 1932 first drawn attention to three such stars: AX Per, CI Cyg and RW Hya. At the conclusion of the Mount Wilson Hα emission survey nearly a dozen had been identified, and Z And had become their type star. The numbers slowly grew, as much because the definition widened to include lower-excitation specimens as because new examples of the original type were found. In 1970 Wackerling listed 30; this was the last compendium of symbiotic stars published.


Author(s):  
K. T. Tokuyasu

During the past investigations of immunoferritin localization of intracellular antigens in ultrathin frozen sections, we found that the degree of negative staining required to delineate ultrastructural details was often too dense for the recognition of ferritin particles. The quality of positive staining of ultrathin frozen sections, on the other hand, has generally been far inferior to that attainable in conventional plastic embedded sections, particularly in the definition of membranes. As we discussed before, a main cause of this difficulty seemed to be the vulnerability of frozen sections to the damaging effects of air-water surface tension at the time of drying of the sections. Indeed, we found that the quality of positive staining is greatly improved when positively stained frozen sections are protected against the effects of surface tension by embedding them in thin layers of mechanically stable materials at the time of drying (unpublished).


Author(s):  
W. A. Shannon ◽  
M. A. Matlib

Numerous studies have dealt with the cytochemical localization of cytochrome oxidase via cytochrome c. More recent studies have dealt with indicating initial foci of this reaction by altering incubation pH (1) or postosmication procedure (2,3). The following study is an attempt to locate such foci by altering membrane permeability. It is thought that such alterations, within the limits of maintaining morphological integrity of the membranes, will ease the entry of exogenous substrates, resulting in a much quicker oxidation and subsequently a more precise definition of the oxidative reaction. The diaminobenzidine (DAB) method of Seligman et al. (4) was used. Minced pieces of rat liver were incubated for 1 hr following toluene treatment (5,6). Experimental variations consisted of incubating fixed or unfixed tissues treated with toluene, and unfixed tissues treated with toluene and subsequently fixed.


Author(s):  
J. D. Hutchison

When the transmission electron microscope was commercially introduced a few years ago, it was heralded as one of the most significant aids to medical research of the century. It continues to occupy that niche; however, the scanning electron microscope is gaining rapidly in relative importance as it fills the gap between conventional optical microscopy and transmission electron microscopy. IBM Boulder is conducting three major programs in cooperation with the Colorado School of Medicine. These are the study of the mechanism of failure of the prosthetic heart valve, the study of the ultrastructure of lung tissue, and the definition of the function of the cilia of the ventricular ependyma of the brain.


Author(s):  
P. M. Lowrie ◽  
W. S. Tyler

The importance of examining stained 1 to 2 μ plastic sections by light microscopy has long been recognized, both for increased definition of many histologic features and for selection of specimen samples to be used in ultrastructural studies. Selection of specimens with specific orientation relative to anatomical structures becomes of critical importance in ultrastructural investigations of organs such as the lung. The quantity of blocks necessary to locate special areas of interest by random sampling is large, however, and the method is lacking in precision. Several methods have been described for selection of specific areas for electron microscopy using light microscopic evaluation of paraffin, epoxy-infiltrated, or epoxy-embedded large blocks from which thick sections were cut. Selected areas from these thick sections were subsequently removed and re-embedded or attached to blank precast blocks and resectioned for transmission electron microscopy (TEM).
