Automated data‐intensive forecasting of plant phenology throughout the United States

2019 ◽  
Vol 30 (1) ◽  
Author(s):  
Shawn D. Taylor ◽  
Ethan P. White

Abstract
Phenology - the timing of cyclical and seasonal natural phenomena such as flowering and leaf out - is an integral part of ecological systems, with impacts on human activities like environmental management, tourism, and agriculture. As a result, there are numerous potential applications for actionable predictions of when phenological events will occur. However, despite the availability of phenological data with large spatial, temporal, and taxonomic extents, and numerous phenology models, there have been no automated species-level forecasts of plant phenology. This is due in part to the challenges of building a system that integrates large volumes of climate observations and forecasts, uses those data to fit models and make predictions for large numbers of species, and consistently disseminates the results of these forecasts in interpretable ways. Here we describe a new near-term phenology forecasting system that makes predictions for the timing of budburst, flowers, ripe fruit, and fall colors for 78 species across the United States up to six months in advance and is updated every four days. We use the lessons learned in developing this system to provide guidance on developing large-scale near-term ecological forecasting systems more generally, to help advance the use of automated forecasting in ecology.


2019 ◽  
Vol 33 (4) ◽  
pp. 166-180
Author(s):  
Fatih Demir ◽  
So Mi Kim ◽  
Neeley Current ◽  
Isa Jahnke

Schools in the United States are undergoing reforms that involve a data-intensive improvement process known as strategic improvement plans (SIPs). This process requires digital systems to set goals, create interventions, use and analyse student data, and monitor and report on SIPs. A challenge for such digital systems is integrating a highly diverse set of data sources and identifying the users engaged with these processes. This study explored how teachers and principals currently carry out SIPs. We applied the sociotechnical walkthrough, a qualitative method that combines a modelling notation with focus group interviews, to understand the current workflows, technology use, and interactions of teachers and principals developing SIPs. The results illustrate a variety of existing activities, indicate how a newly developed technology may have an impact, and propose design recommendations to fill a gap in managing SIPs.


2017 ◽  
Vol 1 (4) ◽  
pp. 1-9 ◽  
Author(s):  
Ying Ding ◽  
Kyle Stirling

Abstract
In the current data-intensive era, the traditional hands-on method of conducting scientific research by exploring related publications to generate a testable hypothesis is well on its way to becoming obsolete within just a year or two. Analyzing the literature and data to automatically generate a hypothesis might become the de facto approach informing the core research efforts of those trying to master the exponentially rapid expansion of publications and datasets. Here, viewpoints are provided and discussed to aid understanding of the challenges of data-driven discovery.

The Panama Canal, the 77-kilometer waterway connecting the Atlantic and Pacific oceans, has played a crucial role in international trade for more than a century. However, digging the Panama Canal was an exceedingly challenging process. A French effort in the late 19th century was abandoned because of equipment issues and a significant loss of labor to tropical diseases transmitted by mosquitoes. The United States officially took control of the project in 1902 and replaced the unusable French equipment with new construction equipment designed for a much larger and faster scale of work. Colonel William C. Gorgas was appointed chief sanitation officer and charged with eliminating mosquito-spread illnesses. After overcoming these and additional trials and tribulations, the Canal successfully opened on August 15, 1914. The triumphant completion of the Panama Canal demonstrates that using the right tools and eliminating significant threats are critical steps in any project.

More than 100 years later, a paradigm shift is occurring as we move into a data-centered era. Today, data are extremely rich but overwhelming, and extracting information out of data requires not only the right tools and methods but also awareness of major threats.
In this data-intensive era, the traditional method of exploring the related publications and available datasets from previous experiments to arrive at a testable hypothesis is becoming obsolete. Consider the fact that a new article is published every 30 seconds (Jinha, 2010). For the common disease of diabetes alone, roughly 500,000 articles have been published to date; even a scientist who reads 20 papers per day would need 68 years to wade through all the material. The standard method simply cannot cope with the large volume of documents or the exponential growth of datasets. A major threat is that the canon of domain knowledge cannot be consumed and held in human memory. Without efficient methods to process information, and without a way to eliminate the fundamental threat of limited memory and time for handling the data deluge, we may find ourselves facing failure as the French did on the Isthmus of Panama more than a century ago.

Scouring the literature and data to generate a hypothesis might become the de facto approach informing the core research efforts of those trying to master the exponentially rapid expansion of publications and datasets (Evans & Foster, 2011). In reality, most scholars have never been able to keep completely up to date with publications and datasets, given the unending increase in the quantity and diversity of research within their own areas of focus - let alone in related conceptual areas, where knowledge may be segregated by syntactically impenetrable keyword barriers or an entirely different research corpus.

Research communities in many disciplines are finally recognizing that, with advances in information technology, there need to be new ways to extract entities from increasingly data-intensive publications and to integrate and analyze large-scale datasets.
This provides a compelling opportunity to improve the process of knowledge discovery from the literature and datasets through use of knowledge graphs and an associated framework that integrates scholars, domain knowledge, datasets, workflows, and machines on a scale previously beyond our reach (Ding et al., 2013).
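The reading-backlog figure quoted above (500,000 articles at 20 papers per day yielding roughly 68 years) can be checked with a one-line back-of-envelope calculation; the sketch below simply reproduces that arithmetic, with the inputs taken directly from the abstract.

```python
# Back-of-envelope check of the reading backlog quoted in the abstract:
# ~500,000 diabetes articles, read at 20 papers per day.
articles = 500_000
papers_per_day = 20

days_needed = articles / papers_per_day      # 25,000 days of reading
years_needed = days_needed / 365             # convert to years

print(f"{years_needed:.1f} years")           # about 68 years, matching the text
```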


2021 ◽  
Vol 13 (3) ◽  
pp. 399
Author(s):  
Qiming Zheng ◽  
Hoong Chen Teo ◽  
Lian Pin Koh

Plant phenology is closely related to light availability, as diurnal and seasonal light cycles are essential environmental cues for organizing bio-ecological processes. These natural cycles, however, have been dramatically disrupted by artificial light at night (ALAN) due to recent urbanization. The influence of ALAN on plant phenology, and its spatial variation, remain largely unknown. By analyzing satellite data on ALAN intensity across the United States, we show here that ALAN tended to advance the start date of the growing season (SOS), although the overall response of SOS to ALAN was relatively weak compared with other potential factors (e.g., preseason temperature). The phenological impact of ALAN showed a spatially divergent pattern: ALAN mainly advanced SOS in climatically moderate regions of the United States (e.g., Virginia), while its effect was insignificant or even reversed in very cold (e.g., Minnesota) and hot (e.g., Florida) regions. This divergent pattern was mainly attributable to the high sensitivity of SOS to chilling insufficiency, whereby the advancing effect on SOS was triggered only once chilling days exceeded a certain threshold. Other mechanisms may also play a part, such as the interplay among chilling, forcing, and photoperiod, and differences in species' life strategies. In addition, urban areas and natural ecosystems were found to experience similar magnitudes of influence from ALAN, albeit with a much higher baseline ALAN intensity in urban areas. Our findings shed new light on the phenological impact of ALAN and its relation to space and other environmental cues, supporting a better understanding and projection of phenological change in a warming and urbanizing future.


Author(s):  
A. Hakam ◽  
J.T. Gau ◽  
M.L. Grove ◽  
B.A. Evans ◽  
M. Shuman ◽  
...  

Prostate adenocarcinoma is the most common malignant tumor of men in the United States and is the third leading cause of death in men. Despite attempts at early detection, there will be 244,000 new cases and 44,000 deaths from the disease in the United States in 1995. Therapeutic progress against this disease is hindered by an incomplete understanding of prostate epithelial cell biology, the limited availability of human tissues for in vitro experimentation, slow dissemination of information between prostate cancer research teams, and increasing pressure to "stretch" research dollars at the same time that staff reductions are occurring.

To meet these challenges, we have used correlative microscopy (CM) and client/server (C/S) computing to increase productivity while decreasing costs. Critical elements of our program are as follows:

1) Establishing the Western Pennsylvania Genitourinary (GU) Tissue Bank, which includes >100 prostates from patients with prostate adenocarcinoma as well as >20 normal prostates from transplant organ donors.


Author(s):  
Vinod K. Berry ◽  
Xiao Zhang

In recent years it became apparent that we needed to improve productivity and efficiency in the Microscopy Laboratories at GE Plastics. We realized that digital image acquisition, archiving, processing, analysis, and transmission over a network would be the best way to achieve this goal, and that the resulting capabilities for quantitative image analysis, image transmission, and related tasks would help increase our efficiency. Although the advantages of digital image acquisition, processing, and archiving have been described and are practiced in many SEM laboratories, they have not been generally applied across microscopy laboratories (TEM, optical, SEM, and others), and their impact on productivity has not yet been fully exploited.

To attain our objective, we acquired a SEMICAPS imaging workstation for each of the GE Plastics sites in the United States. We integrated the workstation with the microscopes and their peripherals, as shown in Figure 1.


2001 ◽  
Vol 15 (01) ◽  
pp. 53-87 ◽  
Author(s):  
Andrew Rehfeld

Every ten years, the United States "constructs" itself politically. On a decennial basis, U.S. Congressional districts are quite literally drawn, physically constructing political representation in the House of Representatives on the basis of where one lives. Why does the United States do it this way? What justifies domicile as the sole criterion of constituency construction? These are the questions raised in this article. Contrary to many contemporary understandings of representation at the founding, I argue that there were no principled reasons for using domicile as the method of organizing political representation. Even in 1787, the Congressional district was expected to be far too large to map onto existing communities of interest. Instead, territory should be understood as forming a habit of mind for the founders, even while it was necessary to achieve other democratic aims of representative government.

