WORLDWAVES: High Quality Coastal and Offshore Wave Data Within Minutes for Any Global Site

Author(s):  
Stephen F. Barstow ◽  
Gunnar Mørk ◽  
Lasse Lønseth ◽  
Peter Schjølberg ◽  
Ulla Machado ◽  
...  

There has been growing demand for reliable information on wave conditions, in particular at coastal sites, as a result of the increased use of the coastal zone for a multitude of activities, including various shoreline developments related to transportation, tourism, fish farming and, recently, the wind and wave energy industries. This trend is likely to continue. Reliable data are also needed for the management and protection of these often fragile environments. Many of those concerned with these wave-impacted environments still use antiquated data sources, usually from offshore waters, because in the absence of long-term wave data collected at the site of interest, the calculation of reliable wave statistics at a coastal site is a complicated, time-consuming and expensive undertaking, requiring various data sets to be assembled. WORLDWAVES simplifies and speeds up the modelling of wave conditions in coastal waters by integrating the following under a single Matlab toolbox: high-quality long-term wave data offshore of all global coasts; worldwide bathymetric and coastline data; the SWAN and backward raytracing wave models; sophisticated offshore and nearshore wave statistics toolboxes with tabular and graphical presentations, including a facility to export ASCII time series data at offshore or inshore locations; a geographic module with easy zooming to any area worldwide; tools to set up model grids and to display and edit bathymetry and coastlines; and a facility for the import of user offshore data and the export of inshore time series data. In this paper we describe the design and implementation of WorldWaves, including the fusion of satellite, model and buoy wave and wind data in the global offshore database, and the new raytracing model.

Water ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 416
Author(s):  
Bwalya Malama ◽  
Devin Pritchard-Peterson ◽  
John J. Jasbinsek ◽  
Christopher Surfleet

We report the results of field and laboratory investigations of stream-aquifer interactions in a watershed along the California coast to assess the impact of groundwater pumping for irrigation on stream flows. The methods used include subsurface sediment sampling using direct-push drilling, laboratory permeability and particle size analyses of sediment, piezometer installation and instrumentation, stream discharge and stage monitoring, pumping tests for aquifer characterization, resistivity surveys, and long-term passive monitoring of stream stage and groundwater levels. Spectral analysis of the long-term water level data was used to assess correlation between the stream and groundwater level time series. The investigations revealed the presence of a thin, low-permeability silt-clay aquitard unit between the main aquifer and the stream. This suggested a three-layer conceptual model of the subsurface comprising unconfined and confined aquifers separated by an aquitard. This model was broadly confirmed by resistivity surveys and pumping tests, the latter of which indicated leakage across the aquitard. The aquitard was determined to be 2–3 orders of magnitude less permeable than the aquifer, indicating weak stream-aquifer connectivity; this was confirmed by spectral analysis of the stream and groundwater level time series. The results illustrate the importance of site-specific investigations and suggest that even in systems where the stream is not in direct hydraulic contact with the producing aquifer, long-term stream depletion can occur due to leakage across low-permeability units. This has implications for the management of stream flows, groundwater abstraction, and water resources during prolonged periods of drought.
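The correlation assessment described above can be illustrated with a simple time-domain sketch (a cousin of the spectral approach the authors used): compute the correlation between stream stage and groundwater level at a range of lags and look for the delay at which coupling peaks. The series, noise level, and three-sample delay below are invented for illustration, not the study's data.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def lagged_correlation(stream, gw, max_lag):
    """Correlation of stream stage with groundwater level delayed by each lag."""
    n = len(stream)
    return {lag: pearson(stream[:n - lag], gw[lag:]) for lag in range(max_lag + 1)}

# Synthetic example: groundwater echoes stream stage three samples later.
random.seed(0)
stream = [random.gauss(0.0, 1.0) for _ in range(200)]
gw = [(stream[t - 3] if t >= 3 else 0.0) + 0.1 * random.gauss(0.0, 1.0)
      for t in range(200)]
corr = lagged_correlation(stream, gw, max_lag=6)
best_lag = max(corr, key=lambda k: corr[k])
```

A weakly connected system, such as the aquitard-separated one described above, would instead show low correlation at every lag.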


2007 ◽  
pp. 88
Author(s):  
Wataru Suzuki ◽  
Yanfei Zhou

This article represents a first step in filling a large gap in knowledge concerning why Public Assistance (PA) use recently rose so fast in Japan. Specifically, we address this problem both by performing a Blanchard and Quah decomposition on long-term monthly time series data (1960:04-2006:10) and by estimating prefecture-level longitudinal data. Two interesting findings emerge from the time series analysis. The first is that permanent shocks impose a continuously positive impact on the PA rate and are the main driving factor behind the recent increase in welfare use. The second is that the impact of a temporary shock lasts for a long time: the rate of welfare use is quite rigid, and even when the PA rate rises due to temporary shocks, it takes about 8 or 9 years to regain its normal level. On the other hand, estimations on the prefecture-level longitudinal data indicate that the Financial Capability Index (FCI) of local government and the minimum wage both have negative effects on the PA rate. We also find that the rapid aging of Japan's population acts in practice as a permanent shock, which makes it the most prominent contributor to surging welfare use.
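The persistence finding above (a temporary shock taking 8-9 years to die out) can be sketched with a toy AR(1) impulse response; this is only an illustration of shock persistence, not the Blanchard and Quah decomposition itself, and the monthly persistence value phi = 0.97 is invented to reproduce a recovery time of roughly eight years.

```python
def ar1_impulse_response(phi, horizon):
    """Effect at each month of a one-time unit shock in an AR(1) process."""
    return [phi ** t for t in range(horizon + 1)]

def years_to_recover(phi, threshold=0.05, months_per_year=12):
    """Years until a temporary shock decays below `threshold` of its size."""
    t = 0
    while phi ** t >= threshold:
        t += 1
    return t / months_per_year
```

With phi = 0.97 on monthly data, the shock's effect stays above 5% of its initial size for a little over eight years, matching the rigidity the abstract describes; a permanent shock, by contrast, never decays.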


2017 ◽  
Author(s):  
Easton R White

Long-term time series are necessary to better understand population dynamics, assess species' conservation status, and make management decisions. However, population data are often expensive to collect, requiring substantial time and resources. When is a population time series long enough to address a question of interest? We determine the minimum time series length required to detect significant increases or decreases in population abundance. To address this question, we use simulation methods and examine 878 populations of vertebrate species. Here we show that 15-20 years of continuous monitoring are required to achieve a high level of statistical power. For both the simulations and the empirical time series, the minimum time required depends on trend strength, population variability, and temporal autocorrelation. These results point to the importance of sampling populations over long periods of time. We argue that statistical power needs to be considered in monitoring program design and evaluation. Time series shorter than 15-20 years are likely underpowered and potentially misleading.
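The simulation logic behind this kind of power analysis can be sketched as follows: generate abundance series of a given length with a linear trend plus autocorrelated noise, test the fitted trend for significance, and count the fraction of replicates that detect it. All parameter values (trend, noise, autocorrelation, critical value) are invented for illustration and are not the paper's settings.

```python
import math
import random

def slope_t_stat(y):
    """t-statistic of the OLS slope of y regressed on time 0..n-1."""
    n = len(y)
    x = range(n)
    mx, my = (n - 1) / 2.0, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    a = my - b * mx
    sse = sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y))
    return b / math.sqrt(sse / (n - 2) / sxx)

def trend_power(n_years, trend=0.03, sigma=0.1, phi=0.4, nsim=300, tcrit=2.1):
    """Fraction of simulated AR(1)-noise series whose trend tests significant."""
    random.seed(1)
    hits = 0
    for _ in range(nsim):
        e, y = 0.0, []
        for t in range(n_years):
            e = phi * e + random.gauss(0.0, sigma)  # autocorrelated noise
            y.append(trend * t + e)                 # trend + noise
        if abs(slope_t_stat(y)) > tcrit:
            hits += 1
    return hits / nsim
```

Running `trend_power` across a range of lengths traces out a power curve; with these toy parameters, short series rarely detect the trend while 20-year series almost always do, mirroring the paper's qualitative conclusion.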


Media Ekonomi ◽  
2017 ◽  
Vol 20 (1) ◽  
pp. 83
Author(s):  
Jumadin Lapopo

Poverty is a problem in all developing countries, including Indonesia. Among government programs, poverty has become the center of attention in policy at both the regional and national levels. Looking at the phenomenon of poverty, Islam presents a solution to reduce poverty through zakat. This study aims to analyze the effect of ZIS and Zakat Fitrah on poverty in Indonesia from 1998 to 2010. The data used in this study are secondary time series data; the dependent variable is poverty and the independent variables are ZIS and Zakat Fitrah. The analysis tools used are a multiple regression model and classical assumption tests, using the software Eviews-4. The study concludes that the ZIS variable significantly reduces poverty in Indonesia, although the effect is very small. The Zakat Fitrah variable does not significantly affect poverty reduction in Indonesia, because Zakat Fitrah is by nature for consumption rather than long-term needs. The results of this study can be used by zakat managers to improve their management and to build a better system for the distribution of zakat, so that the main purpose of zakat, reducing poverty, can be achieved.
Keywords: Poverty, Zakat Fitrah, ZIS.
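The multiple regression described above (poverty regressed on ZIS and Zakat Fitrah) has the generic form y = b0 + b1*x1 + b2*x2 + e. A minimal from-scratch sketch of that fit via the normal equations is below; the numbers in the usage example are invented, not the study's Eviews-4 data.

```python
def ols(X, y):
    """OLS coefficients via the normal equations (X'X)b = X'y, with intercept."""
    rows = [[1.0] + list(x) for x in X]   # prepend intercept column
    k = len(rows[0])
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    # Solve the k x k system by Gaussian elimination with partial pivoting.
    M = [row[:] + [b] for row, b in zip(XtX, Xty)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, k):
            f = M[r][c] / M[c][c]
            for j in range(c, k + 1):
                M[r][j] -= f * M[c][j]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (M[i][k] - sum(M[i][j] * beta[j] for j in range(i + 1, k))) / M[i][i]
    return beta  # [intercept, b1, b2, ...]
```

In the study's setting, x1 and x2 would be the ZIS and Zakat Fitrah series and y the poverty measure; the sign and size of b1 and b2 then carry the substantive conclusion.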


2012 ◽  
Vol 2012 ◽  
pp. 1-15 ◽  
Author(s):  
Jia Chaolong ◽  
Xu Weixiang ◽  
Wang Futian ◽  
Wang Hanning

The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data using gray incidence degree models and data transformation methods, seeking the underlying relationships within the time series. GM(1,1) is based on a first-order, single-variable linear differential equation; after adaptive improvement and error correction, it is used to predict the long-term trend of track irregularity at a fixed measuring point. The stochastic linear AR model, Kalman filtering model, and artificial neural network model are applied to predict the short-term trend of track irregularity over a unit section. Both the long-term and short-term predictions show that the model is effective and achieves the expected accuracy.
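The GM(1,1) model named above has a standard textbook form: accumulate the raw series, fit the whitened equation dx1/dt + a*x1 = b by least squares on the accumulated series, then difference the fitted exponential to forecast the raw series. The sketch below is that textbook version with invented data; it omits the paper's adaptive improvement and error-correction steps.

```python
import math

def gm11_forecast(x0, steps):
    """Grey GM(1,1): fit dx1/dt + a*x1 = b on the accumulated series, forecast x0."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]              # accumulated (AGO) series
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    y = x0[1:]
    m = n - 1
    # Least squares for a, b in x0[k] = -a*z[k] + b.
    sz, szz = sum(z), sum(v * v for v in z)
    sy, szy = sum(y), sum(v * w for v, w in zip(z, y))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det        # development coefficient
    b = (szz * sy - sz * szy) / det      # grey input
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    # Difference the fitted accumulated series to get raw-series forecasts.
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]
```

On smooth, near-exponential data (the regime GM(1,1) is designed for) the forecast tracks the growth closely, which is why the paper reserves it for the long-term trend and uses AR, Kalman, and neural models for short-term fluctuations.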


Author(s):  
Shaolong Zeng ◽  
Yiqun Liu ◽  
Junjie Ding ◽  
Danlu Xu

This paper aims to identify the relationship among energy consumption, FDI, and economic development in China from 1993 to 2017, taking Zhejiang as an example. FDI is a main driver of the rapid development of Zhejiang's open economy: it promotes economic development, but also leads to growth in energy consumption. Based on time series data for energy consumption, FDI inflow, and GDP in Zhejiang from 1993 to 2017, we use a vector auto-regression (VAR) model to identify the relationship among energy consumption, FDI, and economic development. The results indicate a long-run equilibrium relationship among them: FDI inflow promotes energy consumption, energy consumption in turn promotes FDI inflow, and FDI promotes economic growth indirectly through energy consumption. Therefore, improving the quality of FDI and energy efficiency has become an inevitable choice for achieving the transition of Zhejiang's economy from high-speed growth to high-quality growth.


2018 ◽  
Vol 7 (4.15) ◽  
pp. 25 ◽  
Author(s):  
Said Jadid Abdulkadir ◽  
Hitham Alhussian ◽  
Muhammad Nazmi ◽  
Asim A Elsheikh

Forecasting time-series data is imperative, especially when planning requires modelling future events under uncertainty. Recurrent neural network models have been applied in industry and outperform standard artificial neural networks in forecasting, but fail in long-term time-series forecasting due to the vanishing gradient problem. This study offers a robust solution for long-term forecasting using a special recurrent neural network architecture, the Long Short-Term Memory (LSTM) model, which overcomes the vanishing gradient problem: LSTM is specially designed so that avoiding the long-term dependency problem is its default behavior. Empirical analysis is performed using quantitative forecasting metrics and comparative model performance on the forecasted outputs. An evaluation analysis validates that the LSTM model provides better forecasts of the Standard & Poor's 500 Index (S&P 500), in terms of error metrics, than other forecasting models.
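The mechanism by which LSTM avoids the vanishing gradient can be seen in a single cell update: the cell state is modified additively through a forget gate rather than repeatedly squashed, so information (and gradient) can persist across many steps. Below is a from-scratch scalar sketch of one LSTM step; the weight layout is a simplification for illustration, not the study's actual model.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h, c, W):
    """One LSTM step with scalar input and state; W maps gate -> (wx, wh, bias)."""
    f = sigmoid(W['f'][0] * x + W['f'][1] * h + W['f'][2])    # forget gate
    i = sigmoid(W['i'][0] * x + W['i'][1] * h + W['i'][2])    # input gate
    o = sigmoid(W['o'][0] * x + W['o'][1] * h + W['o'][2])    # output gate
    g = math.tanh(W['g'][0] * x + W['g'][1] * h + W['g'][2])  # candidate value
    c = f * c + i * g            # additive cell update: the path that lets
    h = o * math.tanh(c)         # gradients survive over long horizons
    return h, c
```

With the forget gate saturated open and the input gate closed, the cell state is carried almost unchanged through hundreds of steps, which is exactly the long-term memory a plain recurrent unit lacks.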


2004 ◽  
Vol 91 (3-4) ◽  
pp. 332-344 ◽  
Author(s):  
Jin Chen ◽  
Per. Jönsson ◽  
Masayuki Tamura ◽  
Zhihui Gu ◽  
Bunkei Matsushita ◽  
...  
