Vol. 3(3) July 2010
Evaluation of Seismic Hazard Parameters for Bangalore
Region in South India
Anbazhagan P. 1*, Vinod J. S. 2 and Sitharam T.G. 1
In this paper, seismic hazard parameters for the Bangalore
region are evaluated and presented using different methods and data sets, namely the
Gutenberg-Richter (G-R) recurrence relation and the maximum likelihood procedure. Seismic
data were collected from various sources for an area covering a radius of 350 km around
Bangalore. A completeness analysis was carried out using the method proposed by Stepp27.
The analysis showed that the seismic data are homogeneous for the last four decades
irrespective of magnitude. The seismic hazard parameter “b” was first estimated for the
complete data using the G-R relation. The complete data, however, do not include the
maximum reported magnitudes of 5 and above in this region, so the b value was also
evaluated for mixed data over the magnitude ranges 3.5 to 6.2 and 4 to 6.2 using the
Gutenberg-Richter6 recurrence relation. In addition, the hazard parameters “b” of the
magnitude-frequency relationship, the mean return period R and the maximum regional
magnitude Mmax were evaluated using the maximum likelihood procedure. The comparative
analysis shows that the complete and mixed data give comparable values. The “b” values
presented in this paper are higher than earlier reported values.
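The abstract does not quote the fitted coefficients; as a minimal sketch of the two estimators being compared, the snippet below fits the Gutenberg-Richter relation log10 N(>=M) = a - b*M by least squares and computes the Aki/Utsu maximum-likelihood b value for a synthetic catalogue (the magnitudes, bin width and threshold magnitude are assumptions, not the Bangalore data).

```python
# Sketch only: synthetic magnitudes, not the Bangalore catalogue used in the paper.
import numpy as np

rng = np.random.default_rng(0)
m_min, delta_m, b_true = 3.5, 0.1, 0.9
# Sample magnitudes from a G-R (exponential) distribution truncated at m_min.
mags = m_min + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=500)
mags = np.round(mags / delta_m) * delta_m            # bin to 0.1 magnitude units

# (1) Least-squares fit of the cumulative G-R relation log10 N(>=M) = a - b*M.
m_bins = np.arange(m_min, mags.max() + delta_m, delta_m)
n_cum = np.array([(mags >= m).sum() for m in m_bins])
mask = n_cum > 0
b_ls = -np.polyfit(m_bins[mask], np.log10(n_cum[mask]), 1)[0]

# (2) Aki/Utsu maximum-likelihood estimate with the half-bin correction.
b_ml = np.log10(np.e) / (mags.mean() - (m_min - delta_m / 2.0))

print(f"least-squares b = {b_ls:.2f}, maximum-likelihood b = {b_ml:.2f}")
```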
Full Text
Capacity Building for Disaster Prevention in Vulnerable
Regions of the World: Development of a Prototype Global Flood/Landslide Prediction
System
Hong Yang 1, 2*, Adler Robert F. 2, Bach Dalia 3 and Huffman George 4
With the availability of satellite rainfall analyses
at fine time and space resolution, it is now possible to develop a global flood/landslide
identification/prediction system for the most vulnerable regions by combining real-time
satellite observations with a database of global terrestrial characteristics. This
presentation describes a prototype research framework, Global Hazard System (GHS),
recently developed by a NASA-OU research partner group. Key components of the GHS-Flood
framework are: (a) a fine resolution precipitation acquisition system derived from
multi-satellites; (b) a characterization of the land surface, including digital elevation
from the NASA SRTM (Shuttle Radar Topography Mission) and topography-derived hydrologic
parameters such as flow direction, flow accumulation, basins and the river network, etc.;
(c) a hydrological model to compute rainfall infiltration and route overland runoff; and (d) an implementation
interface to relay the input data to the models and display the flood inundation
results to potential users and decision-makers. In terms of GHS-Landslide, the satellite
rainfall information is combined with a global landslide susceptibility map derived
from a combination of global surface characteristics (digital elevation topography,
slope, soil types, soil texture and land cover classification etc.) using a weighted
linear combination approach. In those areas identified as “susceptible” (based on
the surface characteristics), landslides are forecast where and when a rainfall
intensity/duration threshold is exceeded. The GHS (trial version) has been running
in near real-time in an effort to offer a practical, cost-effective solution to the
ultimate challenge of building natural disaster early warning systems for the data-sparse
regions of the world (http://trmm.gsfc.nasa.gov). The interactive GHS website
shows close-up maps of the flood/landslide risks overlaid on topography/population
or integrated with the Google-Earth visualization tool. One additional capability,
which extends the forecast lead time by assimilating the satellite rainfall into the
Global Forecast System-QPF, will also be implemented in the future.
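The abstract does not give the threshold coefficients used; as a hedged illustration of the final decision step described above, the sketch below applies a generic power-law rainfall intensity/duration threshold over cells already flagged as susceptible (the coefficients, grid values and grid size are hypothetical).

```python
# Hypothetical illustration of the GHS-Landslide decision step: the threshold
# coefficients and grids below are placeholders, not values from the paper.
import numpy as np

def exceeds_threshold(intensity_mm_hr, duration_hr, a=15.0, b=-0.4):
    """Power-law intensity-duration threshold I = a * D**b (coefficients assumed)."""
    return intensity_mm_hr > a * duration_hr ** b

# Toy 3x3 grids: landslide susceptibility (True = susceptible) and rainfall statistics.
susceptible = np.array([[1, 0, 1],
                        [0, 1, 0],
                        [1, 1, 0]], dtype=bool)
rain_intensity = np.array([[20.0,  5.0, 12.0],
                           [ 8.0, 30.0,  2.0],
                           [ 4.0, 18.0,  9.0]])    # mm/hr over the event
rain_duration = np.full((3, 3), 6.0)               # hours of sustained rainfall

# Forecast a landslide only where the cell is susceptible AND rainfall exceeds the threshold.
forecast = susceptible & exceeds_threshold(rain_intensity, rain_duration)
print(forecast)
```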
Full Text
Support Vector Machine for Evaluating Seismic Liquefaction
Potential Using Standard Penetration Test
Samui Pijush
Liquefaction of sandy soils during earthquakes causes
a large amount of damage to buildings, highway embankments, retaining structures
and other civil engineering structures. The determination of liquefaction
potential due to an earthquake is therefore an imperative task in geotechnical earthquake
engineering. This paper examines the potential of support vector machines for prediction
of liquefaction based on standard penetration test (SPT) data from the 1999 Chi-Chi,
Taiwan earthquake by developing two models. SVM achieves good generalization ability
by adopting a structural risk minimization (SRM) induction principle that aims at
minimizing a bound on the generalization error of a model rather than minimizing
the error on the training data only. In MODEL I, the cyclic stress ratio (CSR) and the
SPT blow count (N) are used to train the model for prediction of liquefaction. In MODEL II,
this is further simplified by relating the normalized maximum horizontal acceleration
(amax/g) to N for prediction of liquefaction. Further, the generalization capability of
MODEL II has been examined using different case histories available globally. Equations
have also been developed for MODEL I and MODEL II to determine the soil condition during
an earthquake. The study indicates that SVM can successfully model the complex
relationship between seismic parameters, soil parameters and the liquefaction potential.
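As a rough sketch of MODEL I (the kernel, hyperparameters and the Chi-Chi SPT case histories are not given in the abstract, so everything below is a placeholder), a support vector classifier can be trained on CSR and SPT blow count N against binary liquefied/non-liquefied labels:

```python
# Illustrative only: synthetic (CSR, N) pairs, not the Chi-Chi SPT case histories.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 200
csr = rng.uniform(0.05, 0.5, n)          # cyclic stress ratio
spt_n = rng.uniform(2, 40, n)            # SPT blow count
# Toy labelling rule: low resistance (small N) under high demand (large CSR) liquefies.
liquefied = (csr > 0.01 * spt_n + 0.08).astype(int)

X = np.column_stack([csr, spt_n])
model = SVC(kernel="rbf", C=10.0, gamma="scale")   # kernel and parameters assumed
model.fit(X, liquefied)

# Query a new site: CSR = 0.25, N = 12
print(model.predict([[0.25, 12.0]]))     # 1 = liquefaction predicted
```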
Full Text
Comparison between Prediction Capabilities of Neural
Network and Fuzzy Logic Techniques for Landslide Susceptibility Mapping
Pradhan Biswajeet 1,2* and Pirasteh Saied 2
Preparation of landslide susceptibility maps is important
for engineering geologists and geomorphologists. However, due to the complex nature
of landslides, producing a reliable susceptibility map is not easy. In recent years,
various data mining and soft computing techniques have become popular for the
prediction and classification of landslide susceptibility and hazard mapping.
This paper presents a comparative analysis of the prediction capabilities of
neural network and fuzzy logic models for landslide susceptibility mapping
in a geographic information system (GIS) environment. In the first stage,
landslide-related factors such as altitude, slope angle, slope aspect, distance to drainage,
distance to road, lithology and the normalized difference vegetation index (NDVI) were
extracted from topographic, geology and soil maps. Secondly, landslide locations
were identified from the interpretation of aerial photographs, high resolution satellite
imagery and extensive field surveys. Then landslide susceptibility maps were
produced by applying the neural network and fuzzy logic approaches using the
aforementioned landslide-related factors. Finally, the results of the analyses
were verified using the landslide location data and the neural network and fuzzy
logic models were compared. The validation results showed that the neural network model
(88% accuracy) gives better predictions than the fuzzy logic model (84% accuracy).
Among the fuzzy operators, the “gamma” operator (λ = 0.9) showed the best accuracy (84%)
while the “or” operator showed the worst (66%).
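For reference, the fuzzy “gamma” operator referred to above combines the fuzzy algebraic sum and algebraic product of the factor membership values, μ = (algebraic sum)^λ · (algebraic product)^(1-λ); the sketch below evaluates it, together with the “or” operator (commonly taken as the maximum of the memberships), for an illustrative set of membership values with λ = 0.9 as reported.

```python
# Illustration of the fuzzy operators compared in the paper; the membership
# values below are made up, not taken from the study area.
import numpy as np

def fuzzy_or(mu):
    """Fuzzy OR operator: the maximum of the factor membership values."""
    return np.max(mu)

def fuzzy_gamma(mu, lam=0.9):
    """Gamma operator: (algebraic sum)**lam * (algebraic product)**(1 - lam)."""
    algebraic_sum = 1.0 - np.prod(1.0 - mu)
    algebraic_product = np.prod(mu)
    return algebraic_sum ** lam * algebraic_product ** (1.0 - lam)

# Example membership values for one grid cell (slope, lithology, distance to road, ...).
mu = np.array([0.7, 0.4, 0.9, 0.2])
print(f"or    = {fuzzy_or(mu):.3f}")      # 0.900
print(f"gamma = {fuzzy_gamma(mu):.3f}")   # about 0.732
```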
Full Text
Using Radar Data to extend the Lead Time of Neural
Network Forecasting on the River Ping
Chaipimonplin Tawee 1,2, See Linda M. 2* and Kneale Pauline E. 3
Neural networks (NNs) and other data-driven methods are
appearing with increasing frequency in the literature for the prediction of river
levels or flows. Many of these data-driven models are tested on short lead times
where they perform very well. There have been far fewer documented attempts at
predicting floods at the longer, more useful lead times required from a flood warning
and civil protection perspective. In this paper, NN flood forecasting models are
developed for the Upper Ping catchment at Chiang Mai. Simple input determination methods
are used to automate the selection of inputs for inclusion in the model.
Lead times of 6, 12 and 18 hours are tested. Radar data inputs are then added to
these NN models to see whether the lead time of the prediction can be increased.
The models without radar data show reasonable forecasting ability up to 18 hours
ahead, but the addition of radar data extends the lead time to 36 hours ahead for
the prediction of the rising limb of the hydrograph and the flood peak.
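As a hedged sketch of the data-driven setup described above (the actual gauge and radar inputs, lags and network architecture for the Upper Ping models are not given here, so all of them are placeholders), the snippet below builds lagged upstream level and radar rainfall inputs and fits a small multilayer perceptron to predict the downstream level at an 18-hour lead time using synthetic series.

```python
# Sketch only: synthetic level/radar-rainfall series, assumed lags and lead time.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
t = np.arange(2000)
upstream = 2.0 + np.sin(t / 50.0) + 0.1 * rng.standard_normal(t.size)    # gauge level (m)
radar_rain = np.clip(rng.gamma(0.5, 2.0, t.size) - 0.5, 0, None)         # catchment rainfall
downstream = np.roll(upstream, 18) + 0.02 * np.convolve(radar_rain, np.ones(12), "same")

lead, lags = 18, [0, 6, 12]                     # forecast lead time and input lags (hours)
rows = range(max(lags), t.size - lead)
X = np.array([[upstream[i - L] for L in lags] + [radar_rain[i - L] for L in lags]
              for i in rows])
y = downstream[np.array(rows) + lead]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X, y)
print("training R^2:", round(model.score(X, y), 3))
```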
Full Text
Latest Developments of Windshear Alerting Services
at the Hong Kong International Airport
Chan P.W.
Low-level windshear and turbulence are known to pose
a hazard to aircraft operations. Over the last 10 years or so of operation of the Hong Kong
International Airport (HKIA), the detection and alerting of low-level windshear have been
continuously enhanced by the Hong Kong Observatory, including
the introduction of sophisticated instrumentation and detection algorithms to alert
pilots to the presence of low-level windshear and the use of more objective data
in the development and verification of the windshear alerting services. This paper
summarizes these development efforts in recent years. On the detection side,
following the deployment of two LIDARs, one for each of the two runways, the LIDAR
Windshear Alerting System (LIWAS) has been enhanced for the alerting of windshear
over the departure runway corridors in addition to the arrival runway corridors.
A microwave radiometer has also been set up at the airport to detect temperature
changes in the boundary layer of the atmosphere in association with terrain-disrupted
airflow. Moreover, a short-range LIDAR is being tested for the detection of small-scale
wind disturbances arising from buildings at the airport. Studies have also been conducted
on the calculation of a turbulence intensity metric based on LIDAR and Terminal Doppler
Weather Radar (TDWR) data. On the data collection side, a sophisticated algorithm
has been developed in collaboration with the National Aerospace Laboratory in
the Netherlands for processing Quick Access Recorder (QAR) data from commercial
jets, establishing a more objective database of windshear and turbulence based
on the flight data.
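As a loose illustration of what a corridor-based windshear ramp detector does (the operational LIWAS algorithm is considerably more elaborate, and the 15-knot alerting threshold and the profile below are assumptions for illustration), the sketch scans a headwind profile along a glide path and flags headwind changes exceeding the threshold over short distances.

```python
# Hypothetical sketch, not the operational LIWAS algorithm: the profile and the
# 15-knot alerting threshold here are assumptions for illustration.
import numpy as np

def windshear_ramps(headwind_kt, spacing_nm=0.1, threshold_kt=15.0, max_len_nm=4.0):
    """Flag headwind changes >= threshold over ramps shorter than max_len_nm."""
    alerts = []
    max_pts = int(max_len_nm / spacing_nm)
    for i in range(headwind_kt.size):
        for j in range(i + 1, min(headwind_kt.size, i + max_pts + 1)):
            change = headwind_kt[j] - headwind_kt[i]
            if abs(change) >= threshold_kt:
                alerts.append((round(i * spacing_nm, 1), round(j * spacing_nm, 1),
                               round(change, 1)))
                break
    return alerts

# Synthetic 3-nautical-mile headwind profile along an arrival corridor (knots).
x = np.arange(0, 3, 0.1)
profile = 10.0 + 18.0 * np.exp(-((x - 1.5) ** 2) / 0.1)   # a localized gust feature
print(windshear_ramps(profile)[:3])   # first few (start_nm, end_nm, change_kt) ramps
```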
Full Text
A 3D Fire Spread Model Implementing Vegetation Combustion
Applicable to Mountainous Forest Fires
Boboulos Miltiadis A.
Biomass combustion can be modelled on the governing equations
of mass conservation for both the gaseous phase and the solid fuel. The present modelling
examined the applicability of various chemical reactions describing dense surface
layer combustion, the production of volatiles, char and tar and their subsequent
oxidation. It employed an analytical five-step chemical reaction mechanism (EDM +
kinetics) to describe the mixing effect and the chemical reactions. The litter
combustion model was applied in a realistic 3D domain to implement the vegetation
combustion mechanisms and account for fire behaviour. The process can be evaluated on
the characteristics and ranges of its main parameters: the temperature field and its
distribution, the velocity field, and the biomass energy contents and products, measured
on the basis of expert knowledge. Numerical results were compared to integral methods
such as fire behaviour prediction nomographs and to controlled field experiments. The
modelling was refined by adjusting the constants and boundary conditions using TGA
results, experimental spread rates and IR camera measurements. The model determines
the concentration of gas components in the fire zone in order to establish their
effects on the environment for various fire intensities and scales, as well as the
temperature and fluid flow distributions and flame characteristics.
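The five-step mechanism and its rate constants are not reproduced in the abstract; as a generic single-step illustration of the kinetics side of an EDM + kinetics treatment, the sketch below integrates a first-order Arrhenius devolatilisation rate for the solid fuel (the pre-exponential factor, activation energy and temperature are assumed values).

```python
# Generic illustration of a single kinetic step: the paper's five-step EDM + kinetics
# mechanism and its rate constants are not reproduced here; A and E below are assumed.
import numpy as np

R = 8.314                     # universal gas constant, J/(mol K)
A, E = 3.0e3, 6.0e4           # pre-exponential factor (1/s) and activation energy (J/mol), assumed

def pyrolysis(m0=1.0, temperature_K=700.0, dt=0.01, t_end=20.0):
    """First-order Arrhenius devolatilisation dm/dt = -A*exp(-E/(R*T))*m."""
    k = A * np.exp(-E / (R * temperature_K))
    t = np.arange(0.0, t_end, dt)
    m = m0 * np.exp(-k * t)          # analytical solution of the first-order ODE
    return t, m

t, m = pyrolysis()
print(f"rate constant k = {A * np.exp(-E / (R * 700.0)):.3f} 1/s, "
      f"remaining solid fuel after 20 s = {m[-1]:.3f}")
```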
Full Text