Wednesday, August 26, 2015

The Business Model and Database Design

What is a "relational database"? You can look it up on Wikipedia:

A relational database is a digital database whose organization is based on the relational model of data, as proposed by E.F. Codd in 1970.[1] This model organizes data into one or more tables (or "relations") of rows and columns, with a unique key for each row. Generally, each entity type described in a database has its own table, the rows representing instances of that type of entity and the columns representing values attributed to that instance. Because each row in a table has its own unique key, rows in a table can be linked to rows in other tables by storing the unique key of the row to which it should be linked (where such unique key is known as a "foreign key"). Codd showed that data relationships of arbitrary complexity can be represented using this simple set of concepts.

The definition goes on to explain the differences from hierarchical data structures, and so on.  It's technically correct, but it doesn't tell the whole story.  To me, a relational database defines how logical subsets of data are related, and how those relationships defend the integrity of the business model the database supports.

Data Tables

In general, the tables in a database should be as small as possible, consisting of the fewest columns needed to define a unique record.  The links to other tables define fundamental relationships between the data - "parent-child," for example.  When constructed, the entire database defines not only these relationships, but how data flows through the business process it supports.  A well-designed database defines the entire business model and can accommodate changes and additions with minor modifications.  This happens when the designer spends enough time with his or her feet on the ground to understand the business process, and creates a data structure that is granular - almost molecular - in its composition.  This takes the most time to create, but it also produces the most flexible and long-lasting structure.  There are many other considerations, but there is no substitute for the really hard work of defining the business model with the database.

A relational database is not a spreadsheet - or a collection of spreadsheets.  A spreadsheet makes sense for a two-dimensional representation of data and is used primarily to inform the human eye.  It works great for the eye because we can quickly relate to the two dimensions and peer down into the individual pieces of data.  But it's not efficient for a computing engine - something designed to find and extract pieces of information as quickly as possible.  The example below shows how the eye can quickly find a measurement by triangulating between the dates in the rows and the instruments in the columns. Suppose you were asked to find OW-12 on 7/20/94:

While the eye can do this in an instant, it's a very inefficient form of data storage.  One way to understand why is to look at the column headers.  They are unique for each instrument, so they essentially require a custom data structure.  If you add a new instrument, you change the table structure.  Every time you search, you don't know which column the result will be in.  Compare that structure with the following.

The table above would be used in a relational database to store the data shown in the spreadsheet. The data has been 'normalized' to minimize redundancy and provide an efficient search path.  It consists of only three columns no matter how many instruments you have.  The first two columns - the date and the sensor name - define a unique record. To find the record we found in the spreadsheet, we work from left to right: first the date, then the sensor, then the value.  Not as easy for the eye perhaps, but much easier for a database.
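The normalized structure described above is easy to sketch with SQLite via Python's built-in sqlite3 module. The table and column names, and the measurement values, are illustrative - only the three-column shape comes from the text:

```python
import sqlite3

# A minimal sketch of the normalized measurement table: one row per
# (date, sensor) pair. Names and values here are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE readings (
        read_date TEXT NOT NULL,
        sensor    TEXT NOT NULL,
        value     REAL,
        PRIMARY KEY (read_date, sensor)  -- date + sensor defines a unique record
    )
""")
con.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [("1994-07-20", "OW-12", 123.45),
     ("1994-07-20", "OW-13", 119.02),
     ("1994-07-21", "OW-12", 123.51)],
)

# Finding OW-12 on 7/20/94 is a single indexed lookup, and adding a new
# instrument adds rows, not columns.
row = con.execute(
    "SELECT value FROM readings WHERE read_date = ? AND sensor = ?",
    ("1994-07-20", "OW-12"),
).fetchone()
print(row[0])
```

Note that the spreadsheet's custom column structure has disappeared: a new instrument is just another value in the `sensor` column.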

So how is this structure used to define a Business Model?  By first defining what constitutes a unique record in each table (its Primary Key), and then creating relationships between tables, you determine how the data will represent your business model.


Example - Customers, Contracts and Plants

Business Model 1

Let's assume your database describes a customer, the contracts you have with that customer, and the plant where the work will be done.  Maybe you first consider a simple business model like:

" Plants and Contracts belong to a Customer.  Multiple Projects can be grouped under Contracts"

This can be represented with a simple organization chart as follows:

Business Model 2

What if another customer then presented you with another business model scenario, like:
"Project Numbers are specific to Plants, with the possibility of multiple projects under a single contract"
You need to be able to define something like the structure shown in the figure below where Plants are related to Projects:

Business Model 3

Then another customer presents you with another business model:
"Two separate customers with their own contracts and projects are using the same plant."
Unless you want to spend all your time programming, you want your database design to be able to represent and enforce the integrity of ALL the business models you have to support. The figure below shows an actual database design that supports the above scenarios.
It's not as complicated as it looks.  In fact, it almost looks like the data structure for Business Model 1.  The key is that Plants is not connected directly to Customers.  It is linked both to Projects (Contract_Projects) and to Customers.  The link to Customers is not direct either: it goes through a "relationship table" where the relationships between plants and customers are defined.   All the relationships defined by business models 1 through 3 are supported by this structure - without data redundancy.
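A minimal sketch of this kind of structure, again using Python's built-in sqlite3: all table and column names are hypothetical, but the key idea - a relationship table between customers and plants, so two customers can share one plant without duplicating plant records - comes straight from the scenario above.

```python
import sqlite3

# Illustrative schema: Plants link to Customers only through a
# relationship table, and to Contracts through Contract_Projects.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE contracts (contract_id INTEGER PRIMARY KEY,
                            customer_id INTEGER REFERENCES customers);
    CREATE TABLE plants    (plant_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE contract_projects (
        project_id  INTEGER PRIMARY KEY,
        contract_id INTEGER REFERENCES contracts,
        plant_id    INTEGER REFERENCES plants);
    -- relationship table: many customers can share one plant (Model 3)
    CREATE TABLE customer_plants (
        customer_id INTEGER REFERENCES customers,
        plant_id    INTEGER REFERENCES plants,
        PRIMARY KEY (customer_id, plant_id));
""")

# Two customers, each with their own contracts, using the same plant:
con.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Customer A"), (2, "Customer B")])
con.execute("INSERT INTO plants VALUES (1, 'Shared Plant')")
con.executemany("INSERT INTO customer_plants VALUES (?, ?)", [(1, 1), (2, 1)])

n = con.execute(
    "SELECT COUNT(*) FROM customer_plants WHERE plant_id = 1").fetchone()[0]
print(n)  # both customers linked to one plant row - no redundancy
```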

Tuesday, August 25, 2015

Monitoring Dam Safety

Monitoring a dam is a lot harder than you think.  Some might think that it's easy because, in general, dams don't move.  But it's precisely for this reason that dam monitoring is difficult and requires a special discipline.  How to stay interested when there isn't anything interesting going on?

Did you know that many dams in the US have reached or exceeded their design life?  We don't build many new dams because it's difficult to get a dam through the approval process, and partially because of that, the cost is excessive. We are increasingly dependent on aging dams (like our aging infrastructure as a whole).  A dam past its design life isn't necessarily in danger of failing.  In fact, in all likelihood most well-designed and well-built dams will still be sitting there when the lakes behind them have filled with silt.  But predicting behavior in an earthquake, for example, becomes difficult. We can't say with certainty that a dam 10 years past its 50-year design life (60 years old) will behave a certain way in an earthquake, since we don't have similar observations to go by.  So we must watch our dams a little more carefully.

How do we watch a dam?  In general there are four basic types of measurements one can take to monitor the long term condition of an embankment dam:

  1. Pore pressure measurements
  2. Surveying surface points
  3. Seepage measurements
  4. Visual observation
The first three methods can be automated and often are.  Of these three, pore pressure measurement is the simplest way to get a direct measurement of the dam's current condition compared to its theoretical design.

Pore Pressure Measurements
The proper design of a dam requires an understanding of how water will ultimately flow through it.  As the diagram below shows, the idea is to NOT let the phreatic surface, or top of the saturated zone, reach the toe of the dam with enough pressure behind it to flow with destructive force.  Many dam designs incorporate clay cores and gravel drains to prevent seepage from reaching the front shell of the dam.  Many dam monitoring systems use buried pressure transducers - called "piezometers" - to measure the pore water pressure in the dam and define the phreatic surface.  These devices are buried in the dam and can measure water pressure even when very small amounts of water are present.

Figure 1 - Flow net through embankment dams with and without drain blanket

In general, one would like to see measured pressures fall within certain operational limits based on reservoir head and the location of the pressure-sensing element.  A pressure-sensing element placed anywhere within a dam should read close to its design pressure - defined by the design phreatic surface.  In the event of an earthquake (if the dam is located in a seismic zone), the before and after pressures should be the same.  If they are not, one has to quickly determine why.  A certain amount of increased pressure might be expected in the less permeable portion of the dam due to pore pressure build-up during seismic shaking.  But if the dam structure fails in any way that allows a more direct connection to reservoir head, this could cause critical failure. The plot below shows historic pressure readings in a dam in Northern California compared to reservoir elevation data.

Figure 2 - Historic piezometer and reservoir elevation plot
Historical Trends
Having piezometers wired to a recording system allows pore pressure to be monitored at regular intervals in all conditions.  This provides a historical perspective, not only for evaluating dam safety, but also for understanding the actual conditions within the dam.  A piezometer like that shown in Figure 2 shows an attenuated response to reservoir level, but at a lower pressure than full reservoir head.  This is a normal response for a piezometer located in the interior of the dam.
Some piezometers located above the phreatic surface will exhibit no response to reservoir head (Figure 3) and some in highly permeable zones (for example in the dam abutment) will mimic reservoir level when water levels are above their tip elevation (Figure 4). 
Figure 3 - Historic piezometer plot with no response
Figure 4 - Historic piezometer plot that follows reservoir level
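The comparison against operational limits described above can be sketched with the standard conversion from pore pressure to piezometric elevation (head equals tip elevation plus pressure divided by the unit weight of water). The elevations, reading, and tolerance below are illustrative, not taken from any actual dam:

```python
# Convert a piezometer's pressure reading to a piezometric elevation for
# comparison against the design phreatic surface. The unit weight of water
# is 62.4 lb/ft^3; all site numbers below are hypothetical.
UNIT_WEIGHT_WATER = 62.4  # lb/ft^3

def piezometric_elevation(tip_elev_ft, pressure_psf):
    """Elevation of the water column implied by the measured pore pressure."""
    return tip_elev_ft + pressure_psf / UNIT_WEIGHT_WATER

def within_limits(piezo_elev_ft, design_elev_ft, tolerance_ft=5.0):
    """Flag readings that stray from the design phreatic surface."""
    return abs(piezo_elev_ft - design_elev_ft) <= tolerance_ft

# 624 psf of pore pressure corresponds to 10 ft of water above the tip:
elev = piezometric_elevation(tip_elev_ft=410.0, pressure_psf=624.0)
print(elev)                                     # 420.0
print(within_limits(elev, design_elev_ft=422.0))  # True
```

Run over a historical time series, the same check flags the kind of post-earthquake pressure jump described above.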
All of these piezometer time series provide a signature - like an electrocardiogram (EKG) does for a heart - of the interior conditions within a dam.  From the outside a dam may look static and unchanging, but electronic monitoring of the dam's interior pressures provides a more dynamic picture - one that changes with the seasons and with age.  It's an interesting and invaluable perspective for assessing the health of our aging dams. 

Thursday, July 9, 2015

The Way of the Ship

“All deep sea shipping must survive in the face of fierce international competition.  A more virile nation with mariners not accustomed to expensively too good conditions can always arise and run the softer ships from the face of the sea.”  Alan Villiers, 1952, “The Way of a Ship”
I love that line: "the softer ships from the face of the sea."  It's a brutal poetics that has me first smiling and then feeling scared every time I read it.  When I first read this sentence in a book about the 'romantic' days of Clipper commerce, I was struck by how true it is now for any business.  When this book was written the world was much larger than it is today, and we competed internationally very infrequently.  Now we compete internationally in almost everything.  I've done some business in east Asia, and every time I go I'm amazed at how hard people work.  The competition is fierce.  On one recent trip I had the chance to talk to a head IT guy at Oracle who is in charge of setting up server farms all over the world.  He told me that they would rather hire a worker from Asia than from the US - primarily because workers in Asia are used to working harder and don't have an "8 to 5" mentality.
Really - if you are working with anyone in any part of the world besides your own time zone, you can't have this mentality.  You must adapt a part of your schedule to fit with a compatriot on the other side of the world.  When I have to communicate with someone in Taiwan I have to wait until at least 4 PM - and that is just the beginning of their day.  And they are one day ahead of us: their week begins on our Sunday and ends on our Thursday.  I'm sure many people working in Silicon Valley know this.  But many people in the US don't - and it's a severe handicap to our competitiveness in the modern world.  We've become soft in many sectors and out of sync with the real working world, with our expectation of working only between the hours of 8 AM and 5 PM (at some public agencies the day seems to end at 3 PM).  I'm not going to call this "entitlement," because everyone says that - and everyone wants to be entitled.  Let's say we 'deserve' it.  We'll take what we can get.  But the question is whether we can afford it.
Today there is another "virile" nation or two that are less used to the good life.  Are we just about to be wiped from the face of the sea?  Have we already been wiped from the face of the sea?

Friday, March 13, 2015

Benefits of Automated Monitoring

From a paper published by the author describing the use of tiltmeters for advance warning of natural disasters (landslides) and of construction-induced movements.  Both examples show close correlation between instrument readings and observed phenomena.  For those who wonder about the application of trigonometry in the real world - see Case History 2.

Because of recent advances in monitoring systems, their benefits are within reach of large and small civil engineering firms, and government agencies.  When included as part of the project design, monitoring systems can reduce costs that would otherwise be incurred as a result of overly conservative design assumptions.  Monitoring new or existing construction helps maintain safe working conditions by providing early warning of instability and therefore time to correct the problem.  Also, monitoring provides valuable quality assurance for verifying that as-built construction conforms to design.


This project shows that a relatively simple, easy-to-install automated system can provide early warning of catastrophic slope failure.

Project Description
The winter of 1998 brought devastating rain and landslides to California.  Several automated remote systems were designed and installed to monitor the stability of slopes during repair and cleanup.  One of these systems involved monitoring at the top of the slope in the vicinity of the Laguna Niguel landslide in southern California.  Slope movements long before the slide revealed that the homes had been built too close to the edge of a steep slope.  The safety of the remaining residents was the primary concern.  Several homes had to be abandoned, and a program of automated monitoring was implemented to provide advance warning of imminent slope failure. 
Automated Instrumentation
The instruments installed to monitor the landslide consisted of an array of in-place inclinometers (IPIs) buried in shallow holes along the top of the slope, between the residences and the break in slope (Figure 1). 
Figure 1 - Plan view of instrument locations at Laguna Niguel
An inclinometer measures its own rotation and, therefore, the rotation of the structural element or portion of ground to which it is connected.  These instruments are particularly adept at measuring movement of landslides occurring on circular slip surfaces.  However, even landslides that are predominantly translational will produce tilts that are easily detected with conventional tiltmeters or inclinometers.  If a slope is moving, tiltmeter surveying can determine the direction of movement, delimit the areas of deformation and, in many cases, reveal the mechanism of movement (slumping, slope creep, settlement, etc.) (Figure 2).
Figure 2 - Tilt vectors show different slope failure mechanisms
In the case of the Laguna Niguel slide, the shallow inclinometers measured near-surface movements in the critical area between the free face of the slope and the house foundations.  With sensitivities on the order of 0.1 mm/m, the buried inclinometers can measure small movements that are unobservable by other techniques.
The inclinometers were sanded into holes excavated with a hand-auger, and the cables were routed along the ground surface to an automated data acquisition system equipped with a phone modem and autodialer.  Threshold limits were used to trigger the autodialer and notify the project engineers via pager.  Measurements for comparison with the thresholds were taken once per minute, and recorded into the datalogger every fifteen minutes.
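The sampling-and-alarm logic just described can be sketched as follows. The threshold value and the notification hook are hypothetical stand-ins for the actual pager/autodialer configuration:

```python
# Sketch of the datalogger logic: sample once per minute, record every
# fifteenth reading, and flag any reading past a threshold. The threshold
# below is illustrative, not the value used at Laguna Niguel.
THRESHOLD_DEG = 0.05  # hypothetical rotation threshold, degrees

def scan(readings, threshold=THRESHOLD_DEG, log_every=15):
    logged, alarms = [], []
    for minute, angle in enumerate(readings):
        if minute % log_every == 0:
            logged.append((minute, angle))  # stored in the datalogger
        if abs(angle) > threshold:
            alarms.append(minute)           # would trigger the autodialer
    return logged, alarms

# 30 one-minute samples, with a step past the threshold at minute 20:
readings = [0.01] * 20 + [0.08] * 10
logged, alarms = scan(readings)
print(len(logged), alarms[0])  # 2 records logged, first alarm at minute 20
```

Checking every minute but recording every fifteen keeps the stored record compact while losing none of the alarm responsiveness.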

Upon installation and activation, the inclinometers showed a pattern of ground movement consistent with a “slump” type failure.  Inclinometers outside of the headscarp area showed no movement.  Those in the crown area rotated toward the slope.  And those in the headscarp rotated upslope, indicating backwards rotation of the slump block.  The movement was at a constant velocity for over 10 days.  Then several inclinometers showed a very gradual increase in the rate of movement over the next 7 to 10 days.  On March 15th, the 20th day after monitoring commenced, several inclinometers showed a distinct increase in the rate of movement, indicating a change in state of stability of the landslide mass (Figure 3).
Figure 3 - Recorded relative angular rotation versus time shows change in rate
The failure occurred in the early morning hours of March 19, 1998 in rather dramatic fashion (Figure 4).  At 2 AM, part of the fill slope failed.  Two homes located in the headward part of the slump were destroyed as they were carried downward on the upper part of the slump. Three other homes were left partly cantilevered over the crown and main scarp of the slump. On the 20th, one of the partly cantilevered homes was destroyed as it fell over the main scarp.  As the displaced material comprising the slump moved downslope, the toe (the most distal part) of the landslide moved against and destroyed five condominiums at the base of the fill slope.
Figure 4 - False color infrared view of Laguna Niguel landslide (from Geo-Tech Imagery Intl.)
Advance warning
The autodialer was activated at approximately 4 AM on March 17th.  The increase in rate of movement was significant enough to cause the evacuation of several more houses in the vicinity of the headscarp (two houses had already been evacuated when the automated instrumentation was installed).  Four days after the significant change in rate was observed, and two days after the alarm was triggered, the slope failed catastrophically.
The use of the buried inclinometers to monitor ground movement gave a minimum of 4 days advance notice of the impending catastrophic failure.  Sensitive instruments such as these can measure movements much smaller than can be observed with the naked eye.  A continuous record of instrument readings provides important information about the nature of subsurface movements, and aids engineers and earth scientists in establishing thresholds for failure monitoring.

This project shows that sensitive automated instrumentation can be used to distinguish between normal and construction-induced movement of a bridge.  Real-time monitoring can incorporate modeling of normal movements to establish “baseline” behavior.  Alarming on the difference between modeled behavior and measured behavior results in almost instantaneous notification of excessive construction-induced movements, which streamlines the construction process.

Project Description
In June of 1999, Hayward Baker performed compaction grouting of the foundation soils beneath Laurel Street Bridge in Santa Cruz, California.  This work was performed as part of an extensive program of seismic upgrades to many of California’s bridges after the 1989 Loma Prieta earthquake.  The Laurel Street Bridge is a cast-in-place reinforced concrete structure that spans approximately 350 feet (106.7 m) across the San Lorenzo River near downtown Santa Cruz.  It is supported on a battered pile foundation.  Each of the two side spans is approximately 100 feet (30.48 m) long.  The length of the center span is 150 feet (45.72 m).

Automated Instrumentation
The project specifications limited vertical bridge deck movement to 0.1 inch during any grouting episode.  A good rule of thumb is to use an instrument with at least 20 times higher resolution than the minimum specified movement.  High-resolution tiltmeters are one of the few instruments that can reliably measure angles smaller than 1 arc second.  The tiltmeters used were Model 800 and Model 711, manufactured by Applied Geomechanics.  The tiltmeters have a published resolution of between 0.25 and 0.5 arc seconds – or 50 to 100 times smaller than the maximum allowable movement.
To convert rotation measured by the tiltmeters to displacement requires integration of the angular measurements over some finite length.  The rigidity of the structure allows for a fairly simple model for calculating displacements.  The tiltmeters measuring rotation parallel to the bridge axis were used to measure vertical movement of the bridge deck between the abutment and support piers.  For this purpose the abutment is assumed to be a fixed point, and the bridge deck is assumed to be rigid (Figure 5).  Vertical displacement (heave) is then calculated by assuming the angle of rotation, theta, measured by the tiltmeter is occurring over the entire span.  Heave (h) is therefore calculated as h=(70ft)*sin(theta), where theta is the angle measured with the mid-span tiltmeter. 
Figure 5 - Math in action.  The simplified model for calculating bridge deck displacement.
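The simplified heave model is easy to check numerically. A sketch, taking the 70 ft span length from the text (the sample angles are illustrative):

```python
import math

# Heave from the simplified model: rotation measured at mid-span is assumed
# to act over the full 70 ft between abutment and pier. One arc second is
# 1/3600 of a degree.
SPAN_FT = 70.0

def heave_inches(theta_arcsec):
    theta_rad = math.radians(theta_arcsec / 3600.0)
    return SPAN_FT * 12.0 * math.sin(theta_rad)  # feet -> inches

# A 0.5 arc-second tiltmeter resolution translates to roughly 0.002 inch of
# resolvable heave, and the 0.1 inch limit corresponds to about 24.6 arc
# seconds of rotation:
print(round(heave_inches(0.5), 4))   # 0.002
print(round(heave_inches(24.6), 3))  # 0.1
```

This also makes the instrument-selection rule of thumb concrete: 0.002 inch of resolution is about 50 times finer than the 0.1 inch specification.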

The tiltmeters were monitored continuously using a Campbell Scientific CR10X datalogger.  Alarm thresholds were used to activate a strobe light in the event of excessive movements.  Four of the six tiltmeters were installed near the joining of the support columns and bridge deck to provide a first indication of movement transferred through the footing to the deck.  Two of the tiltmeters were installed along the span midway between the footing and abutment to measure changes in deck elevation (Figure 6).
Figure 6 - Location of tiltmeters used to measure bridge deck movement

Figure 7 shows the results obtained from tiltmeter 21, mounted on the eastern span, during grouting beneath the east footing.  The tiltmeter shows excellent correlation with the average of four vertical survey points on the bridge deck throughout the 60+ day monitoring period.  Moreover, the tiltmeter is able to accurately measure displacements of less than 0.02 inch (0.5 mm) - approximately 10 times better precision than conventional surveying.  The real benefit of this approach, however, is the ability to measure and respond to bridge movement in real time.
Figure 7 - Calculated displacement from Tiltmeter 21 compared to average of four survey points
Real Time Modeling
The sinusoidal nature of the data obtained from the tiltmeters reflects the thermoelastic expansion and contraction of the bridge due to diurnal temperature changes.  After the onset of baseline monitoring, it was discovered that the normal daily movement of the bridge was about the same as the specified maximum allowable movement.  Distinguishing the normal daily movements of the bridge from those caused by the compaction grouting turned out to be the most challenging aspect of the job.
In this instance it was decided to model the diurnal bridge motion with a simple sine-wave function that includes parameters to adjust the amplitude, phase, and symmetry (skewness) of the waveform.  This is relatively easy to program within the datalogger and results in alarms that are responsive to grout-induced movement (Figure 8).
Figure 8 - Modeled diurnal bridge movement used to establish baseline for alarming
Periodic adjustment of these parameters was necessary to account for variations in the diurnal behavior - caused, for instance, by the increasing firmness of the foundation as the grouting proceeded.  The program was written to activate a flashing light when the difference between the model and the measured values exceeded 0.1 inch.  The flashing light was a signal to the grouting operators to cease pumping within the current stage and move up to the next stage.  After five minutes the program turned off the light and “re-zeroed” the alarm threshold by bringing it into conformance with the current tiltmeter reading.
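The baseline-model alarm can be sketched as follows. The amplitude and phase values are illustrative, and the skewness parameter used on the job is omitted for simplicity; the point is that the alarm fires on the residual between measurement and model, not on raw displacement:

```python
import math

# Minimal sketch of alarming against a modeled diurnal baseline.
AMPLITUDE_IN = 0.08  # illustrative daily swing, inches
PHASE_HR     = 14.0  # illustrative hour of peak thermal expansion
LIMIT_IN     = 0.1   # specified maximum grout-induced movement

def modeled_displacement(hour):
    """Predicted thermoelastic movement at a given hour of the day."""
    return AMPLITUDE_IN * math.cos(2 * math.pi * (hour - PHASE_HR) / 24.0)

def grout_alarm(hour, measured_in, offset_in=0.0):
    """True if measured movement departs from the diurnal model by > LIMIT_IN.
    offset_in represents the 're-zero' applied after an alarm is acknowledged."""
    residual = measured_in - modeled_displacement(hour) - offset_in
    return abs(residual) > LIMIT_IN

print(grout_alarm(14.0, AMPLITUDE_IN))         # normal daily peak: False
print(grout_alarm(14.0, AMPLITUDE_IN + 0.15))  # grout-induced step: True
```

Alarming on the residual is what lets a 0.1 inch specification coexist with normal daily movement of about the same size.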

Automated data acquisition systems allow for instruments to be sampled at any rate - typically multiple samples per minute, per hour, or per day.  Measurement accuracy is improved, and data can be remotely processed to provide useful information to the project team.  Other benefits of automated data acquisition include:
  • Human errors associated with manual reading and data transfer are virtually eliminated.
  • Data collection can be performed easily at remote site locations and in bad weather.
  • Data are available 24 hours per day.
  • Precise timing of structural and geotechnical events makes it possible to correlate them with external factors such as rainfall, earthquakes, grading, and repairs.
  • Continuous monitoring means that critical changes can be detected quickly, so that action can be taken before adverse conditions become worse.
  • Automated data acquisition systems can be programmed to monitor threshold values and rates of change and, therefore, can issue automatic warnings when predetermined limits are exceeded.

Wednesday, March 11, 2015

Cold Circuits versus the Naked Eye

Musings on Automated Monitoring in the Digital Age
I was introduced to the practice of "automated monitoring" in the civil/structural profession, where we connected sensors that measured displacement, tilt, strain, or water level to computers to monitor physical processes, primarily for advance warning purposes.  For example, we monitored slopes and excavations for failure, bridges and buildings during construction, and dams for post-earthquake displacement.  Back when we started, we were just learning about collecting digital measurements and finding timely ways to turn them into usable information.  The Internet was in its infancy, before broadband and mobile appliances.  At that time you could hear an interesting argument going on in publications and at conferences between those of us in the new school, who were excited by what we saw in the digital signature of an analog instrument, and the old school, who were trained to 'observe' the physical world with the naked eye. The Master of the school of observation was a man by the name of Karl Terzaghi.  I call him a Master because he was a great, great scientist and engineer who made a huge mark on the practice of civil engineering and geology (see, for example, Karl Terzaghi: The Engineer as Artist). He was a renaissance man and a natural observer - one who tried to see the physical world without prejudice, and who used his technical expertise and human understanding - intuition perhaps - to solve many engineering problems.  His teachings and books informed an entire generation of geotechnical engineers, and I came along at the beginning of the digital age and was taught by his pupils.  One foot in flesh-and-blood engineering and one foot in the digital age.
Many engineers back then were concerned that automated monitoring would result in a “black box” approach, where visual examination and experience are replaced by cold circuits and relays to provide advance warning of potential problems.  I suppose this could happen.  At a minimum we would have to acknowledge that communication technologies are not infallible, and that waiting for an alarm transmitted through some wires, rather than going out and using active observational techniques, would be foolish.  Terzaghi and his students advocated going out and looking for problems.  He thought that was the best way to find issues before they became big problems.
Another point of view argued at that time was that data obtained from continuous monitoring can provide the engineer with new information about the behavior of the structure being monitored that is not apparent to the naked eye.  This added perspective can actually increase the engineer’s knowledge base, giving them a deeper understanding of structural response.  Furthermore, the utilization of real-time monitoring systems in combination with advanced analysis tools enables designers to take economic advantage of these new insights without sacrificing construction safety.
Are cold circuits better than flesh and blood?  Ironically, in a few short years we've probably travelled to the other end of the spectrum.  Walking across a busy street with our eyes glued to our progress on a digital map?  Hopefully it's not that bad - although I think I've seen it.  Perhaps it wouldn't be a bad idea to study the teachings of people like Terzaghi and Ralph Peck, even in non-engineering fields, to remind ourselves of what the naked senses are and how they have been applied to accomplish some incredible things.  Automated monitoring has definite benefits for the welfare of our society and environment.  But we should get outside and feel our environment at the same time.