ANAMAR News

Stay up to date with our latest news.

ANAMAR’s Work in Charleston Harbor Profiled in "Dredging Today"

(Pictured above is the Cooper Marl we frequently encountered while collecting core samples)

The Charleston Harbor Federal Navigation Channel covers approximately 14 square miles and is formed by the confluence of the Ashley, Cooper, and Wando rivers. Maritime interests want the harbor channel deepened beyond 45 feet so the Port of Charleston can handle the larger container ships that will routinely call when the expanded Panama Canal opens in 2015. To address this need to accommodate larger ships and increasing ship traffic, a feasibility study is being conducted for the Charleston Harbor Navigation Improvement Project. The Post-45 feasibility study examines the economic benefits and environmental impacts of the deepening project and determines what depth would be recommended for construction. ANAMAR was contracted to conduct sediment evaluations to determine whether the proposed dredged material is suitable for disposal at the Charleston Harbor ocean dredged material disposal site (ODMDS) and to help identify potential beneficial uses for dredged material, such as habitat development, shore protection, or beach nourishment.


ANAMAR managed all sampling operations and worked closely with subcontractors to coordinate logistics. The sampling plan included collection of vibracore samples at 105 sites, grab samples at the reference station, and site water samples at three locations for elutriate preparation. Due to the size of the project, the sampling effort took nearly 4 weeks to complete and presented some unique challenges. Inclement weather from Tropical Storm Sandy, followed 2 days later by a winter storm, resulted in minor delays in sampling operations. The area also experiences six-foot tidal fluctuations that produce very strong currents on incoming and outgoing tides, so the sampling team had to plan daily sampling operations around workable currents (i.e., slack tides). Because sampling took place within the shipping channel and berthing areas, the captain maintained regular communication with ships so that sampling would not interfere with shipping traffic.

The physical composition of the sediment itself also proved challenging. Most of the sediment in the areas of interest was highly consolidated Cooper Marl, which was difficult to penetrate and to remove from the core barrel. A method was developed in the field to pressurize the core barrel with compressed air to extrude sample material from the barrel. This "on-the-fly" innovation helped the field effort stay on schedule.


Coordinating sample delivery with the chemistry and bioassay laboratories while field operations were ongoing required multiple shipments, given the holding times and the time needed to collect all the samples. The bioaccumulation tests had to be run in two batches because of holding times and the laboratory space required for such a large number of samples. Close coordination with the laboratories and couriers was critical.


ANAMAR succeeded in collecting all the required sample material and processed and shipped it to the laboratories within holding times. ANAMAR reviewed and evaluated all the laboratory data and produced a report summarizing the results of the physical, chemical, and toxicological analyses of sediment, elutriate, water, and tissue samples of the proposed dredged material collected from the project area.

Below is a quotation from the Dredging Today news article "Post 45 Project Gets Funding" (July 2, 2015):

"The Charleston Harbor Post 45 Deepening Project is the first project in the U.S. Army Corps of Engineers to go through the Corps’ new Civil Works Planning Process from start to finish.

This has enabled the Charleston District to reduce the initial study timeline of five to eight years down to less than four years, and reduce the initial study budget from $20 million to less than $12 million dollars. This project will serve as a model for Corps civil works projects around the world."


ANAMAR's Recent Work Profiled in International Dredging Review

ANAMAR’s recent work with the U.S. Army Corps of Engineers–Jacksonville District to sample Jacksonville Harbor has been profiled in International Dredging Review. The project is part of the Jacksonville Harbor Deepening; Jacksonville is one of the five major ports named in President Obama’s 2012 “We Can’t Wait” initiative. Check out the International Dredging Review news article to learn more!


EPA Entering Public Comment Period for New Clean Power Plan

EPA has won a Supreme Court ruling on proposing emission guidelines for greenhouse gases emitted by fossil-fuel-fired electric generating units. Although power plants face limits for pollutants such as arsenic and mercury, there are no national limits on carbon emissions. On June 2, 2014, under President Obama’s Climate Action Plan, EPA proposed the Clean Power Plan to cut these emissions. According to EPA’s website, power plants account for one-third of all domestic greenhouse gas emissions. The plan is predicted to cut power-sector carbon emissions by 30% from 2005 levels and to cut pollutants that cause soot and smog by more than 25% by 2030. EPA estimates the public health benefits at $55 billion to $93 billion in savings by 2030, including the prevention of 2,700 to 6,600 premature deaths and 140,000 to 150,000 asthma attacks in children. The plan is also predicted to shrink power bills by 8% by 2030. During the week of July 28, 2014, EPA will hold four public hearings on the proposed Clean Power Plan in Atlanta, GA; Denver, CO; Pittsburgh, PA; and Washington, DC.

EPA is now holding a public comment period. All public comments on the Clean Power Plan Proposed Rule must be received by October 16, 2014. Directions for how to comment on the Clean Power Plan Proposed Rule can be found here.

A copy of EPA’s Clean Power Plan Proposed Rule is published in the Federal Register and can be found here.


ANAMAR’s President Nadia Lombardero Co-Presenting a Paper with EPA Region 4 at the 33rd PIANC World Congress

Nadia Lombardero is in San Francisco at the 33rd PIANC World Congress, where the theme is “Navigating the New Millennium.” Nadia is co-presenting a paper with EPA Region 4 entitled “Monitoring, Assessment and the Remediation of Elevated PCB Levels at a Deepwater ODMDS.” Ocean dredged material disposal sites (ODMDSs) are critical components of the nation’s navigation requirements and national security. Disposal site monitoring is required by the Marine Protection, Research, and Sanctuaries Act of 1972 (MPRSA) and is conducted to ensure the environmental integrity of a disposal site and the surrounding areas, to keep contaminants out of the food chain, and to maintain compliance with the law.

 


Port Aransas Sampling Expected to Begin Tomorrow

 

ANAMAR’s sampling team, Terry Cake and Manager Michelle Rau, is in the Corpus Christi Bay area and will start sampling operations tomorrow. The sampling will be performed at the federally maintained Corpus Christi Ship Channel and the offshore ocean dredged material disposal site (ODMDS). Good luck to the sampling crew!

 


ANAMAR Awarded Contract with the U.S. Army Corps of Engineers, Galveston District

 

The U.S. Army Corps of Engineers, Galveston District has awarded ANAMAR a contract to perform environmental services for the collection and analysis of water and sediment samples.  The sampling will be performed at the federally maintained Corpus Christi Ship Channel and the offshore ocean dredged material disposal site (ODMDS).

The purpose of the testing is to evaluate shoal material prior to maintenance of the channel to determine whether unacceptable impacts would result from dredging operations.  The Marine Protection, Research, and Sanctuaries Act of 1972 (MPRSA) prohibits placement of material into the ocean that would unreasonably degrade or endanger human health or the marine environment.

USACE-Galveston District employs an environmental management framework to provide structure and accountability within its business processes to help enhance and expand the positive impacts of its mission while reducing, mitigating, or eliminating potential negative impacts.

Work is expected to begin in May 2014.

 


NOAA Publishes Data Collected from the Deepwater Horizon Oil Spill

On April 20, 2010, the Deepwater Horizon offshore oil drilling rig exploded and started leaking oil 5,000 feet below the ocean’s surface in the Gulf of Mexico. A process known as a Natural Resource Damage Assessment (NRDA) was set in motion under the Oil Pollution Act of 1990. With the combined help of many federal and state agencies, private industries, and academic institutions, years’ worth of data have been collected and the analytical chemistry results from the Deepwater Horizon oil spill have finally been made available to the public.

Here is a link to the site where you can find those results:

http://www.nodc.noaa.gov/deepwaterhorizon/specialcollections.html


Word from the Field: Oregon ODMDS Sampling

 

Congratulations to the ANAMAR crew working on the ocean dredged material disposal sites off the Oregon coast. They completed sampling eight stations in and around the Chetco ODMDS yesterday and are collecting samples from the Coquille site today. While sampling, the crew spotted a gray whale and witnessed some picturesque fog rolling through.

Pictured above is the box corer the crew used to collect samples from the Chetco ODMDS, and below is a photo of the Oregon fog.


 


Data Reporting: Treatment of Outliers

Excerpted from the Southeastern Regional Implementation Manual (SERIM)

7.4 Data Reporting and Statistics for Bioassay and Bioaccumulation Testing

7.4.1 Definition and Treatment of Outliers

In most biological testing, some data points will be either much smaller or much larger than would be reasonably expected. Intuitively, outliers can be thought of as individual observations that are "far away" from the rest of the data. Outliers can be the result of faulty data, erroneous procedures, or invalid assumptions regarding the underlying distribution of all the data points that could potentially be sampled. In practice, a small number of outliers can be expected from a large number of samples, even those that follow a normal distribution. Several techniques are available for outlier detection. Tests that involve hypothesis testing on data assumed to be normally distributed include Grubbs' test, Rosner's test, and Dixon's test. The main advantage of using one of these formal statistical procedures is the ability to limit the risk of falsely flagging a valid data point as an "outlier."

If a data point is suspected to be an outlier during the statistical analysis of bioassay and bioaccumulation data, the analysis should be performed twice, once with the suspected outlier and again without it. Both results should be reported, along with an explanation of why the outlier is believed to deserve exclusion from, or inclusion in, the analysis. Such an explanation should not rely solely on the fact that some statistical test detected the outlier. In general, the more environmentally conservative approach should be utilized.
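
To make this concrete, here is a minimal Python sketch of Grubbs' test applied to hypothetical bioassay survival data, reporting the result both with and without the flagged point, as the manual recommends. The data and the use of numpy/scipy are illustrative assumptions, not part of SERIM.

import numpy as np
from scipy import stats

def grubbs_test(data, alpha=0.05):
    # Two-sided Grubbs' test for a single outlier in normally distributed data
    x = np.asarray(data, dtype=float)
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)
    # Test statistic: largest absolute deviation from the mean, in SD units
    idx = int(np.argmax(np.abs(x - mean)))
    G = abs(x[idx] - mean) / sd
    # Critical value built from the t-distribution (Grubbs 1969)
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    G_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    return G, G_crit, x[idx]

# Hypothetical percent-survival results from a 10-replicate bioassay
survival = [92, 95, 90, 94, 96, 91, 93, 95, 60, 94]
G, G_crit, suspect = grubbs_test(survival)
print(f"G = {G:.2f}, critical value = {G_crit:.2f}, suspect value = {suspect}")

# Per the guidance above, report the analysis both ways
print(f"mean with outlier: {np.mean(survival):.1f}")
print(f"mean without outlier: {np.mean([v for v in survival if v != suspect]):.1f}")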

Citation: USEPA/USACE. 2008. Southeast Regional Implementation Manual (SERIM) for Requirements and Procedures for Evaluation of the Ocean Disposal of Dredged Material in Southeastern U.S. Atlantic and Gulf Coast Waters. EPA 904-B-08-001. U.S. Environmental Protection Agency Region 4 and U.S. Army Corps of Engineers, South Atlantic Division, Atlanta, GA. http://www.epa.gov/region4/water/oceans/documents/SERIM_Final_August 2008.pdf

 


SERIM: Water Quality Criteria

 

Water Quality Criteria

Excerpted from the Southeastern Regional Implementation Manual (SERIM)

3.2.1.1 Screen to Determine WQC Compliance

 

A screening method utilizing sediment chemistry can be used to determine compliance. The screen assumes that all of the contaminants in the dredged material are released into the water column during the disposal operation (see Section 10.1.1 of the 1991 Green Book). If the numerical model predicts that the concentrations of all contaminants of concern (COCs) released into the water column are less than the applicable WQC, the marine WQC limiting permissible concentration (LPC) is satisfied.

The model needs to be run only for the COC that requires the greatest dilution. If the contaminant requiring the greatest dilution is shown to meet the LPC, all of the other contaminants that require less dilution will also meet the LPC. The contaminant that would require the greatest dilution is determined by calculating the dilution that would be required to meet the applicable marine WQC. To determine the required dilution (Dr), the following equation is solved for each COC:

Dr = (Cs - Cwq) / (Cwq - Cds)                                 [Eq. 3-1]

where

Cs =    concentration of the contaminant in the dredged material elutriate, in micrograms per liter (μg/L), as determined either by Equation 3-2 below or by the elutriate chemical analytical results discussed in Section 3.2.1.2

Cwq =  applicable marine WQC (EPA WQC or state WQS), in μg/L

Cds =   background concentration of the contaminant in the disposal site water column, in μg/L

NOTE: Dilution is defined as the volume of ambient water in the sample divided by the volume of elutriate water in the sample.
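
As an illustration of the screen, the sketch below (plain Python, with hypothetical concentrations; none of the numbers come from the manual) computes Dr for a few COCs using Equation 3-1 and identifies the contaminant requiring the greatest dilution, which is the only one that must be modeled.

# Hypothetical elutriate (Cs), marine WQC (Cwq), and disposal-site
# background (Cds) concentrations, all in ug/L
cocs = {
    "copper":  (12.0, 4.8, 1.0),
    "cadmium": (15.0, 8.8, 0.2),
    "mercury": (2.1, 0.94, 0.01),
}

def required_dilution(cs, cwq, cds):
    # Eq. 3-1: Dr = (Cs - Cwq) / (Cwq - Cds); a negative result means
    # the elutriate already meets the WQC without dilution.
    return (cs - cwq) / (cwq - cds)

dilutions = {name: required_dilution(*c) for name, c in cocs.items()}
for name, dr in sorted(dilutions.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} Dr = {dr:6.2f}")
worst = max(dilutions, key=dilutions.get)
print(f"Model only the COC requiring the greatest dilution: {worst}")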

Note that most contaminant results are reported in micrograms per kilogram (μg/kg) dry weight. To convert the contaminant concentration reported on a dry-weight basis to the contaminant concentration in the dredged material, the dry-weight concentration must be multiplied by the mass of dredged-material solids per liter of dredged material:

Cs = Cdw × [(G × ns) / (ns + G(1 - ns))]                                  [Eq. 3-2]

where 

Cdw =  contaminant concentration in dredged material, reported on a dry-weight basis (μg/kg)

ns =    percent solids as a decimal

G =    specific gravity of the solids. Use 2.65 if site-specific data are not available.

A table showing each contaminant and the dilution required to meet the WQC should be provided with the analysis. Alternatively, a module in the STFATE model can be used. The module requires the solids concentration (g/L), which is the term in brackets in Equation 3-2 above multiplied by 1000.
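
A minimal sketch of the Equation 3-2 conversion (in Python, with hypothetical inputs) is shown below; it also computes the solids concentration in g/L used as the STFATE module input.

def solids_per_liter(ns, G=2.65):
    # Bracketed term in Eq. 3-2: kilograms of solids per liter of
    # dredged material. ns = percent solids as a decimal; G = specific
    # gravity of the solids (use 2.65 if site-specific data are absent).
    return (G * ns) / (ns + G * (1.0 - ns))

def dry_weight_to_slurry(cdw, ns, G=2.65):
    # Convert a dry-weight result (ug/kg) to the concentration in the
    # dredged material (ug/L) per Eq. 3-2
    return cdw * solids_per_liter(ns, G)

# Hypothetical sample: 45% solids, copper reported at 30 ug/kg dry weight
ns = 0.45
print(f"solids concentration: {solids_per_liter(ns) * 1000:.0f} g/L (STFATE input)")
print(f"copper in dredged material: {dry_weight_to_slurry(30.0, ns):.1f} ug/L")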

The concentration of the contaminant that would require the greatest dilution is then modeled using a numerical mixing model. Model input parameters are specific to each proposed dredging project and each ocean disposal site. Standard STFATE input parameters for each disposal site are being developed with each ODMDS-specific SMMP; they are included in Appendix G along with additional guidance on model usage. The key outputs of the dispersion model are the maximum concentrations of the contaminant in the water column outside the boundary of the disposal site during the 4-hour initial-mixing period and anywhere in the marine environment after the 4-hour initial-mixing period. If both of these concentrations are below the applicable marine WQC, the WQC LPC is met and no additional testing is required to determine compliance with the WQC. If either of these concentrations exceeds the WQC, additional testing is necessary to determine compliance, as described in the next section.

 

 

Citation: USEPA/USACE. 2008. Southeast Regional Implementation Manual (SERIM) for Requirements and Procedures for Evaluation of the Ocean Disposal of Dredged Material in Southeastern U.S. Atlantic and Gulf Coast Waters. EPA 904-B-08-001. U.S. Environmental Protection Agency Region 4 and U.S. Army Corps of Engineers, South Atlantic Division, Atlanta, GA. http://www.epa.gov/region4/water/oceans/documents/SERIM_Final_August 2008.pdf


Significant Figures Part III

How Many Significant Figures or Decimal Places Are Correct?

The number of sig figs or decimal places that should be presented is often a decision to be made by the end user. In many instances, showing one or two sig figs is adequate; for example, in a simple comparison determining whether a result is less than or greater than a benchmark value. If the results will undergo substantial statistical evaluation or other arithmetic calculations, then more sig figs would be recommended. As a general rule, it is better to keep at least one additional sig fig through all calculations and round afterwards rather than rounding first.
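
As a quick illustration (a minimal Python sketch, not from the original post), rounding each intermediate value first can shift the final answer relative to rounding once at the end:

import math

def round_sig(x, sig):
    # Round x to `sig` significant figures
    if x == 0:
        return 0.0
    return round(x, sig - 1 - math.floor(math.log10(abs(x))))

a, b = 1.2345, 6.789
early = round_sig(round_sig(a, 2) * round_sig(b, 2), 2)  # round first: 1.2 * 6.8 -> 8.2
late = round_sig(a * b, 2)                               # round once at the end -> 8.4
print(early, late)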

One famous example of keeping extra sig figs and decimal places in calculations is the original “butterfly effect.” In 1961, the mathematician and meteorologist Edward Lorenz was using a computer to simulate weather. He entered a value that had been rounded from six decimal places to three, a difference of only about 0.025%. After he ran the program, he realized that the result represented a completely different weather pattern than if he had used all six decimal places. In later presentations, the term took on its more popular meaning, which is still in use today.

At the other end of the spectrum, it is possible to provide far too many sig figs. Since computers and calculators became common, they have been used more and more frequently for data collection and analysis. If the data are being calculated using mathematical formulae (e.g., linear or quadratic regression), the computer could easily provide 32 or more sig figs in its evaluation. If this were presented as a single whole number, for example, it would be the equivalent of counting the number of grains of sand in a pile the size of Earth.

Things to keep in mind when determining how many sig figs to present include how the data will be collected, limits on the ability to measure a value precisely, and whether the data represent an exact count or instead measure a sample. For example, counting the number of children enrolled at each elementary, middle, and high school in a state is relatively straightforward. An exact count can be provided fairly easily, so a number with six or seven sig figs would be appropriate.

On the other hand, counting the total number of people living in a state would be more challenging, since it would include births, deaths, and people moving into and out of the state. In addition, there may be a certain number of people who are temporary residents. Since all these changes can happen hundreds or even thousands of times per day, it would be more appropriate to provide only three or four sig figs for a statewide population count.


Sediment Testing Interference Series – Part IV

Specific Types of Interferences and Solutions (continued from Part III)

Toxicological Interferences

Whereas chemical testing is used to determine the concentration of target contaminants in a sediment sample, toxicological testing is used to determine the effects of the sediment on the survival and development of multiple representative species. Since the organisms will be affected by the sediment as a whole, and since the material is typically a mixture of sand, silt, and clay with numerous chemical contaminants present, it may be difficult or impossible to determine the exact cause of high mortality or abnormal development. Several interferences, or confounding factors, have been identified, however, and are described below.

Ammonia

Above certain levels, ammonia is highly toxic to most marine organisms. It is also a non-persistent toxicant in the environment, and procedures have been developed to help reduce ammonia to more tolerable levels for the test organisms. These procedures were initially developed for the more sensitive benthic species, but, with EPA approval, can also be applied to other organisms under certain circumstances.

Total Organic Carbon (TOC) Availability and Quality

Changes in the nature of TOC in the dredged material may limit test organism survival. Organic compounds can change over time, particularly with changes in temperature, moisture content, and oxygen availability. If the quality of the TOC degrades over time, the survival of certain species can suffer; Leptocheirus plumulosus, for example, is sensitive to poor-quality food in the sediment. Even though the sediment may have low toxicity, survival can be substantially reduced. Providing a small amount of food for the organisms during the test alleviates the problem and allows for a more accurate determination of toxicity.

Salinity

Marine organisms are sensitive to the salinity of the test sediment. Sediment collected far upstream or from terrestrial locations will often have much lower salinity than sediment from offshore locations, which can stress the test organisms and increase mortality. An acclimation period for the sediment, typically lasting a few days to 3 weeks, gradually raises its salinity to marine levels. Once acclimation is complete, the test organisms can be added and testing can commence.


Sediment Testing Interference Series – Part III

Specific Types of Interferences and Solutions (continued from Part II)

Organic Interferences

Organic compounds (PAHs, pesticides, PCBs) are typically identified and measured by mass spectrometry or by retention time. Because of a preliminary extraction procedure, salt does not interfere in the analysis of organic compounds, but sediment can still be affected by matrix interferences in a number of ways.

  • Complex organic molecules can bind together or degrade over time and form non-target compounds.
  • New contaminants that have the same characteristic mass or retention time as the target analyte can be introduced into the environment (from leaking vessels or industrial runoff, for example).
  • Sediment will likely settle into separate layers, leading to different chemical and physical characteristics in each layer. The sediment will require thorough homogenization prior to analysis to ensure that it is representative of the site.

There are a variety of solutions to eliminate or minimize potential interferences during preparation and analysis.

  • Commercially available cartridges are used to remove or “clean up” interfering (non-target) compounds for most organic analytical groups, such as PAHs, pesticides, and PCBs.
  • Using more-sensitive equipment can help distinguish target compounds from interfering compounds. High-resolution mass spectrometry (HRMS) is often used for parameters such as dioxins and PCBs to narrowly target specific compounds. Results from these procedures will often be much more sensitive than those from ordinary mass spectrometry, but the procedures are also considerably more costly. As one example, analyzing PCBs by gas chromatography costs around $200 per sample, while HRMS costs around $800 per sample.
  • Different types of detectors are available for different analyses. For example, an electron capture detector is useful for pesticide and PCB analysis.

Sediment Testing Interference Series – Part II

Specific Types of Interferences and Solutions

 

Metals Interferences

Most metals analyses are performed by spectroscopy, generally by inductively coupled plasma (ICP) with or without a mass spectrometer attached. A small volume of prepared sample is introduced into the instrument and energized using an argon plasma. The resulting plume emits light at specific wavelengths in proportion to the concentrations of the elements present in the sample, and the light is read using a detector measuring across a narrow band pass. For example, copper emits light at a wavelength of 324.7 nm. In high-saline samples, various cations (including sodium, calcium, and magnesium) will be present. Calcium is a potential interfering element because it emits at a wavelength peak of 316.9 nm with a band pass that overlaps that of copper.

Several options exist for dealing with these interferences:

  • Extract the metals into an organic reagent, such as methyl isobutyl ketone, effectively removing the salt interferences.
  • Change instrumentation. For example, use a borohydride generator for selenium analysis.
  • Prepare instrument standards in the same matrix as the project samples. This procedure will work if the interfering metals or compounds are of moderate concentrations.

A 2010 project audited by ANAMAR provides a good example of saline interference producing false-positive readings. Based on historical results, background levels for copper are typically around 1 µg/L. The results for this project were in the range of 150 to 170 µg/L, and because the sample levels were above the federal water quality criteria, further evaluation with the ADDAMS model was indicated. A review by the contractor and the analytical laboratory indicated that all batch QC was within limits, and the data appeared to have been analyzed and reported correctly. After some investigation, it was found that the contractor for this project had not included any procedures to reduce the saline interferences before analysis.

After discussion among the contractor, USACE, and EPA, it was determined that the copper analysis would be re-done using a saline-reduction procedure. Due to holding times and an insufficient volume of sample collected on the first sampling event, the contractor had to return to the project site to collect additional sediment and site water. Upon analysis of the elutriate using the saline-reduced sample, copper levels fell in line with historical levels, and further evaluation of the elutriate by the ADDAMS model was no longer necessary.


Sediment Testing Interference Series – Part I: Introduction

During the course of an environmental dredging project, samples will be collected and sent to one or more laboratories for analysis. For most dredging projects, the samples collected will be of a complex nature and will often contain various types of interferences that must be addressed during analysis to ensure that the most accurate results are reported to the client and to the regulatory agency reviewing the results for final sediment disposal options.

Interferences may be found in any type of analytical testing. For dredging projects, such as those falling under MPRSA Section 103 protocols, testing is required for physical, chemical, and toxicological parameters. This four-part blog series describes the most typical, as well as some lesser-known, types of interferences.

Chemical Interferences during Chemical Analysis

Chemical matrix interferences are encountered when the project sample contains a constituent that either produces a signal indistinguishable from a target analyte or attenuates the target signal. Elutriate samples for ocean dredging projects have many common types of matrix interferences (e.g., saline) for trace metals analysis. In addition, the chemical composition of the project samples may have interfering compounds specific to the sampling location.

The most common interference found when analyzing dredge material is saline interference with the analysis of metals. Other analyses prone to matrix interferences include pesticides, PCBs, and PAHs. Laboratory methodology has been developed to address the most common interferences, either through laboratory sample preparation or by adjusting instrument settings for specific sample matrices. If such procedures cannot completely eliminate an interference, the data will be qualified or the method detection limit will be elevated, and an explanation for the interference will be provided by the laboratory.

 

Author: Paul Berman, B.S., QA Officer/Staff Scientist


Center for Integrated Modeling and Analysis of Gulf Ecosystems


 

The College of Marine Science at the University of South Florida has been on the scene of the Deepwater Horizon blowout since April 2010, and many faculty members continue to participate in the ongoing research. The Gulf of Mexico Research Initiative awarded funding to eight research consortia to study the impacts of the oil on the Gulf of Mexico ecosystem, and the College is proud to host one of these centers. The Center for Integrated Modeling and Analysis of Gulf Ecosystems (C-IMAGE) is an international group of distinguished scientists interested in the integrated study of the fate, transport, and effects of oil and dispersants as they interact with the Gulf of Mexico marine environment.

Our study begins at the wellhead, where hot, high-pressure petroleum fluids exit into a cold seawater environment. The thermodynamics of the oil/gas/seawater system deep on the seafloor determine the initial phase composition, bubble/droplet size, and density. Moving forward in space and time, the petroleum compounds and aggregations are advected by the ocean circulation. Chemical and biological processes of dispersion, dissolution, and degradation then provide the trophic-level connections that determine the spatial interactions with biota and, ultimately, the potential for the oil/gas/dispersant mixture to affect populations, communities, and the overall ecosystem.

To date, C-IMAGE scientists have spent more than 80 days aboard research vessels in the Gulf of Mexico collecting water, sediment, and biological samples. Some samples have been sent to our partners in Hamburg for high-pressure biodegradation experiments, some to Calgary to see how the oil partitions as it ages and weathers, and many are being analyzed here at USF for oil concentrations in sediments, PAH concentrations in fish tissue, phytoplankton impacts, and toxicity. Laboratory experiments are under way at Wageningen University in the Netherlands to determine the toxicity of oil with and without dispersants. Sediments are being analyzed to determine the impact of oil on microbial assemblages and to find signs of recovery. These studies are closely integrated, with results from one feeding into parameterizations for another. By looking at the Gulf of Mexico holistically, we can better understand the impact of not only this oil spill, but future ones.

C-IMAGE’s education and outreach team, through USF’s College of Marine Science, is working with elementary, middle, and high schools, hosting a Teacher at Sea program in which teachers can instruct their classrooms remotely from research vessels while engaging students in the process of conducting ocean research. Getting the message out about our research is important; we produced two podcasts with Mind Open Media that are publicly distributed and have more coming as our research progresses. We are also partnering with the Pier Aquarium to produce an oil spill module for the “Secrets of the Sea” exhibit while participating in the St. Pete Science Festival.

Some of our College’s faculty members are also active in another GoMRI-funded consortium, Deep-C, hosted by Florida State University.

Some C-IMAGE-attributed publications:

Surface evolution of the Deepwater Horizon oil spill patch: Combined effects of circulation and wind-induced drift

Matthieu Le Hénaff, Vassiliki H. Kourafalou, Claire B. Paris, Judith Helgers, Zachary M. Aman, Patrick J. Hogan, and Ashwanth Srinivasan. Environmental Science & Technology 2012, 46(13), 7267–7273.

Detection of anomalous particles from the Deepwater Horizon oil spill using the SIPPER3 underwater imaging platform

Sergiy Fefilatyev, Kurt Kramer, Lawrence O. Hall, Dmitry B. Goldgof, Rangachar Kasturi, Andrew Remsen, and Kendra Daly. In Proceedings of the 2011 IEEE 11th International Conference on Data Mining Workshops (ICDMW), Vancouver, BC, Canada, December 11, 2011.

Label-noise reduction with support vector machines

Sergiy Fefilatyev, Matthew Shreve, Kurt Kramer, Lawrence Hall, Dmitry Goldgof, Rangachar Kasturi, Kendra Daly, Andrew Remsen, and Horst Bunke. 21st International Conference on Pattern Recognition (ICPR), pp. 3504–3508, November 11–15, 2012.

Evolution of the Macondo well blowout: Simulating the effects of the circulation and synthetic dispersants on the subsea oil transport

Claire B. Paris, Matthieu Le Hénaff, Zachary M. Aman, Ajit Subramaniam, Judith Helgers, Dong-Ping Wang, Vassiliki H. Kourafalou, and Ashwanth Srinivasan. Environmental Science & Technology 2012, 46(24), 13293–13302.

Sand bottom microalgal production and benthic nutrient fluxes on the northeastern Gulf of Mexico nearshore shelf

J. G. Allison, M. E. Wagner, M. McAllister, A. K. J. Ren, and R. A. Snyder. Gulf and Caribbean Research 2013, 25.

Enhancing the ocean observing system to meet restoration challenges in the Gulf of Mexico

S. A. Murawski and W. T. Hogarth. Oceanography 2013, 26(1), 10–16.

As C-IMAGE turns the corner on 2013, our researchers have been pushing the science forward, making great strides in answering some of the very important questions about the oil spill budget and the impacts of the oil and the dispersant. Here are some important milestones:

  1. In a modeling study led by Dr. Claire Paris of the University of Miami, researchers found that the amount of oil reaching the sea surface may have been the same regardless of dispersant application. Based on fundamental oil droplet size models, the authors estimate that the turbulent discharge of oil produced naturally small droplets, contributing to the observed deep intrusion.
  2. The instrumentation to study biodegradation of oil and oil/dispersant mixtures at high pressures is up and running at the Technical University of Hamburg. Microbial samples with oil from the Gulf of Mexico are brought to high pressures in closed chambers and oxygen consumption is measured as a proxy for microbial activity. Initial results indicate high pressure increases biodegradation rates slightly. More experiments are ongoing.
  3. In an attempt to close the oil budget from the blowout, the latest estimates are that about 20% of the oil is unaccounted for. Researchers in C-IMAGE and other GoMRI funded consortia are working together to investigate the mechanisms that contribute to this 20% making its way to the seafloor, becoming part of the Gulf’s sedimentary record. The processes of the oil and dispersant interacting with particles in the water column and the mechanisms transporting this material to the seafloor require the marriage of lab and field work conducted at USF and many of our partner sites.
  4. The short- and long-term impacts of the oil on many fish species in the Gulf are being tackled at USF with assistance from Mote Marine Laboratory and the University of South Alabama. Hundreds of liver, blood, bile, and muscle samples from fish are being analyzed for PAH exposure and stable isotopes, and otoliths are being examined for age and growth studies. Elevated polycyclic aromatic hydrocarbon (PAH) concentrations have been found in fish liver samples collected to the south and east of the DWH spill and to the north and east on the west Florida shelf. PAHs are the most toxic and potentially carcinogenic substances in crude oil, and these liver PAH levels are above the baseline for this region. The liver PAH composition of red snapper is likely picking up the chemical signature of oil from the Macondo well. Impacts on economically important species such as red snapper and tilefish are being assessed.

 


Scientists Venture to the Southern Ocean (Antarctic) to Study the Effects of Ocean Acidification…


http://www.ocean-news.com/news-archives/ocean-energy/2573-ocean-acidification-in-the-southern-ocean


Public Meeting - RESTORE Act (Deepwater Horizon)

DEP and FWC will co-host a meeting on Wednesday, March 13, 2013, to gather public input on project ideas leveraging funds from the RESTORE Act. The RESTORE Act, passed by Congress on June 29, 2012, and signed into law by the President on July 6, 2012, provides a vehicle for directing Clean Water Act civil and administrative penalties from the Deepwater Horizon oil spill to Gulf restoration. Meeting details are below.

Wednesday, March 13, 2013
Florida Fish and Wildlife Conservation Commission’s Research Institute
100 Eighth Ave. SE
St. Petersburg, FL 33701
6:00 p.m. EST Open House / Registration
6:30 – 9:00 p.m. EST Meeting & Public Comment

