Friday, February 13, 2009

Digital Dirt Map Compilation Chronicles, Part 1 of MANY

To date, the ND2MP 'staff' has been compiling all of the existing digital renditions of the geology of Clark County, Nevada. Luckily, the USGS has been compiling and developing 100k maps across the county and has thus done a fair amount of work for us already. What follows is a descriptive update of what we have been doing. Stay tuned for some graphic examples.

We have started to evaluate the relative worth of the various map data that we have in hand, and have gained some important insights.

First off, kudos to those authors who have pored over existing maps of a range of scales to develop their compilations (e.g., the USGS versions of the Las Vegas 100k and Lake Mead 100k sheets). We are keenly aware of the huge amount of work that went into that process, and we are also aware that it involved new mapping in some areas. Even more kudos to those authors who generated large amounts of original mapping at similar scales (e.g., the USGS version of the Mesquite Lake 100k). That was obviously a huge effort.

Given this, however, there are several facts about these maps that bear directly on our efforts. For example, there is a high degree of variability in the level of detail among the compilation maps. The authors point this out themselves, so it comes as no shock to anyone. However, as we develop the surficial geologic map of the entire County, we want a dataset with a consistent level of detail across the entire area. This will be a large task. It will mainly involve enhancing the detail in existing compilations, but will also involve generalizing some overly detailed areas.

The latter point applies, for example, to the Ivanpah Valley area, where House et al. mapped in detail and Schmidt and McMackin mapped more generally. What we want for the ND2MP is somewhere in between those extremes, so we are experimenting with some automated generalization routines on the House et al. data and comparing the results to the Schmidt and McMackin mapping. I will post some examples when they are ready. Areas where the existing compilations are simply too general, or appear somewhat arbitrary in detail, will require significant amounts of new mapping as part of this project.
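Our actual generalization routines aren't described here, but one classic primitive for this kind of work is Douglas-Peucker line simplification, which thins vertices from a boundary while keeping it within a set tolerance of its original shape. The sketch below is plain Python with made-up coordinates, purely to illustrate the idea; it is not the routine we are using.

```python
# Sketch of Douglas-Peucker simplification, a common automated
# generalization primitive. Coordinates are hypothetical and assumed
# to be in a projected system (meters).
import math

def _perp_dist(p, a, b):
    """Perpendicular distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len = math.hypot(dx, dy)
    if seg_len == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dx * (ay - py) - dy * (ax - px)) / seg_len

def simplify(points, tolerance):
    """Drop vertices that lie within `tolerance` of the chord between
    the endpoints; recurse on either side of the farthest vertex."""
    if len(points) < 3:
        return points[:]
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax > tolerance:
        left = simplify(points[:idx + 1], tolerance)
        right = simplify(points[idx:], tolerance)
        return left[:-1] + right
    return [points[0], points[-1]]

# A wiggly boundary generalized with a 1 m tolerance: small jitters
# are removed, but the large excursion at (5, 3) survives.
line = [(0, 0), (1, 0.4), (2, -0.3), (3, 0.5), (4, 0), (5, 3), (6, 0)]
result = simplify(line, 1.0)
print(result)
```

In practice we are working with ArcGIS tools on real polygon boundaries rather than hand-rolled code, but the tolerance parameter plays the same role: it is the knob that moves output detail between the House et al. and Schmidt and McMackin end members.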

As for generalizing existing overly detailed maps, we are going to establish a minimum map unit criterion of between 5 and 10 hectares. This is the areal extent below which we will not show a polygon. With respect to existing compilations, we plan to express polygons below the threshold as point features in the database. We are also experimenting with ways to efficiently eliminate parts of polygons that are less than 30-50 meters wide, an issue that arises mainly along single-thread active washes. Ideally, we can collapse these narrow polys to centerlines that retain the attributes when needed. This is sort of an 'annealing' function that digital cartographers are experimenting with, but we can't find an explicit ArcGIS add-in for it; the closest tools operate on the distance between two separate polygons rather than between parts of the same polygon, and they don't create the line we want anyway. As we progress in these areas, I will post examples to this blog.
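The minimum-map-unit rule itself is simple enough to sketch in code. The toy example below (plain Python, hypothetical unit labels and coordinates, and a 7.5 ha threshold picked arbitrarily from the middle of our 5-10 ha range) shows the basic bookkeeping: measure each polygon's planar area, keep it if it clears the threshold, and otherwise demote it to an attributed point at its centroid.

```python
# Minimal sketch of the minimum-map-unit rule: polygons below the
# area threshold become attributed point features. The 7.5 ha value
# is a hypothetical midpoint of the 5-10 ha range discussed above,
# and coordinates are assumed to be projected, in meters.

HECTARE_M2 = 10_000
MIN_MAP_UNIT_M2 = 7.5 * HECTARE_M2

def shoelace_area(ring):
    """Planar area of a simple polygon ring [(x, y), ...]."""
    s = 0.0
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def centroid(ring):
    """Area-weighted centroid of a simple polygon ring."""
    a = cx = cy = 0.0
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        cross = x1 * y2 - x2 * y1
        a += cross
        cx += (x1 + x2) * cross
        cy += (y1 + y2) * cross
    a /= 2.0
    return (cx / (6.0 * a), cy / (6.0 * a))

def apply_min_map_unit(polygons):
    """Split input into polygons kept as-is and demoted point features."""
    kept, points = [], []
    for attrs, ring in polygons:
        if shoelace_area(ring) < MIN_MAP_UNIT_M2:
            points.append((attrs, centroid(ring)))  # demote, keep attributes
        else:
            kept.append((attrs, ring))
    return kept, points

# Two hypothetical units: a 9 ha polygon (kept) and a 4 ha polygon
# (demoted to a point feature).
units = [
    ({"unit": "Qay"}, [(0, 0), (300, 0), (300, 300), (0, 300)]),
    ({"unit": "Qai"}, [(1000, 0), (1200, 0), (1200, 200), (1000, 200)]),
]
kept, points = apply_min_map_unit(units)
print(len(kept), len(points))  # 1 1
```

The narrow-polygon problem is much harder than this; erasing sub-width slivers is roughly a morphological opening (a negative buffer followed by a positive one), but collapsing them to attributed centerlines is exactly the 'annealing' step we haven't found a ready-made tool for.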

Another, arguably more important, issue is that the existing compilations use different nomenclature for the surficial deposits. Given that several of these maps probably had overlapping compilation periods, are contiguous, and are from the same agency, this is a bit surprising. However, I too have some pretty schizophrenic labeling schemes on my own maps, and NBMG has no formal standard, so I can relate somewhat. In any case, for a county-wide depiction of surficial geology that we ultimately want to apply statewide, it is absolutely necessary that we develop a consistent, flexible, and understandable framework. Each of the existing compilations provides some good and well-reasoned examples. We will begin with them and either choose the one we think is best, or possibly confuse the issue more by developing a framework that we think is better. In any case, we will develop a rubric that explains how the various schemes relate. Currently, we are leaning toward a composite of the Las Vegas 100k and Mesquite Lake 100k approaches.
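One way to think about the rubric is as a crosswalk table keyed on (source compilation, source label). The sketch below uses entirely hypothetical labels, not the actual units from the Las Vegas or Mesquite Lake sheets; it only illustrates the kind of lookup the rubric would support once the composite scheme is settled.

```python
# Hypothetical crosswalk from source-compilation unit labels to a
# composite ND2MP scheme. All labels here are placeholders for
# illustration, not the real map units.
CROSSWALK = {
    # (source map, source label) -> composite label
    ("las_vegas_100k", "Qa3"): "Qay",
    ("las_vegas_100k", "Qa2"): "Qai",
    ("mesquite_lake_100k", "Qyf"): "Qay",
    ("mesquite_lake_100k", "Qif"): "Qai",
}

def translate(source_map, label):
    """Map a source-compilation unit label to the composite scheme."""
    try:
        return CROSSWALK[(source_map, label)]
    except KeyError:
        # Gaps in the rubric should fail loudly, not silently relabel.
        raise KeyError(f"no rubric entry for {label!r} on {source_map}")

print(translate("las_vegas_100k", "Qa3"))  # Qay
```

A table like this has the side benefit of documenting every correspondence explicitly, so reviewers can argue about individual rows rather than the scheme as a whole.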

Stay tuned for an upcoming post that provides an opportunity to view and comment on our proposed scheme and its rationale. I will also prepare some examples of areas in need of generalization or more detailed mapping to support my statements above.