NIRIS is the Navy's environmental restoration program; it tracks sample locations, wells, and other environmental restoration data. Over the last couple of years, the NAVFAC GES program upgraded its data model from SDSFIE 2.6 to SDSFIE 3.01. There was a Navy adaptation of that data model, and we supported the migration by taking the migrated data and preparing it to be part of NAVFAC's enterprise GIS. Now that NAVFAC is on the new data model, NIRIS also wants its data migrated to it. GISi did not perform the actual migration for them; we provided the model. Beyond the upgrade, they also wanted to be more in line with the rest of NAVFAC on enterprise GIS, so they wanted to start editing in SDE, with their architecture set up just like NAVFAC's. So we created schemas in the same repository where all the NAVFAC GeoReadiness Centers (GRCs) edit and publish their data. NIRIS now has its own schemas set up in essentially the same way.

All the GRCs used to have a schema for each local projection their data might be in, so a single region might have five or six schemas, each in a different State Plane or UTM zone.
NAVFAC wanted to consolidate everything into a single projection enterprise-wide, so everything in the enterprise now shares the same spatial reference. During the NAVFAC migration, that allowed us to aggregate all 45 schemas, rolling them up into regional schemas. The same thing happened with NIRIS: their data spans all the Navy regions, at least all the CONUS ones, plus Hawaii. Because we used the same spatial reference as NAVFAC, we could aggregate everything into a single NDM-compliant NIRIS schema.

Since they are part of the enterprise system now, they get capabilities like geodatabase replication, which we set up for them. They have everything in one editing database, and Esri geodatabase replication synchronizes it to the publishing database: they can edit their data throughout the day, and the changes they make in their geodatabase are synchronized to the publishing database four times a day. That is much closer to real time than it used to be. Previously they would batch up a lot of work and then go through a longer publishing process, so changes could not reach the web viewer side as fast as they can now.
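The 4x-daily sync described above could be scripted roughly as follows. This is a minimal sketch, not the actual NIRIS job: the sync times, connection-file paths, and replica name are all illustrative assumptions, and the arcpy import is deferred so the routing logic stands on its own.

```python
"""Hedged sketch of a one-way geodatabase replication sync, run on a
schedule four times a day. All names here are placeholders."""
from datetime import time

# Assumed sync slots: four evenly spaced times per working day.
SYNC_SLOTS = [time(h) for h in (6, 10, 14, 18)]

def sync_replica(edit_sde: str, publish_sde: str, replica_name: str) -> None:
    """Push edits from the editing geodatabase to the publishing
    geodatabase using Esri one-way geodatabase replication."""
    import arcpy  # deferred so this module imports without ArcGIS installed
    arcpy.management.SynchronizeChanges(
        edit_sde,                   # parent (editing) .sde connection file
        replica_name,               # e.g. "NIRIS.EditToPublish" (assumed name)
        publish_sde,                # child (publishing) .sde connection file
        "FROM_GEODATABASE1_TO_2",   # one-way: editing -> publishing
        "IN_FAVOR_OF_GDB1",         # editing database wins any conflict
        "BY_OBJECT",
        "DO_NOT_RECONCILE",
    )
```

In practice each slot in `SYNC_SLOTS` would correspond to a scheduled-task trigger that calls `sync_replica` with the two connection files.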

The other big piece we did involved an existing script that took entries made in a database table (they have a user interface for entering sample location points for these environmental samples); the script would run once a night, read that table, and create a feature class or update an existing one. The old script was difficult to maintain and they could not keep up with it. We wrote a Python script that does essentially the same thing: it reads that table and populates the feature class that lives in SDE as part of their new schema. It runs as a Windows scheduled task every night, updates the new feature class, and sends a nightly email notification reporting how many new features were created.
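The nightly notification step might look something like this. The subject line, mail host, and addresses are assumptions for illustration; only the message-building logic is shown in full, with the actual send kept separate.

```python
"""Hedged sketch of the nightly load's email notification; the SMTP
host and message wording are placeholders, not the real NIRIS settings."""
import smtplib
from datetime import date
from email.message import EmailMessage

def build_notification(new_count: int, run_date: date) -> EmailMessage:
    """Summarize the nightly table-to-feature-class load in an email."""
    msg = EmailMessage()
    msg["Subject"] = (
        f"NIRIS nightly load {run_date:%Y-%m-%d}: {new_count} new features"
    )
    msg.set_content(
        f"The scheduled task finished on {run_date:%Y-%m-%d}.\n"
        f"New sample-location features created: {new_count}\n"
    )
    return msg

def send_notification(msg: EmailMessage, host: str = "mail.example.mil") -> None:
    """Send the summary through an SMTP relay (placeholder host)."""
    with smtplib.SMTP(host) as smtp:
        smtp.send_message(msg)
```

The scheduled task would call `build_notification` with the count of rows inserted into the feature class, then hand the message to `send_notification`.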

One challenge with that script is that nothing in the table is in a common projection. The x,y coordinates entered for each feature are in local coordinate systems, mostly State Plane, so a record for San Diego would be in a California State Plane coordinate system. The script keys off the installation recorded for each row, because one field answers "what installation does this belong to?" From the installation it determines the local coordinate system, then re-projects the point to the common spatial reference using the correct geographic transformation. It does this for 187 or so installations, so there are many different projections it could be handling. It then populates a worldwide feature class containing all the sample locations across the entire NIRIS program. As with all the other feature classes they edit, once the update finishes, replication picks it up and pushes it to the publishing database so the web application can consume the updated data.
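The installation-to-projection routing could be sketched as a lookup table plus a projection call. The installation names, WKIDs, transformation name, and target spatial reference below are illustrative assumptions, not the script's actual 187-entry table; the arcpy import is deferred so the lookup logic is testable on its own.

```python
"""Sketch of per-installation re-projection routing. The lookup
entries and target WKID are assumptions for illustration only."""

# Assumed lookup: installation -> (source WKID, geographic transformation).
# e.g. 2230 = NAD83 State Plane California Zone VI (ftUS),
#      2284 = NAD83 State Plane Virginia South (ftUS).
INSTALLATION_SR = {
    "NB San Diego": (2230, "WGS_1984_(ITRF00)_To_NAD_1983"),
    "NS Norfolk": (2284, "WGS_1984_(ITRF00)_To_NAD_1983"),
}

TARGET_WKID = 4326  # assumed enterprise-wide common spatial reference

def lookup_spatial_reference(installation: str) -> tuple:
    """Resolve a row's installation to its local coordinate system."""
    try:
        return INSTALLATION_SR[installation]
    except KeyError:
        raise ValueError(f"No coordinate system registered for {installation!r}")

def reproject_point(x: float, y: float, installation: str):
    """Project one sample location from its local State Plane
    coordinates into the common spatial reference."""
    import arcpy  # deferred: not needed for the lookup logic
    wkid, transformation = lookup_spatial_reference(installation)
    point = arcpy.PointGeometry(arcpy.Point(x, y), arcpy.SpatialReference(wkid))
    return point.projectAs(arcpy.SpatialReference(TARGET_WKID), transformation)
```

Routing by installation rather than storing a projection per row keeps the data-entry UI simple, at the cost of maintaining the lookup table as installations are added.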

The biggest benefit for the client is that they are now in line with the rest of enterprise GIS within the Navy. They can easily plug into GRX or any initiative the NAVFAC GeoReadiness Centers are doing, and GeoReadiness can just as easily plug in and consume their data. Everyone across NAVFAC and enterprise Navy GIS is speaking the same language now.