
Expanding the power of the Attribute Assistant

Here at Esri, we always try to help one another out. The Local Government team asked if we could expand the functionality of the Attribute Assistant and build a series of construction tools to help with address data management workflows, and we obliged. We came up with some pretty cool new rules for the Attribute Assistant and some interesting construction tools that will streamline address maintenance workflows. Even though these have been designed for managing address information, you may find them helpful in other workflows as well. Let's first take a look at the new Attribute Assistant rules.
The first rule we created is CASCADE_ATTRIBUTE. This rule lets you change an attribute in a table or layer and push the new value to every feature that contains the old value. In the address world, we have implemented a Master Street Name table. Say a road is renamed: we can go into the table, change the road name, and the rule will open the Road Centerline layer and make that change to every road with the old name, then open the Site Address Point layer and update the road name there as well. Pretty cool, huh?
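To make the cascade idea concrete, here is a minimal, hypothetical sketch in plain Python, using lists of dicts in place of real geodatabase tables (the field and street names are made up for illustration): when a value changes in the master table, the new value is pushed to every dependent feature that still carries the old one.

```python
def cascade_attribute(old_value, new_value, layers, field="ROADNAME"):
    """Replace old_value with new_value in `field` across all layers."""
    updated = 0
    for rows in layers:  # e.g. road centerlines, then site address points
        for row in rows:
            if row.get(field) == old_value:
                row[field] = new_value
                updated += 1
    return updated

centerlines = [{"ROADNAME": "Oak St"}, {"ROADNAME": "Elm St"}]
addresses = [{"ROADNAME": "Oak St"}, {"ROADNAME": "Oak St"}]
changed = cascade_attribute("Oak St", "Oak Ave", [centerlines, addresses])
# every "Oak St" feature, in both layers, now reads "Oak Ave"
```

The real rule does this inside the geodatabase, of course; the point is simply that one edit to the master record fans out to every dependent layer.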
The second rule we created is VALIDATE_ATTRIBUTE_LOOKUP. This rule validates an attribute change against a lookup table or layer. Let's look at how you would use this with addresses. If I create a new road and want to make sure its name matches a road in the Master Street Name table, I can set this rule up to monitor my Street Name field and check that value against the Master Street Name table. The cool thing about this rule is that all I have to do is enter part of the road name. If it finds more than one record in the street name table, it presents a prompt where I can select any of the matching values. How is that for data validation?
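The partial-match behavior can be sketched as a simple case-insensitive substring lookup; this is a hypothetical illustration, not the rule's actual implementation, and the street names are invented.

```python
def validate_attribute_lookup(value, master_names):
    """Return every master record containing `value` (case-insensitive).

    One match means the entered value resolves cleanly; several matches
    would drive the prompt where the editor picks one; none means the
    value fails validation.
    """
    return [name for name in master_names if value.lower() in name.lower()]

master = ["Oak St", "Oakland Ave", "Elm St"]
hits = validate_attribute_lookup("oak", master)
# two hits ("Oak St", "Oakland Ave") -> prompt the editor to pick one
```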
I also mentioned we are working on some new construction tools. These are still in development, but here is what we are working toward. One tool will let the user click a reference point (in our case, an address point that represents the location on the centerline from which the address is derived), create that point if it does not exist, and then create a series of site address points with information from the reference point or the centerline beneath it. So, basically, you can create a series of points that all carry information from the source point. The second is a tool to draw a new road, split any intersecting roads, and prorate their address information.
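Since the split tool is still in development, here is just a sketch of the prorating arithmetic as we understand it: when a new road splits an existing centerline, interpolate where the address range breaks, keeping the new range ends on the same parity (odd/even side of the street) as the originals. The exact rounding rules are an assumption for illustration.

```python
def prorate_split(low, high, fraction):
    """Split the address range [low, high] `fraction` of the way along
    the segment, returning the ranges for the two resulting pieces."""
    split = round(low + (high - low) * fraction)
    if split % 2 != low % 2:  # keep parity consistent with the range
        split += 1
    return (low, split - 2), (split, high)

first, second = prorate_split(100, 198, 0.5)
# splitting the even side 100-198 at its midpoint yields
# 100-148 for the first piece and 150-198 for the second
```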
If you want to try the new rules out, as well as a few other enhancements, we have posted a beta of the tools on the forum.
Mike from the Water Team

Posted in Water Utilities | 2 Comments

Integrating ArcGIS Data Reviewer and ArcGIS Workflow Manager to Automate Data Validation

As you know, ArcGIS Data Reviewer and ArcGIS Workflow Manager are available as standard extensions, but they're also part of Esri Production Mapping. If you are looking to automate your data validation workflows, we just posted a couple of blog posts describing how to do this. The first post, Automating Data Validation Workflows, talks about using the custom steps available in Data Reviewer within a Workflow Manager workflow. The second, Advanced Data Validation Workflows, takes it a step further and discusses other ways of using the custom steps. It also outlines the additional capabilities you can leverage while integrating Data Reviewer and Workflow Manager. Be sure to read these posts; they might help you get creative about how you perform data validation.

Posted in Uncategorized

Advanced Data Validation Workflows

In a previous blog post we discussed how to automate your data validation workflows by integrating ArcGIS Data Reviewer with ArcGIS Workflow Manager. We also discussed some of the basic functions you can perform with the Reviewer custom steps. For this post I'd like to take it a step further and describe other ways in which you can use the custom steps. I'll also introduce additional functionality in Workflow Manager that you might find useful when integrating with Data Reviewer.

Working with Reviewer sessions

Reviewer sessions are a way of filtering records in the Reviewer workspace to show only the records you are interested in. Last time, we outlined how to use tokens with the Create a Reviewer Session custom step to create a unique session for each job based on the Job ID. There are other ways you can create sessions depending on your workflows. For example, if a unique session already exists for each user, you can use a token like [JOB:ASSIGNED_TO] in the Run Reviewer Batch Job custom step to write the errors to those existing sessions.
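The token idea above can be sketched as a simple substitution over a session-name template; this is a hypothetical illustration, and the real Workflow Manager token engine supports far more than the pattern shown here.

```python
import re

def expand_tokens(template, job):
    """Replace [JOB:FIELD] tokens with values from the job's properties,
    leaving unknown tokens untouched."""
    return re.sub(r"\[JOB:(\w+)\]",
                  lambda m: str(job.get(m.group(1), m.group(0))),
                  template)

job = {"JOB_ID": 42, "ASSIGNED_TO": "editor1"}
name = expand_tokens("Session for [JOB:ASSIGNED_TO]", job)
# the session name resolves to "Session for editor1"
```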


Posted in Editing

Automating Data Validation Workflows

Several times I've been asked, "What's the best way to automate data validation workflows?" With the release of ArcGIS 10, this can be achieved by integrating the ArcGIS Workflow Manager and ArcGIS Data Reviewer extensions. Data Reviewer has added custom steps that can be used within a Workflow Manager workflow. In this blog topic, I'll describe how you can configure these custom steps.

If you don’t already know, using Workflow Manager you can define workflows that model your business processes and track the progress of assigned tasks. The steps in the workflow can be configured to execute functions like sending notifications, opening applications like ArcMap, and running geoprocessing tools.

Data Reviewer allows you to manage and track quality control of your data. You can automate data validation using rules stored in a batch job. A Reviewer session is used to store information about any errors that are found and to track issues through their life-cycle to ensure they are fixed and verified.  
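The life-cycle tracking described above can be sketched as a record that moves through a fixed sequence of stages; the status names here are assumptions for illustration, not the actual values Data Reviewer stores.

```python
LIFECYCLE = ("Found", "Corrected", "Verified")

class ReviewerError:
    """One validation error written to a Reviewer session."""

    def __init__(self, feature_id, check_name):
        self.feature_id = feature_id
        self.check_name = check_name
        self.status = LIFECYCLE[0]  # every error starts as "Found"

    def advance(self):
        """Move the error to the next life-cycle stage, if any remain."""
        i = LIFECYCLE.index(self.status)
        if i < len(LIFECYCLE) - 1:
            self.status = LIFECYCLE[i + 1]
        return self.status

err = ReviewerError(101, "Duplicate Geometry")
err.advance()          # editor fixes the feature -> "Corrected"
final = err.advance()  # QC verifies the fix -> "Verified"
```

Tracking each error through stages like these is what lets a session guarantee that every issue found is eventually fixed and verified.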


Posted in Editing | 1 Comment

Data quality concerns keeping you up at night?

What keeps water, wastewater, and stormwater utility GIS professionals up at night? It could be doubts about your system architecture or capacity, fears about data backups and recovery, or maybe your backlog of unprocessed as-builts. A common concern we are hearing from the user community right now is being sure that your data is good enough to meet the needs of your utility. This is driving more water utilities to focus on quality assurance (QA) and quality control (QC).

Across the industry, water utilities are expanding their GIS quality control procedures, or implementing formalized quality control if they don't have any in place. Water utilities are also reviewing their existing GIS implementations and workflows for ways to increase quality assurance. At some water utilities these changes are coming out of the GIS department, driven by proactive GIS managers and staff. At other utilities they are coming top down, from utility management that recognizes GIS data now runs throughout the utility like a steel thread, or from the IT department as it assesses the state of all the utility's digital data.

But haven’t we always been concerned about data quality?


Posted in Water Utilities

ArcGIS Data Reviewer for Water Utilities webinar recording now available

The video for the February 15, 2011 ArcGIS Data Reviewer for Water Utilities webinar is now available on the Data Reviewer and Water Utilities Resource Center. You can find the recording here.

In addition, you’ll find a written copy of the Q&A attached to this blog post below.

Thanks for the feedback and for attending!

Posted in Editing

Developing a Quality Assurance Plan

For our next blog topic, I'd like to discuss how a quality assurance (QA) plan can be used to ensure the production of quality GIS data. A QA plan is a document, usually created by a project manager, that identifies quality-related objectives, standards, and procedures for your dataset. These guidelines are used to maintain consistent quality control (QC) processes throughout the duration of a project, and they help determine the success (or failure) of the project and whether the deliverables meet customer expectations. In essence, a QA plan is designed to ensure everyone involved in the project is on the same page about what quality means for your data and how you are going to measure and ensure compliance throughout the project life cycle.

Establishing a QA plan has many benefits, including less data rework, because quality requirements have been identified ahead of time and are measured and monitored throughout the project. Project teams often experience greater productivity as well, since a QA plan drives the examination of production processes for efficiency and effectiveness. For project managers, the QA plan supports quality policy guidelines your organization may have already established for its projects. For customers, satisfaction is increased because the project deliverables will meet or exceed their expectations as a result of taking the time to measure and meet quality standards.


Posted in Editing