Dev Meet Up – Charlotte, NC

In the Queen City of Charlotte, next to the Epicenter downtown where the
NBA Bobcats play, developers and managers attended the Dev Meet Up in North
Carolina on Tuesday, Oct 19th. Blackfinn, a restaurant in an easily accessible,
central location, hosted the Dev Meet Up, which brimmed with presentations
on Android and iPhone applications, code demonstrations, in-house
applications, and users’ perspectives on application design and development.

Keynote

Jim Tochterman, VP of Research & Development at Bradshaw Consulting
Services
in Aiken, SC, gave a tips-and-tricks presentation on a topic that is
increasingly becoming part of many organizations’ development plans:
application development for smartphone devices. When asked
why Android as opposed to iOS or Windows Phone, Jim responded, “Our customers
drove us down the Android path.”

Developers these days are more interested than ever in what other developers
are doing with smartphones, and Jim provided a great deal of information that
covered Android application development.

And then began the lightning talks…

Lightning Talks                                                  

Glenn Goodrich from Enspiria Solutions started off the
lightning talks by showing how he’s using jQuery and the ArcGIS JavaScript API. His
code walkthrough was engaging: concise, informative, and well narrated. Developers
love to watch programmers code and explain what they are doing as they are
doing it. The audience was absorbed by the way Glenn demonstrated his techniques, and
people appreciated his efforts to present the information in a way that kept
them intrigued.

Bryan Townsend from York County
Government
was next, stepping up to present a well-designed in-house
application that makes it easier for people in his organization to search for and
find datasets. His map data management application, Easy Search, has been
well received by colleagues.

Coming in as a non-programmer, project manager Rob
Floyd from ARCADIS brought
a fresh perspective on design and development from a user’s standpoint.
With Dev
Meet Ups, we welcome anyone who can contribute information
that helps the developer community grow and evolve. It was neat to hear not
only from a non-programmer, but also from a user about their requirements.

Lastly, Nianwei Liu from the City
of Charlotte
concluded the lightning talks with a presentation of an iPhone
application that he built called Virtual Charlotte. This is a public-facing
application for helping Charlotte citizens find city services. It’s a great
example of a good iPhone application, and we have asked Nianwei to join the Android API
beta program.

To Sum it Up

As a final note, there were great presentations and everyone got a chance to
meet other geodevelopers in the Charlotte area while chowing down on some
buffalo wings and hummus with flatbread pita. Folks there are eager to get a
series of these meet ups going in Charlotte and use them to help grow the geodeveloper
community. They appreciated the fact that it wasn’t organized as a one- or
two-day conference and that it didn’t interfere with their work schedule. The
meet ups are the type of event where you can work during the day, and every
once in a while, just show up somewhere, meet a few folks, and learn something
new.

We have more meet ups in the near future! Be sure to visit the Dev Meet Up schedule
and add our events to your calendar.

EDN Team

Posted in Developer | Tagged | Leave a comment

Patrol Data Capture template available for download

The ArcGIS Defense & Intelligence team and Helyx SIS Ltd. have posted the new Patrol Data Capture template for immediate download from ArcGIS.com.  This desktop application template is designed to help you import and clean patrol tracks from a GPS into ArcGIS. The template assists in separating patrol tracks, removing duplicate points, and detecting and removing error spikes. It includes a sample GPX file and a toolbox with geoprocessing tools to help you work with the imported track data. The template also includes a “Getting Started with the Patrol Data Capture Template” guide to help you set up the template and a “Using the Patrol Data Capture Template” guide to walk you through exercises using the template.

Posted in Defense | Tagged , , , , , , , , | Leave a comment

Announcing Some Great New Features in the BAO iPhone App!

by James Killick

So far the BAO iPhone App has been a great success. Thousands and thousands of you have downloaded the app, and judging by the ratings on the App Store, people seem to love it.

Today, we are announcing a brand new version of the app which is designed to allow BAO subscribers to get even more value out of their subscription…and if you’re not a BAO subscriber, you can get access to a free trial directly within the app.

Check out the new features below, or even better, download it from the App Store!

Have fun!

James.

Change the Analysis Area

Sign in to your BAO account to adjust the analysis area:

  • 1 to 100 mile ring
  • 1 to 60 minute drive time

Access the Full Set of BAO Reports

Now you can use your BAO account to access the set of reports available under your subscription and generate them for any area while you’re on-the-go!

Select your report, generate it on the fly, and get the report back in seconds.

Share your Reports with Others or Save them for Later

         

Share a report via email, or review and manage your report archive.

Don’t Have an Account?  Register for a Free 30 Day Trial

  1. Tap the ad to register.
  2. See the benefits and register in seconds.
  3. And now you’re ready to take full advantage of BAO.

Your Free Trial Gives you Full Access to BAO on the iPhone and on the Web!

Access the full power of BAO on the Web!

Posted in Location Analytics | Tagged , , , | Leave a comment

Dev Meet Up – Boise

On Oct 20th, over 50 geo developers and managers came out to join us at the Dev Meet Up in Boise, Idaho.  The event was held at the Bardenay in downtown Boise.  Many of the attendees were from local government agencies but there were also a number of individuals from local and private businesses.  Overall it was a great crowd and the venue offered a great atmosphere to engage in “geo” conversation.

Keynote

Todd Buchanan kicked things off with a presentation on Regional GIS Application Development.  This was a very interactive talk that solicited feedback from the audience on how to improve application and data sharing across different agencies.  A number of pain points were also covered.  This sparked a lot of conversation and interest in things like data and application standards, maintenance and ownership.  We get the feeling this topic will come up again.

From there, the lightning talks unfolded.

Lightning Talks

Zach Maillard from the Idaho Department of Water Resources opened with an excellent discussion on flood hazard web mapping using ASP.NET MVC and the ArcGIS JavaScript API.  The application generated a FEMA flood hazard map for different areas of the region.  The map could be exported as a PDF file so it could be printed or easily shared with others.  Zach described some of the benefits of using ASP.NET and MVC as a development framework, and how they separated the data and file resource logic from the map, UI and printing components.  The application also made use of REST so simple URLs could be used to access and consume information behind the scenes.

One of the great things about the format of these events is that we can adjust the program as necessary, and when one of our lightning talk presenters didn’t make it, Kevin Jones offered to fill in at the last minute.

Kevin presented his dissertation on OpenLayers and integration with different platforms.  This was an excellent talk, especially considering he didn’t prepare for it at all!

Alan Rea from the USGS followed up with a presentation on how they used ArcGIS to build a web application to perform heavy back-end geoprocessing to generate stream stats across the country.  The system allowed users to visit an ungaged area and estimate stream flow statistics by using back-end raster and other regression analyses.  The scale of the system was impressive, using 17 servers to crunch gigabytes of data to generate the results.  One of the lessons learned was that they are now rewriting parts of the application to expose the server-side functionality via standard protocols (REST and SOAP). This will allow them to more easily integrate with newer client-side and RIA technologies.  Currently the system has been fully implemented across 22 states and 11 more are in the adoption process.

Bill Masters finished off with an excellent presentation on how he designed a vault/utility management application with Silverlight.  This was impressive because he stepped us through how he actually built the application by starting with some of the templates you can find on the ArcGIS.com Resource Centers and how, by stitching together other samples, he was able to build a simple and interactive application to access his data through ArcGIS Server.  One of the challenges he was able to overcome was displaying data in an Access GDB along with a number of associated related tables.  Very impressive as this was his first Silverlight application!

Hot Topics

  • Standardizing applications and data
  • Building web apps with JavaScript vs Silverlight/Flex
  • Moving web apps to mobile
  • Migrating to ArcGIS 10
  • ArcGIS Server SIG group for Idaho?
  • When is the next meet up?

Staying Connected!

One of the great things about these events is that you really get a chance to connect and share information with others in the local community. 

But what happens when the event is over? 

How do you find people or reach out to the community afterwards? 

For this event we put together a Boise MeetUp.com site so everyone can stay connected.  You’ll also find other things there like photos, presentation uploads and feedback on the event itself. 

So whether you were able to make it or not, feel free to visit MeetUp.com and get connected with your local community of geo-developers.  Also stay tuned for upcoming events.

Until next time…

EDN Team

Posted in Developer | Tagged | Leave a comment

ArcLogistics Training Resources

Now that you’ve installed the latest, greatest version of ArcLogistics, you might be asking yourself, “Now what?” Maybe you’ve used the classic desktop version of ArcLogistics in the past and you’re not accustomed to the new workflow. Or perhaps you’ve never used ArcLogistics at all. Whatever the case may be, you might need some help getting up and running with the new product. Even though the user interface of ArcLogistics Online is designed to be as intuitive as possible, there’s always a learning curve involved with anything that’s unfamiliar.

We understand this, and so we’ve provided a comprehensive tutorial that walks you step-by-step through the essential functions you need to service your orders for the day. When ArcLogistics starts up, just click the “Work through a Tutorial Project” button on the Getting Started page and you’re on your way.

If reading isn’t your thing, or if you want to supplement the information provided by the tutorial, there is also a bevy of short instructional videos available in the Resource Center that cover everything from an introduction to the user interface to advanced routing topics. You can access the videos by clicking the “View a Video Tutorial” button on the Getting Started page in ArcLogistics, by visiting the ArcLogistics Videos page directly, or by clicking the links further down this page.

For the basics, you can view an overview video or a video tutorial, which is a recording of the tutorial mentioned above.

There are videos available that show you how to enter details about your fleet; import your daily orders from a file or database; and build, visualize, and modify your routes. For more advanced information about how to refine your route solution by incorporating route barriers and zones, view the zones and barriers video.

To learn more about how to create reports for your routes and provide the routing information to your drivers, check out the deployment video and the ArcLogistics Navigator video.

If after going through the tutorial and videos you’re still unsure about some aspect of the software or you get stuck,  you can always post your questions on the ArcLogistics user forum, or peruse the discussions that are already there to see how other people are using ArcLogistics.

Good luck and have fun routing!

Posted in Uncategorized | Tagged | Leave a comment

US National Grid (USNG) Map Service Now Available

There is now a US National Grid (USNG) map service on ArcGIS Online.  This service contains continuous coverage of grid squares and center points at 100,000 meters, 10,000 meters, and 1,000 meters.  The 100 meter grids and center points are available for select, highly populated cities within the United States, with more updates on the way.  The REST endpoint for this map service is http://maps1.arcgisonline.com/ArcGIS/rest/services/NGA_US_National_Grid/MapServer, and it can be consumed by various ArcGIS clients.
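As a quick illustration (not from the original post), standard ArcGIS Server REST endpoints return their service description as JSON when you append the `f=json` output-format parameter; a minimal sketch using curl:

```shell
# Fetch the USNG map service description as JSON from the REST endpoint
# (f=json is the standard ArcGIS Server REST output-format parameter)
curl "http://maps1.arcgisonline.com/ArcGIS/rest/services/NGA_US_National_Grid/MapServer?f=json"
```

The same endpoint pattern is what client APIs use under the hood when you add the service URL to a map.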

An ArcGIS.com map is available with current hazards data and the new USNG map service.  This map is in the Esri GIS for Emergency and Disaster Management group.

You can also access this map on your iPhone or iPad.

This map service can also be easily configured within your Situational Awareness applications.  For example, it can simply be added to ArcGIS Mapping for SharePoint (version 2.0 is now available).

To include the US National Grid Map Service in the ArcGIS Viewer for Flex simply add a line in the config.xml file:

<layer label="US National Grid" type="dynamic" visible="true" alpha="1.0"
       url="http://maps1.arcgisonline.com/ArcGIS/rest/services/NGA_US_National_Grid/MapServer"/>

 

Posted in Public Safety | Tagged , , , , , , , , , , , | Leave a comment

Introduction to scripting with Amazon EC2

My name is Owen Evans, and I work in Esri’s D.C. Technology Center in Vienna, Virginia, where we demonstrate Esri technology to many US government customers. Most of the demonstrations we give are Web-based, so we make our GIS services and applications accessible outside the office by publishing them to a few servers that we have on premises.

Over the past few months, however, we have been supplementing our on-premises hardware resources with virtual resources in the Amazon Elastic Compute Cloud (EC2), using the recently-released ArcGIS Server on Amazon EC2. In this post, I’d like to share some of our experiences using scripting to manage our cloud resources. This post assumes you’re familiar with EC2 terminology, which you can review here.

Why use scripting with Amazon EC2?

One benefit of cloud computing is the commoditization of computing power; you can pay one simple hourly fee for resources that actually have many embedded costs associated with them (procurement, hardware, energy, building space, software installation and maintenance, etc.). Another benefit of cloud computing power is that you can turn it on and off like a water faucet and only pay for what you use.

Scripting is a simple way to automate scheduled tasks. With simple scripting tools that Amazon Web Services (AWS) provides, you can start and stop your EC2 instances during times when your apps and services are not needed. You can also automate other tasks like backups and expansion/contraction of capacity in the cloud. Scripting can help you manage your elastic computing utility bill, thus saving you time and money by conserving computing resources and automating repetitive scheduled tasks. Let’s look at a few examples of how you might use scripts to manage your cloud resources.

Use cases for scripting Amazon EC2

Scheduled automation

Like many organizations, our business operations are mainly during the work week (Monday through Friday). Since our demo servers are not typically used on the weekends, we shut them down at about 9pm ET on Friday evening (6pm PT) and start them up again at 7am ET on Monday morning. This allows us to save 58 hours of computing time each weekend (3 hrs for Friday, 24 hrs each for Saturday and Sunday, and 7 hrs for Monday). At $0.48/hr that is about $30/weekend or $1500/year for one instance. That’s a big payoff for the small effort of writing a script. Automating these repetitive processes saves time since we don’t have to remember to manually start and stop these instances, and it saves money since the resources are shut down when they are not being used.
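The savings arithmetic above can be checked with a quick calculation (the hours and hourly rate are the figures quoted in the paragraph):

```python
# Weekend shutdown: Friday 9pm ET through Monday 7am ET
hours_off = 3 + 24 + 24 + 7   # Fri evening + Sat + Sun + Mon morning
rate = 0.48                   # $/hour for one instance
per_weekend = hours_off * rate
per_year = per_weekend * 52
print(hours_off, round(per_weekend, 2), round(per_year, 2))  # 58 27.84 1447.68
```

That works out to roughly $30 per weekend and about $1,450 per year per instance, matching the figures above.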

Flexible administration

We also have EC2 instances that support in-office activities. These instances can be started and stopped by anyone using scripts that we have set up with some desktop shortcuts. When we need to give a presentation that uses these cloud resources, we start the instances. Then, when the meeting is over, we stop them. The person that turns the instances on and off doesn’t have to know the credentials for our AWS account. In this case, our scripts allow a non-administrator to manage cloud resources in a controlled way.

How to get started with scripting Amazon EC2

AWS has a set of downloadable command line tools that enable you to interact with the EC2 instances programmatically. The tools perform functions like starting/stopping instances, creating/deleting tags, and allocating/associating elastic IP addresses. Setting up these tools is not difficult. If you are familiar with the command line and environment variables, that certainly helps; but the configuration is straightforward enough that you can learn as you go.

Below is a step by step list of what you need to do to get started. The hyperlinks point you to download locations and specific steps in the documentation.

  1. Install Java (version 5 or later, SDK or JRE)
  2. Create a folder to store your tools, scripts, and related files (e.g., C:\AWS or C:\Users\[user]\Documents\AWS)
  3. Download the Amazon EC2 API Tools
  4. Create an X.509 certificate and download the two associated .pem files
  5. Set environment variables for your Java install location, EC2 tools location, and your account credentials
  6. Run commands

Once you install the tools, you can run any of the EC2 commands from a command line. Scripting enables you to chain several commands together and save common workflows so that you can reuse them. One of the simplest ways to do this is with a batch file (.bat), which is just a text file containing a list of commands that run in sequence. To make a batch file, just create a new text file and change the file extension from .txt to .bat. Then type your commands in the file in the order you want them to run.

Basic Amazon EC2 scripting examples

Below are a few simple scripts for common workflows: 1) stop an instance (our Friday evening script), 2) start an instance and associate an elastic IP (our Monday morning script), and 3) stop multiple instances.

Stop an instance

Perhaps the simplest script is one that stops an already running instance. This example calls the EC2-STOP-INSTANCES command and passes the name of the instance I want to stop. I’m using sample values for instance names and other parameters; simply substitute your own information for the placeholders in the examples below.

ec2-stop-instances i-abcd1234

Easy, right? In this case we just specify the stop command and an instance name, and the instance is stopped.

Ok, now let’s try something a little more interesting.

Start an instance and associate an elastic IP

This next script starts an instance, then associates an elastic IP address with that instance. This is a common workflow because when you stop and start an EC2 instance, the IP address changes. An Amazon Elastic IP gives you a way to maintain an unchanging address for your instances. The only requirement is that whenever you stop and start the instance, you have to re-associate the address. Here’s the script:

call ec2-start-instances i-abcd1234
timeout 300
ec2-associate-address 0.0.0.0 -i i-abcd1234
ec2-reboot-instances i-abcd1234

The first line calls the EC2-START-INSTANCES command, which has a similar syntax to the EC2-STOP-INSTANCES command shown above. In this case, since we want to run several commands in sequence, we run the EC2 command in its own shell using the CALL command. The EC2 batch commands are actually shell scripts that run Java code, and CALL is needed to return control to the original batch file. If CALL is not used, the batch process terminates after the first EC2 command, and the rest of the commands are never executed. (Note that CALL is optional in the “stop instance” script since it only runs a single command.)

Then there is a TIMEOUT command that waits five minutes (300 seconds) to give the instance time to start. When an instance is first started, the status is “pending” until the startup process is complete. At that point, the status changes to “running.” An elastic IP can only be associated with a running instance, so we need to wait this out in our script. In my experience, five minutes has been plenty of time, but you may find that you need to adjust this.
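As an alternative to a fixed wait, you could poll the instance status from the batch file until it reports “running.” This is a sketch, not from the original workflow; it assumes the same placeholder instance ID, and EC2-DESCRIBE-INSTANCES is part of the same API tools:

```bat
:: Poll every 30 seconds until the instance reports "running"
:: (i-abcd1234 is a placeholder instance ID)
:waitloop
call ec2-describe-instances i-abcd1234 | findstr /C:"running" >nul
if errorlevel 1 (
    timeout 30
    goto waitloop
)
```

FINDSTR sets the error level when the string is not found, so the loop keeps waiting until the state changes.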

Next, EC2-ASSOCIATE-ADDRESS is used to associate the elastic IP with the instance, and, finally, the last command in this script reboots the instance. Rebooting is required for ArcGIS Server to recognize the elastic IP. Rebooting is not needed for an enterprise geodatabase instance.

Start/stop several instances

Of course these commands can be chained together to stop or start several instances. One way is to add a separate command for each instance; however, the EC2 commands enable you to reference several instances in the same command. See the example below, which would stop three instances:

ec2-stop-instances i-abcd1234 i-efgh5678 i-ijkl9012

Scheduling scripts

Once you have some scripts to work with your EC2 instances, you can use operating system tools to schedule them to run at regular times (such as Friday nights or Monday mornings). For example, Windows Task Scheduler is available on the various desktop and server editions of Windows and gives you an easy-to-use GUI environment for scheduling a script.

You can set a BAT file to run in the Task Scheduler by creating a Basic Task (Action > Create Basic Task). Once you name your task and set the schedule, you’ll be prompted to select an action to perform. Choose Start a Program and point to your BAT file. Your script is now set to run.
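If you prefer the command line, the same schedule can be created with the Windows SCHTASKS command. This is a sketch; the task name and script path are placeholders:

```bat
:: Schedule the shutdown script to run every Friday at 9:00pm
schtasks /create /tn "StopEC2ForWeekend" /tr "C:\AWS\stop-instances.bat" /sc weekly /d FRI /st 21:00
```

A matching task with /d MON and your start script covers the Monday morning side.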

Keep in mind that you can always manually launch a script by just double-clicking it, or running it from the task scheduler. An advantage of running the script from the Task Scheduler is that a history will be recorded of when the task was run. This information is accessible in the properties of any task under the History tab.

Summary

In summary, Amazon EC2 includes scripting tools that can help you automate your work. Scripts make it easier to administer your servers remotely, and, in many cases, allow you to cut costs. Once you have a useful script, you can use operating system tools to run it automatically.

This post is meant as an introduction to scripting, but going further you can use scripting to launch or destroy new instances, create security groups, attach volumes, and so on. Stay tuned to this blog for more scripting tips.

Contributed by Owen Evans of the Esri Washington, D.C. Technology Center

Posted in Services | Tagged , , , , | 5 Comments

Preparing a map for editing: Using basemap layers effectively

When editing, you can incorporate basemap layers into your map to increase productivity. If you have a complicated map, such as a water utility network containing many detailed features and underlying background layers, you can spend a lot of time waiting for the map to refresh whenever you pan or zoom. With ArcGIS 10, you can minimize this by creating a basemap layer containing the contextual reference layers that you are not editing, such as imagery or streets.

A basemap layer is a special type of group layer that is drawn using optimized map display logic that utilizes a local cache to refresh the map quickly. Basemap layers also help reduce network traffic since ArcMap does not need to contact the server repeatedly to retrieve the map extent. To create a basemap layer, right-click the data frame name in the table of contents, click New Basemap Layer, and drag the layers into it. Although a basemap layer can contain any layer format, such as feature classes, shapefiles, Web services, or rasters, some content types are more appropriate for use in basemaps. This post shows you how to identify layers suitable for basemaps, use the editing environment with basemaps, and improve your basemap performance.

Choosing the layers to be in a basemap layer

To use basemap layers effectively, they should truly form a basemap beneath the layers that you are editing. If you edit data for a water district, your operational layers, such as manholes, water main lines, and valves, cannot be part of a basemap layer because you need to edit them and have the features be drawn dynamically to access the latest updates from their data sources. However, any supporting reference layers that you normally display underneath the utility data can be placed in a basemap layer for enhanced performance. For example, you could include a land base of parcel boundaries, buildings, streets, and other built features, as well as imagery layers, in one or more basemap layers. The layers in the basemap look the same as they did before; they just draw faster now. Here is an example table of contents showing the kinds of underlying layers that could be basemap layers.

Basemaps tend to be relatively static and typically are updated on an infrequent basis. Rasters and service layers are good candidates for basemap layers because they are stable and can benefit greatly from improved drawing speed. ArcGIS Online, for example, provides imagery, topography, streets, and other content from several different sources that you can use in your maps. If you click the arrow next to the Add Data button and click Add Basemap, you can add layers from ArcGIS Online directly into a new basemap layer.

Editing when basemap layers are in the map

Because basemap layers are cached, there are limitations on what you can do with them. For example, you cannot edit the layers in a basemap or change the layer symbology. If you need to make edits or layer updates, drag the layer out of your basemap, make the changes, and drag the updated layer back into the basemap layer.

If you attempt to start an edit session with an editable layer in the basemap, ArcMap shows you a warning message. You can edit all the other layers in that workspace, but you cannot edit the layers in the basemap even if they belong to the same geodatabase. If the basemap contains any layers that are related to other editable layers through relationship classes, topologies, geometric networks, or parcel fabrics, or shares data sources with layers outside the basemap, you cannot start editing at all until you move the layer out of the basemap. You can double-click an entry in the Start Editing dialog box to open an ArcGIS Desktop Help topic containing more information on how to fix these and other issues that occur when you start editing.

Although you cannot edit the layers inside a basemap, you can snap to feature layers in a basemap layer. For example, if you were creating a new waterline in relation to building locations, you can still snap to the Building Footprints layer even though it is inside the basemap.

Improving basemap layer display and performance

With basemap layers, you can pan continuously and smoothly by pressing the Q key or holding down the mouse wheel. The rest of the map layers are redrawn once you release the key or the wheel button. If you find that the layers on top of the basemap are difficult to see, you can dim the display of the basemap using the Effects toolbar. This makes the basemap appear washed out and partially transparent, helping your operational layers stand out more. This can be useful for editing, especially in cases where your basemap layers contain orthographic images or other richly colored content that may obscure the details of layers on top of them.

Once you create a basemap layer, you can run diagnostic tests to check its performance. You can do this by right-clicking the basemap layer and clicking Analyze Basemap Layer to display a window listing ways you can speed it up even further. You might see messages indicating that the layer is being projected on the fly or uses complex symbology, which can slow down drawing. For example, the message “Layer draws at all scale ranges” is a suggestion to set a visible scale range on the layer since there is no need to display the layer when the features are too detailed or too coarse at certain map scales. You can right-click an entry to open the Layer Properties dialog box, where you can resolve many of the issues to get the most out of basemap layers.

Data used in the examples is modified from the Water Network Utilities Template by Esri and Fort Pierce, Florida.

For more information, see Working with basemap layers and About edit sessions in the ArcGIS Desktop Help.

Posted in Editing, Imagery | Tagged , , , , , , , , | Comments Off

Free Seminar on Geocoding in ArcGIS Desktop 10

At ArcGIS 10, the geocoding engine has been redesigned so you can accurately map your location-based tabular data and find single locations more quickly and easily than ever before. This seminar introduces you to the new features and covers both geocoding and reverse geocoding workflows.

Thursday, October 21, 2010
9:00 a.m., 11:00 a.m., & 3:00 p.m. Pacific Time (US & Canada)
12:00 p.m., 2:00 p.m., & 6:00 p.m. Eastern Time (US & Canada)
4:00 p.m., 6:00 p.m., & 10:00 p.m. UTC/GMT
 

Posted in Uncategorized | Tagged , , , , | Leave a comment

Tips for a Successful Cache

Once you’ve completed the data migration and map authoring stages of your Community Basemap project, you are ready to start caching! Here is a list of common caching issues you can reference in order to produce a successful cache.

 

Review your map document for completeness:

1. Map document data frame projection is correct for your version of ArcMap. An incorrect projection results in extra cache time as a re-project step is added.

  • ArcGIS 9.3.1: WGS_1984_Web_Mercator
  • ArcGIS 10: WGS_1984_Web_Mercator_Auxiliary_Sphere

projection image

2. Cache extent layer:

  • Is in the top-most position of your TOC
  • Is NOT symbolized
  • Is split into more than one polygon

3. Cache Mask layer is present in the top-most position of each scale.

toc image

4. All layers are properly sourced and set to display at each scale.

5. The following Representation symbology is correctly set at each scale:

  • BuildingFootprintDropShadow
  • Administrative Boundary Line
  • National Park
  • State Park

6. Any airphoto imagery is removed from your map document.

7. No feature is selected within your project extent; otherwise it will display as selected in your cache.

 

Caching Set-Up

1. Have all fonts (including Cambria) properly installed on your caching server. Please see this blog if you have trouble installing them.

2. Under your Map Service Properties on the Caching tab, ensure the Image Settings are correctly adjusted:

  • Tile Format: JPEG
  • Compression: 90
  • Click ON: Smooth line and label edges (anti-aliasing)
  • Load the Tiling Scheme from ArcGIS Online / Bing Maps / Google Maps (for ArcGIS 10) or Microsoft Virtual Earth / Google Maps (for ArcGIS 9.3)
  • Do not adjust the Scale levels; leave all 20 scales in the Service Properties.  Then specify which scales to cache when you go to the next step in the Manage Map Server Cache Tiles dialog.

caching image

 

Cache Delivery Options

1. If there are no firewall issues, email your map cache URL to communitymaps@esri.com

2. If #1 is not an option, mail a hard copy (DVD or hard drive) of your map cache including your Cache Extent layer to:

Esri

Attention: Colin Stokes – Community Maps Program

380 New York Street

Redlands, CA  92373-8100

 

Posting Cache on ArcGIS Online

There are three important elements that need to be delivered for posting your cache on ArcGIS Online:

  1. An approved cache
  2. A Citation Layer
  3. A signed Legal Agreement

Posted in Community Maps | Tagged , , | Leave a comment