Making mobile GIS work

Posted on Dec 7, 2010 in Blog

Last week I did a presentation titled “Making Mobile GIS Work”.  A couple of attendees have since asked for a copy of my presentation, but as my slides only contained headings, I thought I’d flesh it out here for the benefit of the wider community.

The idea behind the talk was that I see the same mistakes being made over and over again, which in a lot of cases means failed projects.  In pretty much all cases, it's not the technology or the idea that was lacking, but rather that the system was too complex for the intended operators/environment, or that it was never designed as an integral part of the bigger data management process.  On that note, I hope you find the following of interest.

Reducing complexity

When operators can’t or won’t use a system we provide, it’s due to our failings, not their incompetence.

As people working in the industry, we are often blind to the massive level of complexity we push onto users through the processes and systems we deploy.  The majority of mobile data capture is undertaken by field workers, who by the very nature of their jobs, tend not to spend massive amounts of time interacting with information systems.  As system architects and designers, we need to be conscious that we're deploying into their environment, not bringing them into our environment.

Do you need a map?
As a primer, read this post I did a while ago, which covers a lot of what I discussed.  The crux though, is that in about 90% of data capture operations, a map is not only needless, but actually hampers the job at hand.  I recall the ESRI Dev Summit in Palm Springs several years ago, where the developer community spontaneously applauded ESRI for the ArcGIS Mobile demonstration, because the sample application was workflow driven rather than map driven.  Sadly that enthusiasm and realisation hasn't filtered through to most of the spatial world.  Maps are cool, there's no doubt about it, but when we use cool even though it reduces the effectiveness of a business process, it's just dumb.

Capture only what you need.
Go and read why you should only collect it if you need it, then come back.  To be clear, I'm not referring to metadata, those things you collect to describe the data, and which in most cases the operator isn't even aware are being captured, such as capture date/time, capture method (gps, dgps, thumb suck), etc.  No, I'm referring to those attributes that nobody currently needs, but that someone somewhere imagines may one day be handy to have.  It's a little bit like never throwing anything away, because you'll need it one day.  The only trouble is that you have to house/move it with you all the time, and by the time you do need it, it's no good any more.
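To make the metadata distinction concrete, here's a minimal sketch of how a capture record might be built so that the descriptive metadata is filled in automatically and the operator only supplies what they actually need to enter.  The field names and the asset used are purely illustrative, not from any particular product.

```python
from datetime import datetime, timezone

def make_capture_record(asset_id, lat, lon, capture_method="gps"):
    """Build a capture record. The operator supplies the asset and its
    position; the metadata fields are filled in automatically, so the
    operator never has to think about (or even see) them."""
    return {
        "asset_id": asset_id,
        "lat": lat,
        "lon": lon,
        # Metadata the operator isn't even aware is being captured:
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "capture_method": capture_method,  # e.g. gps, dgps, thumb suck
    }

record = make_capture_record("hydrant-042", -37.81, 144.96)
```

The point is that metadata like this costs the field worker nothing, whereas every speculative attribute added to the form costs them time on every single record.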

Anyone who has learned about entity modeling knows that it's better to have separate entities, joined by relationships, rather than a primary entity encapsulating other entities.  As an example, following the Black Saturday bushfires, a group I was with in Melbourne was discussing the need for every property record to list the nearest emergency services command post as part of the clean up and response.  While the intent was sound, incorporating that sort of data into a property record was ill advised.  As the clean up progressed, what would happen when command posts were moved, shut down, or new ones set up?  Would the whole process have to be done again?  Chances are it wouldn't, which would mean that the data would at best be out of date, or worse, downright dangerous.  Far better to collect a separate data set with the locations of all command posts, then use their spatial relationships to link properties to posts.  The best part is you can collect these different data sets at different times, and associate them at a later date.
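The separate-entities approach can be sketched in a few lines.  This is a toy example with made-up coordinates and a naive planar distance, not a real spatial join, but it shows the principle: the link between a property and its nearest command post is computed from the two independent data sets whenever it's needed, rather than being baked into the property record.

```python
from math import hypot

# Two independently collected data sets (coordinates are illustrative).
properties = {"P1": (0.0, 0.0), "P2": (5.0, 5.0)}
command_posts = {"North": (1.0, 1.0), "South": (6.0, 4.0)}

def nearest_post(prop_xy, posts):
    """Find the nearest command post to a property by spatial
    relationship, computed on demand. When posts move or close, we
    simply update the posts data set and re-run this, instead of
    re-surveying every property."""
    px, py = prop_xy
    return min(posts, key=lambda name: hypot(posts[name][0] - px,
                                             posts[name][1] - py))

links = {pid: nearest_post(xy, command_posts)
         for pid, xy in properties.items()}
```

In a real system this would be a spatial query against the posts layer, but the design point is the same: relationships between data sets, not attributes copied into records.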

Data resolution matters.
When capturing road defects, do you really need to collect every defect as a separate record?  While it may be valid in certain circumstances, chances are that not only will you not be using a GPS accurate enough to later identify individual defects in a limited area, but when it comes to fixing them, all defects in the area will be fixed in one hit anyway.  By considering the required resolution of data, you not only speed up the capture and later reporting process, but simply by fitting in with the existing business process, the technology doesn’t get in the way of the job.
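One way to build the required resolution into the capture itself is to snap defect points to a grid sized to match the business process, so that a cluster of defects in one area becomes a single maintenance job.  This is a sketch under my own assumptions; the cell size and data are illustrative.

```python
def grid_key(lat, lon, cell=0.001):
    """Snap a coordinate to a grid cell at the resolution the business
    process actually needs (cell size here is illustrative, roughly
    100 m). Anything finer is wasted effort if the GPS can't later
    distinguish individual defects anyway."""
    return (round(lat / cell), round(lon / cell))

def aggregate_defects(defects, cell=0.001):
    """Collapse individual defect points into one record per cell, with
    a count, so the repair crew gets one job per area rather than a
    point per pothole."""
    areas = {}
    for lat, lon in defects:
        key = grid_key(lat, lon, cell)
        areas[key] = areas.get(key, 0) + 1
    return areas

defects = [(-37.8101, 144.9601), (-37.8102, 144.9603),
           (-37.8200, 144.9700)]
jobs = aggregate_defects(defects)  # two nearby defects become one job
```

The first two defects fall in the same cell and become a single record; the capture is faster, and the reporting already matches how the work is actually scheduled.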

Prototype on paper.
By far the most consistently successful projects have one thing in common.  They are using technology to replace a paper based system.  You see, when people get caught up in the gadgetry, what they fail to understand is that it's all about the people and process, not the technology.  So if there's already a process that people are comfortable with and that works well, technology can do a great job of simplifying and automating some of its functions.

Now I'm not saying you can't go straight to the technical solution, you certainly can, and I have been involved in many great rollouts.  But it does take a lot more thought and planning.  And the key is to consider the people and process long before you start to look at the technological solution.  So when you're talking to potential operators, leave the computer behind and take a stack of paper.

Super simple can work wonders.
I consider the humble hand held GPS an indispensable part of any GIS based department or business unit which captures GIS data.  Local government has a great pool of resources through "light duties": members of the external workforce who, for one reason or another, are unable to do their normal physically demanding work.  Here's a group of people who are probably more familiar with the external assets than anyone in the office.  As a bonus, they probably already know how to use a simple GPS, because they employ them on the weekends for fishing/hunting etc.  Give them a vehicle and a GPS, and you'll get back a thorough data set of the asset of your choice, and they'll certainly appreciate that more than sorting papers in the office.

How do we integrate?

Is your mobile GIS a system or a component?  We can break data capture into two project types.  The first is ad-hoc, where it's a one off, and a manual process to load the data into a corporate format is faster than building an automated system.  The second is a mobile component of a greater workflow, where data transfer is a frequent and repeated process.  It's the latter we'll be discussing.  We need to design with integration in mind.

It’s a new paradigm (Does using this phrase make me cool too?)
Anyone who's used an iPhone or Android device will understand what I'm talking about here.  Apps aren't built only to later tack/hammer/weld/glue a component on to make them integrate.  They're built to interface to a back end from the start.  Data synchronization is seamless.  You don't care about the currency of the data, you just trust that the system has taken care of it for you.  In most cases there is no sync button, it just knows what to do, and when to do it.  This is how integrated tools should work.  Sadly most mobile GIS is far from this smooth.  And I'm not talking about having to have a permanent internet connection either.  We're living in an age where periodic connections, be it hourly, daily or monthly, are more than sufficient.  Taking a leaf out of the new mobile platforms' book should see mobile GIS that just works.
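The "no sync button" idea can be sketched very simply: edits go into a local queue, and the app tries to flush that queue opportunistically, on a timer or whenever connectivity appears.  The class below is my own illustrative sketch, not any vendor's API; `send` and `is_connected` stand in for whatever the back end and platform actually provide.

```python
class SyncQueue:
    """Queue edits locally and flush them whenever a connection exists.
    There is no sync button: the app calls flush() opportunistically
    (on a timer, on a connectivity change), and the operator never has
    to think about it."""

    def __init__(self, send, is_connected):
        self.pending = []
        self.send = send                    # callable: upload one record
        self.is_connected = is_connected    # callable: do we have a link?

    def record(self, item):
        self.pending.append(item)
        self.flush()                        # try straight away if online

    def flush(self):
        if not self.is_connected():
            return                          # offline: keep queuing quietly
        while self.pending:
            self.send(self.pending.pop(0))

sent, online = [], False
q = SyncQueue(send=sent.append, is_connected=lambda: online)
q.record("edit-1")   # offline: queued, nothing uploaded yet
online = True
q.record("edit-2")   # connection available: both edits flushed
```

Whether the "periodic connection" is hourly 3G or a monthly docking cradle, the operator's experience is identical: they just capture data, and the system takes care of the rest.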

Whose responsibility is data quality?
Does the GIS/IT department own the data, or simply host it for the relevant business unit?  To allow seamless uploading and downloading of data, you need to trust the mobile operators.  Sure we can put a step in the middle where data needs to be QA'd before being available to the greater organisation, but in a lot of cases, having data come straight from the field into a corporate system frightens people.  Though if the business unit wants to do it, and it's their data, why are you standing in their way?  It's an odd thing that when data is manually copied between devices with USB sticks or SD cards, then manually loaded into a corporate repository, people feel more comfortable, as if any evil data will fall out when transferred through several mediums.  I would suggest that those responsible for the data are more likely to pick up errors if they can see them within minutes of walking into the office, rather than having to wait till the next day/week/month.

IT security can be overcome.
Most people working for a large organisation, government or private, find that when they want to do anything remotely 'interesting', they are thwarted by their IT department.  There is good news however.  Trust me when I tell you that your security implementation has nothing on what the Department of Defense imposes, and rightly so.  Now if a solution can be found there, there is no reason a solution couldn't be found that fits your organisation too.  I find that in most cases it's down to education and information (often more than you think is required).  Remember, in most cases IT people like doing interesting things too, but it's their head on the block if it goes to pot, so it's only fair that the onus is on you to prove the viability of the new idea.

How connected are you?

Depending on where you will be performing your data capture, you may have permanent network connectivity, or patchy, or perhaps none out of the office at all.  The thing is, more connection simply leads to more convenience.  Not being connected in the field is no barrier to seamless integration, though I continue to hear this argument time and time again.  Think of your business processes: what level of connectivity do you actually need to get your job done?  If you do need to connect from the field, mobile and increasingly satellite data is cheap.  Connection should simply not be an issue of concern.  Like anything else, it's a factor when considering your integration, but don't let it back you into a corner without knowing that options are available to you.  For the record, ArcPad is not the only mobile GIS tool which can be used offline.

Don’t waste human capacity.
A brief note: humans are smart.  We can do things with ease that computers simply don't have the capacity to do well, like deduce meaning and significance, or find solutions to problems.  Why then do we continue to use people for meaningless labor, such as copying data between devices?  That's the sort of thing machines do much better than us.  Let's leave that to them and use people for what they're good at.

The future is now.
There are several technologies which are available to us now, but have for the most part been ignored.  The following is a list of technologies and concepts I believe will become prevalent in the next twelve to eighteen months.

Connected tools in disconnected environments.
There has been much fanfare by the likes of ESRI with all their APIs, and rightly so, they are making it easier and easier for non ESRI products to access data stored within their infrastructure.  The thing most people are missing though, is that these aren't restricted to environments where permanent connections are available.  We need to start looking at them as simply another data source.

Device/OS agnostic.
There used to be a big issue with buying into a platform, even with mobile GIS.  The tools available to us now put us in a place where we can pick the device that best fits the task at hand, ignoring what OS it runs, and put together an application that works with it.  Also in this mix is the goodness that HTML5 brings, which will allow us to start producing web applications that function more like traditional applications and work the same across any device or OS with a compliant browser.

Small, single purpose apps.
Here is one lesson that Mr Jobs has shown us well.  Forget the big application to cover all our data collection needs, rather create five or six, which are small, focused and simple.

While this is working well in the USA due to the freedom of their government data, allowing great mash-ups, the sheer usefulness to business and government agencies will necessitate its adoption in Australia too.

Data mining and linking.
Like Wikipedia, which has gained acceptance as a source of authoritative data, organisations will need to start trusting data which they don't control.  The sheer volume of data being collected will allow for some truly amazing mash-ups, which, rather than being a mere curiosity, will hold incredible value.  Though to get the best use, organisations really need to make as much of their data available as possible too, as it's often the outsiders who will come up with ways to link your data to other data sets.
